# [Official] AMD R9 295X2 Owners Club



## EliteReplay

subbed


----------






## wermad

How are temps in xfire? Open bench does help







Looking forward to a couple of these in the future, but for now all I can do is drool over these bad boyz.

Congrats guys


----------



## EliteReplay

R9 295x2 ITX Build


----------



## bencher

Quote:


> Originally Posted by *EliteReplay*
> 
> R9 295x2 ITX Build


wow lol


----------



## King4x4

Subbed!


----------



## wermad

Quote:


> Originally Posted by *King4x4*
> 
> Subbed!


You ordered one? Or two?

Btw, update that sig rig, sir


----------



## King4x4

Quote:


> Originally Posted by *wermad*
> 
> You ordered one? Or two?
> 
> Btw, update that sig rig sir


Got a friend of mine in the industry who might get them at a good discount (if I break even with the sale of my quadfire setup, I might jump on two).


----------



## Marc VS

Thanks for the great video and the benchmarks! It's an impressive GPU, and the competition between AMD and Nvidia is more interesting than ever. AMD's closed-loop liquid cooling system is innovative. The Radeon R9 295X2 is the fastest GPU to date, and it has higher floating-point performance than the Titan Z. I would be really curious to see some rendering demos.

The Titan Z has a few more shading units and TMUs than the 295X2. Taking into account the specs and price tags of the two GPUs, I think the R9 295X2 is gaming-oriented and the Titan Z is video-oriented (i.e. intended for professionals who work on 3D modeling, effects and animations).
The Titan Z is definitely too expensive for gamers, while AMD might be an option for some, as this owners club proves (btw, I'm very jelly as well...). Here is a comparison of the two.
Overall, I think AMD is better taking into account the price-quality ratio. Looking forward to hearing more about it!


----------



## NavDigitalStorm

On a scale of 1-10, how excited is everyone?


----------



## wermad

4x mini DisplayPorts on a Hawaiian platform, this excited:

(It's a 10 imho, though I don't have any yet).


----------



## Cool Mike

My HIS 295x2 is now installed and running great. I guess some of you are wondering how I got a retail version this soon. It was sheer luck. Newegg had the Powercolor and HIS versions on the site saying "Coming Soon". Tuesday night around 9PM EST I looked, and sure enough the HIS version was available. I immediately purchased it and paid for overnight shipping. Not sure how many Newegg had, but they were sold out within an hour. It's now in my system and running great.

*In my opinion this is AMD's best designed reference card ever.*

First impressions:
Very solid feel, all-metal shroud. Does NOT droop in my system.
The 120mm radiator/fan installed with no problems.
At full load - very quiet.
The red-lit fan will add a RED glow inside your system. I have a RED theme, so it's great in my build.
The word "RADEON" is also lit RED.

Currently I am at 1085 on both cores and a very healthy 1700 on the memory (Hynix). Overclocking only in CCC. Both Firestrike and Heaven/Valley are stable at 4K resolution.

After a 15-minute run in Valley, temps never exceeded 72°C.


----------



## yunshin

Technically, you could put a push/pull setup on that and improve temps a little more... no?


----------



## Cool Mike

Thought about doing just that. Running cool as is. I didn't want to extend the radiator/fan assembly more into my case.
Temps are great. Low 70's even running 4K benches.


----------



## NavDigitalStorm

Hope you are as happy as I am Cool Mike! It's an INSANE card! How did you get it early?


----------



## EliteReplay

Quote:


> My HIS 295x2 is now installed and running great. I guess some of you are wondering how I got a retail version this soon. It was sheer luck. Newegg had the Powercolor and HIS versions on the site saying "Coming Soon". Tuesday night around 9PM EST I looked, and sure enough the HIS version was available. I immediately purchased it and paid for overnight shipping. Not sure how many Newegg had, but they were sold out within an hour. It's now in my system and running great.
> 
> *In my opinion this is AMD's best designed reference card ever.*
> 
> First impressions:
> Very solid feel, all-metal shroud. Does NOT droop in my system.
> The 120mm radiator/fan installed with no problems.
> At full load - very quiet.
> The red-lit fan will add a RED glow inside your system. I have a RED theme, so it's great in my build.
> The word "RADEON" is also lit RED.
> 
> Currently I am at 1085 on both cores and a very healthy 1700 on the memory (Hynix). Overclocking only in CCC. Both Firestrike and Heaven/Valley are stable at 4K resolution.
> 
> After a 15-minute run in Valley, temps never exceeded 72°C.


How do those records compare to, let's say, CF 290X or SLI GTX 780?
Other than that, man, it seems like a great GPU. Enjoy it as much as you can... lol


----------



## Cool Mike

Very happy. Wonderful card. I explain it in the first paragraph of the big post with pics. It was pure luck.

Guess I'm the first retail owner; you were the first overall.

*Been benching the card off and on for the last 3 hours. It seems to be settling in. Running 1100 core and 1700 memory now with no issues in Valley at 4K. Very solid card, and the 290X GPU cores love the cooler environment.*


----------



## ssiperko

Quote:


> Originally Posted by *EliteReplay*
> 
> How do those records compare to, let's say, CF 290X or SLI GTX 780?
> Other than that, man, it seems like a great GPU. Enjoy it as much as you can... lol


That kicks the crap outta my CF 290s at 1100/1400 with a 4770K @ 4.7/4.4 ---- my best so far is 165xx.

I drool over that score.

SS


----------



## MunneY

That's a pretty monster score on that card... 18000 is a BIG number with a mild overclock... I'd like to see it at about 1200 :-D


----------



## Cool Mike

I had two Powercolor 290x PCS+ in crossfire a few weeks ago. the 295x2 beat them by about 500 points on Firestrike. Very good for a single card solution.


----------



## Cool Mike

Not wanting to add any voltage right now. You can see why - just dished out $1500 for this beast.

I have been amazed with the memory. This must be some top-shelf Hynix memory; I have tested 4 different 290x cards, including the Lightning, and I have never hit 1700 stable on the memory.

Card running solid at 1100 core and 1700 memory. Just ran a fresh 3DMark Firestrike: 18,314 for a single card is great.


----------



## NavDigitalStorm

Great scores!


----------



## NavDigitalStorm

Here is our WIP build.


http://instagr.am/p/m6OHQdPBD5%2F/


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Here is our WIP build.
> 
> 
> http://instagr.am/p/m6OHQdPBD5%2F/


What case is that? Do you think that's enough space between the cards to be adequately cool?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> What case is that? Do you think that's enough space between the cards to be adequately cool?


450D. The push-pull should move enough air over the cards themselves. I'll be testing it over the weekend, so I'll let you know. Worst comes to worst, we can install a fan on the side window or 3D print a bracket for the GPUs.


----------



## axiumone

Awesome! I'm actually looking to do the exact same build. Will wait for your results.


----------



## MunneY

Quote:


> Originally Posted by *Cool Mike*
> 
> Not wanting to add any voltage right now. You can see why. Just dished out $1500 for this beast.
> 
> 
> 
> 
> 
> 
> 
> I have been amazed with the memory. This must be some top shelf Hynix memory, I have tested 4 different 290x cards including the Lightning and I have never hit 1700 stable on the memory.
> 
> Card running solid at 1100 Core and 1700 Memory. Just ran a fresh 3Dmark Firestrike. 18,314 for a single card is great.


Yea, I understand, but come on... I've got two Tis on the way and I will push them to the ragged edge immediately. I'm crazy though.


----------



## Cool Mike

Tried Afterburner beta 19. No go. Can't adjust voltages. Maybe the next AB beta.


----------



## Sgt Bilko

I am so subbed to this, these things look amazing


----------



## Bartouille

Impressive memory overclocks so far. I don't think good Hynix memory is the only cause; it can't be. I guess the chips in there are slightly different or something - even reviews were pulling over 1600MHz like it's nothing.


----------



## EliteReplay

Quote:


> Originally Posted by *Bartouille*
> 
> Impressive memory overclocks so far. I don't think the only cause to that is good hynix memory, it can't be. I guess the chips in there are slightly different or something, even reviews were pulling over 1600mhz like it's nothing.


It's very impressive for a dual GPU to overclock like this. I think we have a new king of the hill.


----------



## bossie2000

Hello Hardware Rep (NavDigitalStorm). Very impressed with your unbiasedness!! You've got a 780 Ti and you're praising AMD's newest offering. Good for you.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *bossie2000*
> 
> Hallo Hardware Rep( navdigitalstorm).Very impress with your unbiasness!! You got a 780ti and praising AMD's newest offering. Great for you.


It's the beauty of technology! I love anything as long as it doesn't break on me within two days.


----------



## Elmy

I will be a member here soon. I will be running 2 295X2 with EK waterblocks. Just waiting for Club3D to receive their stock. I need them to run my 5 Asus VG248QE's @ 144Hz.

Going to put them in my LAN rig shown here.


----------



## Jpmboy

Quote:


> Originally Posted by *Elmy*
> 
> I will be a member here soon. I will be running 2 295X2 with EK waterblocks. Just waiting for Club3D to receive their stock. I need them to run my 5 Asus VG248QE's @ 144Hz.
> 
> Going to put them in my LAN rig shown here.
> 
> 


Anyone got an ETA on the EK waterblock??


----------



## Elmy

Quote:


> Originally Posted by *Jpmboy*
> 
> anyone got an eta on the ek waterblock??


I can't say anything... You might want to ask in the EK owners thread. There is an EK rep there.


----------



## Dhalgren65

VERY exciting!
I hope to grab one soon-
Congratulations owners!

If I were to get one-
I am thinking of deleting the OE cooling-
I know it's getting praise-but
(Sorry, all solutions so far look kinda bulky & I need to integrate the COMPONENT into MY system, not the PACKAGE!)
I have 2 MCW82's and 2 MCW 82-7900's-
Which seems more suited?
(Are there "steps" on the chips like 7950/7970's)
All of the review photos seem to indicate flat, so the early ones?
Any feedback appreciated-play on!


----------



## EliteReplay

Quote:


> Originally Posted by *Elmy*
> 
> I will be a member here soon. I will be running 2 295X2 with EK waterblocks. Just waiting for Club3D to receive their stock. I need them to run my 5 Asus VG248QE's @ 144Hz.
> 
> Going to put them in my LAN rig shown here.


Congrats! Would like to see some nice and juicy pictures once it's done!


----------



## Cool Mike

Update:

Running a lot of benchmarks and BF4 at 4K maxed out (Ultra AAX4) to find the highest stable overclock.

At this time, 1090MHz core and 1650 memory at stock voltage seems to be the sweet spot for my sample. At 1080p or 1440p the overclocks are slightly higher. Of course, 4K maxed out really loads the two cores, so you have to reduce frequencies accordingly. The 295x2 is simply a wonderful card. Expensive, yes, but AMD did a great job with the design.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Update:
> 
> Running a lot of benchmarks and BF4 at 4K maxed out (Ultra AAX4) to find the highest stable overclock.
> 
> At this time 1090 MHz Core and 1650 Memory at stock voltage seems to be the sweet spot for my example. At 1080P or 1440P the overclocks are slightly higher, Of course, 4K maxed out really loads the two cores so you have to reduce freq. accordingly. The 295x2 is simply a wonderful card. Expensive yes. AMD did a great job with the design.


Couldn't agree more! I'm guessing you're the first non-industry affiliate to get this card - probably the first to get a retail boxed version.


----------



## Cool Mike

Seems that way. The retail box was sealed. I'm sure the retailers have them in stock and ready to ship. Wonder if Newegg jumped the gun and shipped the HIS too early. I was very lucky to grab one of these when they showed up. The driver disk that came with the card has Beta 14.4. Guessing that one, or a revised version, will show up on AMD's site next week.

Will you compile a list of owners as they begin to ship?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Seems that way. The retail box was sealed. I'm sure the retailers have them in stock and ready the ship. Wonder if Newegg jumped the gun and shipped the HIS too early. I was very lucky to grab one of these when they showed up. The driver disk that came with card came with Beta 14.4. Guessing that one or a revised version will show up on AMD's site next week.
> 
> Will you compile a list of owners as they begin to ship?


Yup, just like I posted your review, I'll do the same for others once people start receiving theirs.

The shelf date isn't until Monday so Newegg definitely goofed.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Yup, just like I posted your review i'll do the same for others once people start to receive theirs.
> 
> *The shelf date isn't until Monday so Newegg definitely goofed*.


Would really like to see a 4K valley run, before pulling the trigger on these cards...

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0


The HIS cards were up for like half a day; I could have grabbed one. But... I really need to see some benchmarks, waterblocks on the market (no AIO, please), and the OC ceiling.


----------



## Cool Mike

Here's your 4K Valley run. Hope this helps. Cores = 1090 Memory = 1650
With 4K the pixel density is so good 4X AA is not really needed. 51FPS vs. 38FPS

(295x2) Valley Settings - 4K - Ultra - AA 4X



(295x2) Valley Settings - 4K - Ultra - No AA


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> Here's your 4K Valley run. Hope this helps. Cores = 1090 Memory = 1650
> With 4K the pixel density is so good 4X AA is not really needed. 51FPS vs. 38FPS
> 
> (295x2) Valley Settings - 4K - Ultra - AA 4X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (295x2) Valley Settings - 4K - Ultra - No AA
> 
> 


Thanks!! Not too bad at 4xAA. Valley is not kind to AMD (drivers). BTW - hit F12 at the end of the run and post your results to the Valley thread - it helps to have OCN members see the run.


----------



## Jpmboy

It would be very helpful if, as you guys benchmark these cards, you post to the local threads:

http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30 (single gpu only...)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jpmboy*
> 
> Would be very helpful if as you guys benchmark these cards to post to the local threads:
> 
> http://www.overclock.net/t/1443196/firestrike-extreme-top-30
> http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
> http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
> http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30 (single gpu only...)


i lol'd at the last one


----------



## NavDigitalStorm

I have the system with two of them in 4-way CrossFire all to myself this weekend. Benchmarking non-stop.


----------



## Cool Mike

You may see 30K+ firestrike runs.


----------



## NavDigitalStorm

Are you planning on putting a full waterblock on your card?


----------



## The Mac

It would seem pointless to pay the premium for the AIO solution just to remove it and add a full cover waterblock.

Hopefully AIBs will make preblocked skus.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *The Mac*
> 
> It would seem pointless to pay the premium for the AIO solution just to remove it and add a full cover waterblock.
> 
> Hopefully AIBs will make a premade one,


I'm pretty sure the AIO was included because it was necessary. With that said, people who spend $1,500 on GPUs usually don't mind spending another $300 on a cooling solution.


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> You may see 30K+ firestrike runs.


that's what I'm talkin about.


----------



## NavDigitalStorm

Well I did get around 19K on FireStrike Extreme.


----------



## Jpmboy

Nice - I missed that. I did see the 18.3K FS score, which is very good!


----------



## NavDigitalStorm

Only a couple of hours till the precious is mine


----------



## EliteReplay

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Only a couple of hours till the precious is mine


Are you getting 2 of these puppies?


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Im pretty sure the AIO was included beacause it was necessary. With that said, people who *spend $1,500 on GPU's usually don't mind spending another $300* on a cooling solution.


^^ *this.*

But probably more like $150 for a block... I hope.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *EliteReplay*
> 
> are you getting 2 of this puppys?


This

http://instagr.am/p/m6OHQdPBD5%2F/

will be in my home in about 2 hours.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> ^^ *this.*
> 
> but probably more like 150 for a block.. i hope.



I added the extra cost just in case people need a radiator, pump and reservoir.


----------



## EliteReplay

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> This
> 
> http://instagr.am/p/m6OHQdPBD5%2F/
> will be in my home in about 2 hours.


I posted that picture yesterday on a Facebook page: https://www.facebook.com/groups/westcoastmods/


----------



## NavDigitalStorm

Quote:


> Originally Posted by *EliteReplay*
> 
> i posted that picture yestarday on a Facebook page https://www.facebook.com/groups/westcoastmods/


----------



## heroxoot

Quote:


> Originally Posted by *EliteReplay*
> 
> R9 295x2 ITX Build


How can something so powerful be so cute?


----------



## Sgt Bilko

Quote:


> Originally Posted by *EliteReplay*
> 
> i posted that picture yestarday on a Facebook page https://www.facebook.com/groups/westcoastmods/


oh man....dem comments


----------



## EliteReplay

Quote:


> Originally Posted by *Sgt Bilko*
> 
> oh man....dem comments


yeah, you know what people do when they see something about AMD... they just get full of hate.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *EliteReplay*
> 
> yeah, you know what people do when they see something about AMD... they just get full of hate.


Fanboys... WHY DO YOU EXIST!!!


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Fanboys... WHY DO YOU EXIST!!!


Everywhere, on all sides.

I hope someone brings out a 295x2 with a quality WB like: http://www.newegg.com/Product/Product.aspx?Item=N82E16814129299.
Voided warranties = hate.


----------



## NavDigitalStorm

37,640 graphics score on Fire Strike!!!


----------



## EliteReplay

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> 37,640 graphics score on Fire Strike!!!


OMG, but post the picture!!!


----------



## NavDigitalStorm

Here is a picture of the complete rig when it is on.


http://instagr.am/p/m9FF3evBHZ%2F/


----------



## Cool Mike

Had a feeling you would hit 35K plus. Excellent scaling. I'm hitting 18K plus with one 295x2.

What's the PSU wattage on that monstrosity?

The overclocking software out there right now hasn't been updated for the 295x2 yet. Tried Afterburner, Trixx and GPU Tweak. Nothing works. I need about 30-50mV more and believe I would be 1100 core stable running 4K maxed.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Had a feeling you would hit 35K plus. Excellent scaling. I'm hitting 18K plus with one 295x2.
> 
> What's the PSU wattage on that monstrosity?
> 
> The overclocking software out there right now hasn't been updated for the 295x2 yet. Tried Afterburner, Trixx and GPU Tweak. Nothing works. I need about 30-50mV more and believe I would be 1100 core stable running 4K maxed.


Nice! This 450D with these two in there is a BEAST!


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> 37,640 graphics score on Fire Strike!!!


Quote:


> Originally Posted by *EliteReplay*
> 
> OMG but put the picture !!!


Better yet - post the validation URL (yeah, I know the AMD driver is not yet approved by FM). I'm a scientist... a professional skeptic.

I hope they can do better than that.

I loved my 7970's.


----------



## Elmy

Quote:


> Originally Posted by *Cool Mike*
> 
> Had a feeling you would hit 35K plus. Excellent scaling. I'm hitting 18K plus with one 295x2.
> 
> What's the PSU wattage on that monstrosity?
> 
> The overclocking software out there right now hasn't been updated for the 295x2 yet. Tried Afterburner, Trixx and GPU Tweak. Nothing works. I need about 30-50mV more and believe I would be 1100 core stable running 4K maxed.


PSU looks like a Silverstone ST1500. Same one I will be getting for my setup.


----------



## Cool Mike

Newegg has many different brand names now. 6 or 7, I guess.

Looking like Monday


----------



## Cool Mike

With two you would need a 1500W for sure. My rig is pulling about 900W (max load) at the wall, as indicated on my UPS. Actual PSU output is around 810W considering PSU efficiency. My EVGA 1300W is handling the stress very well. It would be pushing it with two 295x2's, but it could handle one additional 290x for sure.
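The wall-versus-PSU arithmetic above can be sketched in a few lines of Python (the 90% efficiency figure is an assumption for illustration; check your own unit's rating):

```python
def psu_dc_load(wall_watts: float, efficiency: float) -> float:
    """DC power the PSU actually delivers, given the draw measured at the wall."""
    return wall_watts * efficiency

# Cool Mike's numbers: ~900W at the wall; assuming a ~90%-efficient unit
print(psu_dc_load(900, 0.90))  # roughly 810W of actual PSU output
```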


----------



## NavDigitalStorm

Yup, I'm seeing about a 1370W peak load on the system.


----------



## Jpmboy

Quote:


> Originally Posted by *Elmy*
> 
> PSU looks like a Silverstone ST1500. Same one I will be getting for my setup.


Check the ST1500... 25A per PCIe rail. It's a great PSU (I have one), but if you push the card(s) hard it will trigger OCP and shut down.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> Check the ST1500 ... 25A per pcie rail. It's a great PSU (I have one) but if you push the card(s) hard it will trigger OCP and shutdown.


I've had it run FireStrike on loop for two hours without an issue.


----------



## NavDigitalStorm

Here is my 3DMark run...

http://www.3dmark.com/3dm/2911077?


----------



## NavDigitalStorm

Quick question: how do I embed an Excel-like sheet on the OP, the way other owners clubs do? Google Docs?


----------



## Elmy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I've had it run FireStrike on loop for two hours without an issue.


Was that overclocked? Do you have a Kill A Watt to measure the wattage being used?

Just curious. I was planning on getting the ST1500 as well.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> Was that overclocked? Do you have a kill-a-watt to measure the wattage being used?
> 
> Just curious. I was plan on getting the ST1500 as well.


No OC on the GPUs; 4.5GHz at 1.35V on the 4930K.

Pulled about 1370W on the Kill A Watt at the wall.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Marc VS*
> 
> Thanks for the great video and the benchmarks! It's an impressive GPU, and the competition between AMD and Nvidia is more interesting than ever. AMD's closed-loop liquid cooling system is innovative. The Radeon R9 295X2 is the fastest GPU to date, and it has higher floating-point performance than the Titan Z. I would be really curious to see some rendering demos.
> 
> The Titan Z has a few more shading units and TMUs than the 295X2. Taking into account the specs and price tags of the two GPUs, I think the R9 295X2 is gaming-oriented and the Titan Z is video-oriented (i.e. intended for professionals who work on 3D modeling, effects and animations).
> The Titan Z is definitely too expensive for gamers, while AMD might be an option for some, as this owners club proves (btw, I'm very jelly as well...). Here is a comparison of the two.
> Overall, I think AMD is better taking into account the price-quality ratio. Looking forward to hearing more about it!


If you could recommend some software to benchmark these two against each other, I'll gladly do it!


----------



## bossie2000

Are there any new world records yet???


----------



## NavDigitalStorm

Quote:


> Originally Posted by *bossie2000*
> 
> Is there any new world records yet???


Not #1, as those cards are likely cooled with LN2. Definitely broke some top-5 and top-10 scores. I believe I took the #6 spot on Heaven right now.


----------



## bossie2000

Quote:


> I believe I took the # 6 spot on Heaven right now.


Nice work!!


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I've had it run FireStrike on loop for two hours without an issue.


Looping Firestrike at OEM clocks shouldn't trip OCP on that PSU (at all). It really depends on whether you exceed 25A (= 300W) on any given PCIe rail of the ST1500. There are 8 independent rails(?). Anyway, after talking to Silverstone techs, I made a set of PCIe cables that combine rails so that each 8-pin has access to 50A. Nearly 1000W through a single KPE and the ST1500 hangs tough (PC Power & Cooling 1200 running the rest of the kit). With my 290X and VRM commands, it could pull over 500W. Each 8-pin is spec'd at 250W. When you overclock and overvolt the card, it's easy to ask that PCIe 8-pin for 400W.
Say - do you have a link to that 37K Firestrike graphics score? Curious re: clocks.

derped.

Firestrike quads: http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+performance+preset/version+1.1/4+gpu - only the top 2-3 are LN2 (not easy for 4 cards!)


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Quick questions, how do I embed an excel like sheet that other owner's clubs implement on the OP? Google Docs?


Yeah - open up a folder in Docs (wait, let me check mine)... create a frame, then publish the Google doc:

https://docs.google.com/spreadsheet/pub?key=0ArgpMyj43ZFjdFVYNUUzZ04xRHBLSGZPdzc1a01ERXc&output=html&widget=true

The [xxxxx] is the doc pointer, which comes from Google; else it makes the frame... in this post. Oops, it did make the frame. *I can send you the code if you are interested... or PM Alancsalt*


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Not #1 as those cards are likely cooled with LN2. Definitely broke some top 5 and top 10 scores. I believe I took the # 6 spot on Heaven right now.


Post your Heaven score here and I'll add it to the top 30.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> post your Heaven score here. I'll add it to the top30.


Done!

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/2170#post_22139091


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Done!
> 
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/2170#post_22139091


sent PM.


----------



## Cool Mike

Nav, have you tried Afterburner? No voltage control, but I would like to use it for monitoring and for setting core and memory frequencies.

*One thing I noticed is that the max memory freq in Afterburner is 1625, but in CCC it's 2000. Afterburner should duplicate the limits set by AMD in CCC. Thinking a new Afterburner revision is needed.*


----------



## NavDigitalStorm

I haven't messed with OCing too much at the moment. Don't want to blow the PSU in there or these two cards.


----------



## Elmy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> No OC on the GPU, a 4.5ghz at 1.35v on the 4930K.
> 
> Pulled about 1370w on the kill-a-watt at the wall.


Would that ST1500 handle the CPU @ 4.8 and all 4 GPUs @ 1100/1650?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> Would that ST1500 handle the CPU @ 4.8 and all 4 GPUs @ 1100/1650?


It really depends on the silicon lottery I suppose.


----------



## Jpmboy

Quote:


> Originally Posted by *Elmy*
> 
> Would that ST1500 handle the CPU @ 4.8 and all 4 GPUs @ 1100/1650?


Sure, just don't overvolt the GPUs too much. Take a look at the EVGA G2 - lots of users here really like it. Single-rail type.


----------



## Elmy

Quote:


> Originally Posted by *Jpmboy*
> 
> sure. just don't overvolt the gpus too much. Take a look at the EVGA G2. Lot's of users here really like it. Single rail type.




What would be the best PSU for running 2 of these 295X2's with either a 4770K or a 4930K?


----------



## Jpmboy

Quote:


> Originally Posted by *Elmy*
> 
> What would be the best PSU for running 2 of these 295X2's with either 4770K or 4930K


If you stay close to the 500W/card TDP with a 4770K, the Silverstone is fine, but so would be most any good quality 1200W unit (Corsair, PC Power & Cooling). The advantage of the single-rail design is that you have access to the entire power output - overcurrent protection trips above 1200W (actually more like 1300W+)... but that's also the liability: all the amps are available to every connection. Check out the EVGA 1300 G2.

Add up your power use... an OC'd 4930K will pull 130W and higher. 2x500 + 150 CPU + fans + ... etc.

edit: the ST1500 can deliver a theoretical 50A with an 8+6 pin PCIe config IF each cable is on its own rail. DON'T use the single PSU-plug 8+6 pin PCIe cable for one card... limited to 300W +/-.
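Jpmboy's back-of-the-envelope budget can be sketched like this (the card and CPU figures are the ones he cites; the fans/drives allowance is my own placeholder):

```python
# Rough system power budget for two R9 295X2's, per the figures above
budget_watts = {
    "R9 295X2 #1": 500,        # ~500W TDP per card
    "R9 295X2 #2": 500,
    "overclocked 4930K": 150,  # "130W and higher"
    "fans, drives, board": 100,  # assumed catch-all, not from the thread
}

total = sum(budget_watts.values())
headroom = 1500 - total  # against a 1500W unit like the ST1500

print(f"total draw ~{total}W, PSU headroom ~{headroom}W")
```

With these numbers the DC-side total lands around 1250W, which is why the 1200W single-rail units discussed above would be marginal for a heavily overclocked pair.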


----------



## NavDigitalStorm

My fellow members, WE HAVE JUST TAKEN THE #5 SPOT on Heaven!!!

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/2170#post_22139996


----------



## Elmy

Quote:


> Originally Posted by *Jpmboy*
> 
> If you stay close to the 500W/card TDP with a 4770K, the Silverstone is fine, but so would be most any good quality 1200W unit (Corsair, PC Power & Cooling). The advantage of the single-rail design is that you have access to the entire power output - overcurrent protection trips above 1200W (actually more like 1300W+)... but that's also the liability: all the amps are available to every connection. Check out the EVGA 1300 G2.
> 
> Add up your power use... an OC'd 4930K will pull 130W and higher. 2x500 + 150 CPU + fans + ... etc.
> 
> edit: the ST1500 can deliver a theoretical 50A with an 8+6 pin PCIe config IF each cable is on its own rail. DON'T use the single PSU-plug 8+6 pin PCIe cable for one card... limited to 300W +/-.


Navi is saying he's pulling 1370 watts with his 2 295X2's.

I will need a 1500-watt PSU to push these.


----------



## Cool Mike

EVGA 1300 G2 in my RIG. Highly recommended.







Running two 295x2's with everything overclocked may push the EVGA to the MAX. Single rail is best. Thinking EVGA makes a 1500-1600 watt?


----------



## NavDigitalStorm

I highly recommend the power supply that I am using for my dual configuration. If you are not in a rush, I suggest waiting to see what Corsair's new 1500 watt power supply is like.


----------



## Cool Mike

Good info. Here's the link. Titanium Rated!

http://news.softpedia.com/news/Corsair-Readies-1500W-80-Plus-Titanium-PSU-the-AX1500i-433908.shtml


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cool Mike*
> 
> Good info. Here's the link. Titanium Rated!
> 
> http://news.softpedia.com/news/Corsair-Readies-1500W-80-Plus-Titanium-PSU-the-AX1500i-433908.shtml


I'd hate to see the launch price in Aus for that beast


----------



## Jpmboy

Quote:


> Originally Posted by *Elmy*
> 
> Navi is saying he's pulling 1370 watts with his 2 295X2's.
> 
> I will need a 1500 watt PSU to push these.


Uh - that's at the wall. Multiply by ~80% efficiency and that's about 1100 watts from the PSU. The point is the voltage rails, not the total power of the PSU. I think you've made up your mind already.









Pick up a P3 Kill A Watt meter - like $20
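To make the at-the-wall vs. DC-side distinction concrete - a quick sketch, assuming the ~80% efficiency figure quoted above (a wall meter reads AC input, but the PSU only has to deliver the DC side):

```python
# Wall draw vs. DC load: a kill-a-watt reads AC power at the wall, but the
# PSU rating is about DC output. At ~80% efficiency (an assumption from the
# thread), Navi's 1370 W at the wall is roughly 1100 W of actual load.

def dc_load(wall_watts, efficiency=0.80):
    """Approximate DC-side load from an at-the-wall reading."""
    return wall_watts * efficiency

print(round(dc_load(1370)), "W of actual DC load")
```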


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> My fellow members, WE HAVE JUST TAKEN the #5 spot on Heaven!!!
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/2170#post_22139996


Good job!

link to the Top30. http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores


----------



## prjindigo

An important little warning for all of you puppies.

The maximum engineering rating of the 4x4 "8 pin" Molex connector is 24A. The manufacturers are insisting it'll pull 28A per socket. Make god-damned sure every little tiny bit of all the plugs is in full contact with the pins in the socket.
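A rough way to sanity-check that warning against the 295x2's numbers - assuming the card's 500W board power, the 75W PCIe slot allowance, and an even split across its two 8-pin plugs (all assumptions for illustration, not measurements):

```python
# Back-of-envelope current through each 8-pin PCIe connector on a 295x2.
# Assumes 500 W board power, 75 W drawn from the PCIe slot, and the remainder
# split evenly over the card's two 8-pin plugs on the 12 V rail.

CARD_TDP_W = 500
SLOT_W = 75            # PCIe x16 slot power allowance
CONNECTORS = 2
RAIL_V = 12.0

amps_per_connector = (CARD_TDP_W - SLOT_W) / CONNECTORS / RAIL_V
print(f"{amps_per_connector:.1f} A per 8-pin connector")
```

That lands in the high teens of amps per plug at stock, well past the 150W (12.5A) the 8-pin spec assumes - which is why full pin contact matters here.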


----------



## ssiperko

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> My fellow members, WE HAVE JUST TAKEN the #5 spot on Heaven!!!
> 
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/2170#post_22139996


Nice!!!

but (there's always one) .... a 290X quad is still at the top.









SS


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ssiperko*
> 
> Nice!!!
> 
> but (there's always one) .... a 290X quad is still at the top.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SS


That quad was probably watercooled or extreme cooled haha


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> That quad probably was watercooled or extreme cooled haha


nah, they are just overclocked and watercooled, nothing exotic... the 295x2 is water cooled also, well, to a _degree_.
Can you determine the diameter of the tubing the AIO uses? Do the rad-side connections look like they can be separated? I really want to get 1 or 2 of these cards, but not with that AIO cooler. Still no waterblocks on the market, but EK has one in the works.


----------



## MunneY

Ok. So now my question is will they tri fire with a 290x


----------



## Sgt Bilko

Quote:


> Originally Posted by *MunneY*
> 
> Ok. So now my question is will they tri fire with a 290x


Don't see why not - the 7990 worked with the 7970, the 6990 with the 6970, etc.

Still the same GPU, just clocked higher than Ref


----------



## MunneY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Don't see why not - the 7990 worked with the 7970, the 6990 with the 6970, etc.
> 
> Still the same GPU, just clocked higher than Ref


That's what I would have thought, but you can never be positive with these things


----------



## pompss

Hope at midnight Newegg will put the 295x2 up for order.
Guys, do you think I need to switch my Seasonic SS-850KM 80 Plus?
In case I have to, what is the best 1000W PSU for a max of 200 dollars?
I've always used Seasonic and never had a problem.
Thanks


----------



## NavDigitalStorm

That Seasonic could do the job - my total rig with an OC'd 4960X ran at about 730W.

The Corsair RM 1000 might be a good option.


----------



## pompss

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> That seasonic could do the job as my total rig with an OC 4960X ran at about 730W.
> 
> The Corsair RM 1000 might be a good option.


Thanks for the info








Now let's see if I get one tonight


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> Thanks for the info
> 
> 
> 
> 
> 
> 
> 
> 
> Now Let see if i get one Tonite


----------



## shadow85

Hey, are these available to buy yet? I can't find them anywhere. When are they supposed to be available in stores?


----------



## axiumone

Nav, do you have a Corsair Air 540 on site? Do you think it can accommodate the 295 with the rads mounted up front?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Nav, do you have a corsair air 540 on site? Do you think it can accommodate the 295 with the rads mounted up front?


I do, but I don't have the time this week to test it. We have it fitting in the Corsair 450D, which I believe has the same interior space.

I also added quad-fire benchmarks to the OP!


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I do but don't have the time this week to test it. We have it fitting the Corsair 450D which I believe has the same interior space.
> 
> I also added quad-fire benchmarks to the OP!


Had a chance to take a closer look at the specs. Max GPU length is 320mm for the Air 540 and 430mm for the 450D. Looks like I have to go with the 450D.

It's 12:13am EST and Newegg hasn't opened any of the cards for sale yet.


----------



## pompss

Quote:


> Originally Posted by *shadow85*
> 
> Hey are these available to buy yet? I can't find them anywhere? When are they suppose to be available to stores?


April 21st.
If you want to get one, I suggest you wait for Newegg to open orders at midnight (hopefully).
You can order it from NCIX, but they are pretty expensive ($1699)


----------



## pompss

Quote:


> Originally Posted by *axiumone*
> 
> It's 12:13am est. and newegg hasnt opened any of the cards for sale yet.


Most likely it will be West Coast time (3:00am EST)


----------



## NavDigitalStorm

Be sure to post your pics and proof so I can add you to the list!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Had a chance to take a closer look at the specs. Max gpu length for the air 540 is 320mm and 430mm for the 450D. Looks like I have to go with the 450D.
> 
> It's 12:13am est. and newegg hasnt opened any of the cards for sale yet.


You could always mount the radiator/fan on the top.


----------



## shadow85

Lol I can't buy from newegg anyway because I am in Australia. They don't ship here.

Have no idea when it will be available here.


----------



## Sgt Bilko

Quote:


> Originally Posted by *shadow85*
> 
> Lol I can't buy from newegg anyway because I am in Australia. They don't ship here.
> 
> Have no idea when it will be available here.


Newegg ships to Aus now, just not everything


----------



## Vashanime04

Can't wait till they release it on Newegg. The 4870X2 I have in my PC needs replacing, along with the mobo, CPU, and RAM while I'm at it.


----------



## EliteReplay

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> That seasonic could do the job as my total rig with an OC 4960X ran at about 730W.
> 
> The Corsair RM 1000 might be a good option.


I wouldn't recommend that PSU to anyone - there are far better options on the market.

The Cooler Master V1000 is a better choice.
There are some Seasonic ones too...

Those PSUs from Corsair are just that - the NAME.


----------



## Sgt Bilko

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202108

just became available









At least I think so - it says unavailable to me, but that's because I'm in Aus









EDIT: Just switched to US so it's out


----------



## NavDigitalStorm

Can't wait to start adding all of you to the roster! Once I figure out how to embed this darn Google spreadsheet, haha.


----------



## Vashanime04

Yes!!!!!!!! I was hoping they would get in the Sapphires. I just put in my order. Now the waiting game... ugh, it's the hardest part


----------



## Elmy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202108
> 
> just became available
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least i think, says unavailable to me but that's because im in Aus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Just switched to US so it's out


They are normally first... it's their factory where these are all made.


----------



## NavDigitalStorm

Test https://docs.google.com/spreadsheets/d/1LkkqLWiw_Ha7h_ThM2QLi4V7APlVvbwPX8igbDBLaTY/pubhtml?widget=true&headers=false


----------



## NavDigitalStorm

OP has been updated with the Owners List!


----------



## Sgt Bilko

14.4 Driver has been released

http://www2.ati.com/drivers/amd-catalyst-14.4-rc-v1.0-windows-apr17.exe


Spoiler: Improvements and Warnings



Feature Highlights of The AMD Catalyst™ 14.4 Release Candidate Driver for Windows
Support for the AMD Radeon™ R9 295X2
CrossFire™ fixes and enhancements:
Crysis 3 - frame pacing improvements
Far Cry 3 - 3 and 4 GPU performance improvements at high quality settings, high resolution settings
Anno 2070 - Improved CrossFire scaling up to 34%
Titanfall - Resolved in game flickering with CrossFire enabled
Metro Last Light - Improved Crossfire scaling up to 10%
Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync Enabled
Full support for OpenGL 4.4
OpenGL 4.4 supports the following extensions:
ARB_buffer_storage
ARB_enhanced_layouts
ARB_query_buffer_object
ARB_clear_texture
ARB_texture_mirror_clamp_to_edge
ARB_texture_stencil8
ARB_vertex_type_10f_11f_11f_rev
ARB_multi_bind
ARB_bindless_texture
ARB_sparse_texture
ARB_seamless_cubemap_per_texture
ARB_indirect_parameters
ARB_compute_variable_group_size
ARB_shader_draw_parameters
ARB_shader_group_vote
Mantle beta driver improvements:
BattleField 4: Performance slowdown is no longer seen when performing a task switch/Alt-tab
BattleField 4: Fuzzy images when playing in rotated SLS resolution with an A10 Kaveri system
Known Issues
System will TDR or BSOD when encoding with Power Director 11
Driver installation might result in a black screen when installing on a Dual AMD Radeon R9 295X configuration under Windows 8.1 on specific platforms (see below). The issue can be overcome by rebooting the PC; upon reboot the display driver will be installed. The remaining Catalyst components can then be installed.
ASUS Crosshair V Formula-Z (990FX)
ASUS Maximus VI Extreme (Z87)
ASUS Rampage IV Extreme (X79)


----------



## NavDigitalStorm

Re-benchmarking with latest drivers, will update here!


----------



## SLADEizGOD

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Re-benchmarking with latest drivers, will update here!


What PSU are you using
Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Here is a picture of the complete rig when it is on.
> 
> 
> __
> http://instagr.am/p/m9FF3evBHZ%2F/


How did you get two of them in that case? The cards look sweet. By the way, what PSU are you using?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *SLADEizGOD*
> 
> What PSU are you using
> How did you get two of them in that case. But the cards look sweet. by the way wha PSU are you using?


A Silverstone ST1500, I believe. Heaven and 3DMark are giving me the same scores, but BF4 gave me a 10fps boost O_O.


----------



## SLADEizGOD

Well, I guess I'll be saving up to pick one up. I'm running 1440p, so I'll just be getting one. Now it's on like Donkey Kong.. lol


----------



## NavDigitalStorm

Here is my updated benchmark....


----------



## Cool Mike

Wondering if it's the same driver I received with my HIS, 14.4.

By the way, I'm sure everyone knows that the reference 295x2 is identical across all the brands. The only differences are labeling, retail box, possibly warranty, and accessories, which are specified by the respective graphics card vendors like HIS, Sapphire, PowerColor and so on...


----------



## Cool Mike

Time for a 4K monitor.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Wondering if it's the same driver I received with my HIS, 14.4.
> 
> By the way, I'm sure everyone knows that the reference 295x2 is identical across all the brands. The only differences are labeling, retail box, possibly warranty, and accessories, which are specified by the respective graphics card vendors like HIS, Sapphire, PowerColor and so on...


When was your driver compiled?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Time for a 4K monitor.


Yup! No crappy TN panel either - you need this ASUS PQ321Q that I use.


----------



## Cool Mike

Thinking it was early April. I will verify the compile date when I get home tonight. I hope AMD made some changes.

I received my Samsung 4K monitor (TN) about 1.5 weeks ago and it is much better than I anticipated. Overall great reviews. I came from a 1440p IPS. Picked it up from NCIX for $599.99 when they were available for maybe one day.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Thinking it was early April. I will verify the compile date when I get home tonight. I hope AMD made some changes
> 
> I received my Samsung 4K monitor (TN) about 1.5 weeks ago and is much better than I anticipated. Overall great reviews. I came from a 1440P IPS. Picked it up from NCIX for $599.99 when they were available for maybe one day.


I still haven't seen that Samsung panel in person so I can't really say. My drivers were compiled on April 17th.


----------



## axiumone

Well.. I caved. I was hoping to hold out until either a gigabyte or an msi card became available, but patience is not one of my strong suits.


----------



## NavDigitalStorm

EDIT; Double post.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Well.. I caved. I was hoping to hold out until either a gigabyte or an msi card became available, but patience is not one of my strong suits.


Wooh! Post a picture once you receive them and I'll add you to the roster


----------



## ColeriaX

Excited as ever to get this in. Bye Bye XFire 6950's you served me well.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> Excited as ever to get this in. Bye Bye XFire 6950's you served me well.


Be sure to post a picture and tag me so I can add you to the roster when you receive it!


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> Thinking it was early April. I will verify the compile date when I get home tonight. I hope AMD made some changes
> 
> *I received my Samsung 4K monitor (TN) about 1.5 weeks ago and is much better than I anticipated.* Overall great reviews. I came from a 1440P IPS. Picked it up from NCIX for $599.99 when they were available for maybe one day.


I agree, the TN has very good response time and the colors pop... it's just that 4K on a 28-inch screen is way too dense for me. Gave it to my wife for her tax business. Waiting for a 40-inch 60Hz panel.

Ordered a PowerColor 295x2... should be here in "two business days". Gonna have that AIO thing hanging off the side of my bench rig.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> I agree, the TN has very good response time and the colors pop... it's just that 4K on a 28-inch screen is way too dense for me. Gave it to my wife for her tax business. Waiting for a 40-inch 60Hz panel.


With Windows 8.1's new scaling, I think 28" might be good actually. The 32" I use looks like a "retina" version of my 1440p 27".


----------



## Cool Mike

The scaling is great with Win 8.1. Guys, this Samsung is great in my eyes. Look at the reviews, This is a great TN 4K monitor.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> The scaling is great with Win 8.1. Guys, this Samsung is great in my eyes. Look at the reviews, This is a great TN 4K monitor.


Yup, I made the mistake of judging it on Windows 7. The new 8.1 scaling is pretty damn awesome. It's a tough call, however: get a 4K panel or the RoG Swift?


----------



## Cool Mike

Excellent Price on the EVGA 1300 G2 PSU. Mine has run flawlessly for about 4 months.

Guessing all the 295x2's are priced at 1499.99?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Excellent Price on the EVGA 1300 G2 PSU. Mine has run flawlessly for about 4 months.
> 
> Guessing all the 295x2's are priced at 1499.99?


The new revision G2's are actually pretty good.

And yeah, that's the MSRP.


----------



## Cool Mike

I moved my 1440p to my daughter's computer upstairs. A nice step up from 1440p to 4K. Pixel density is king.









Now the 120hz 1440P ROG will be sweet also.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> I moved my 1440P to my daughters computer upstairs. A nice step up from 1440P to 4K. Pixel density is King.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now the 120hz 1440P ROG will be sweet also.


It's going to be tough deciding which one to keep lol.


----------



## shadow85

Quote:


> Originally Posted by *Jpmboy*
> 
> I agree, the TN has very good response time and the colors pop... it's just that 4K on a 28-inch screen is way too dense for me. Gave it to my wife for her tax business. Waiting for a 40-inch 60Hz panel.


I was just wondering which model Samsung 4K monitor you got? I currently have a 1440p Samsung S27B970D PLS monitor and the colour quality on it is outstanding.

So I'm a bit sceptical about getting a 4K monitor atm - worried the 4K monitor's colour won't match that of my current Samsung PLS monitor.


----------



## Cool Mike

Here's a link on the Samsung 4K.

http://www.newegg.com/Product/Product.aspx?Item=0JC-0007-00009


----------



## pompss

I figured out that the card is too long for my case.
Also, I found two 290s for 840 dollars.
Great card for sure, but I think Nvidia and AMD shouldn't charge such high prices.
When the GTX 690 and 7990 came out, the price was $999.
I really don't understand the $1500 price target if two 290s sell for $880 new (half price).
$999 would be an honest price.
But this is my opinion.


----------



## Cool Mike

The 295x2 has two 290x GPU's. Not 290 GPU's.


----------



## pompss

Quote:


> Originally Posted by *Cool Mike*
> 
> The 295x2 has two 290x GPU's. Not 290 GPU's.


Doesn't matter.
Two 290X's brand new sell for $1100 - the 295x2 is still $400 more, and they outperform it.
Also, if you search on eBay and Amazon you'll find used 290X's for 400-440.
The 295x2 is still overpriced for what it really offers.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> Doesn't matter.
> Two 290X's brand new sell for $1100 - the 295x2 is still $400 more, and they outperform it.
> Also, if you search on eBay and Amazon you'll find used 290X's for 400-440.
> The 295x2 is still overpriced for what it really offers.


The R9 295X2 outperforms two R9 290X and runs cooler. Also, it only takes up two slots making it viable in smaller builds.

Sure, there is a premium but you are buying a premium product.

All those used cards are from miners getting rid of them, I would not purchase a used R9 series right now.


----------



## imran27

Quote:


> Originally Posted by *pompss*
> 
> Doesn't matter.
> Two 290X's brand new sell for $1100 - the 295x2 is still $400 more, and they outperform it.
> Also, if you search on eBay and Amazon you'll find used 290X's for 400-440.
> The 295x2 is still overpriced for what it really offers.


Well, that's the premium of having two monsters in one cave... remember that these 295X2's can overclock a hell of a lot better than the R9 290/290Xs - I mean, 1150-1200MHz core and 1600-1700MHz mem is just a couple of clicks and ticks away. If you consider overclocking, plus the fact that they have Hynix memory, you're getting more than what you're paying for.


----------



## Devotii

Only up as a pre-order on Scan and Overclockers for UK sales, and not even on our Amazon, lol. Retailing for around £1100 to £1200. I love the idea that more cards in the future will have AIO coolers attached!








This is outside my "reasonable" price limit!







but it's most likely worth it!

On overclockers: ETA: 02/05/14 (2nd MAY!)


----------



## Cool Mike

As we know, in a vertical case like most, the top card in a crossfire configuration runs significantly hotter than the bottom one, limiting the overclock. I just sold two excellent 290X Lightnings and went to the 295x2, mainly for the water cooling: the GPU heat is exhausted outside the case, it's as fast or faster than 2x 290X, and it leaves more room inside the case for goodies like a PCIe SSD. In my view that's worth $300 more over a 2x 290X crossfire setup.


----------



## EliteReplay

I envy you guys







really happy to see people getting this beastly card. If only I had the money


----------



## Sgt Bilko

Aquacomputer are the first to showcase a block for this: http://www.techpowerup.com/200106/aqua-computer-announces-radeon-r9-295x2-full-cover-block.html
Quote:


> The block with a copper base is available for 169.90 Euro while a nickel plated variant will follow soon for 184.90 Euro. Both variants will be also available with a smoked Plexiglas top.
> 
> Those who look for the ultimate cooling experience will be also able to purchase a passive and active backplate. They will be available later in May; prices are TBA.


----------



## Jpmboy

Quote:


> Originally Posted by *shadow85*
> 
> I was just wondering which model samsung 4k monitor u got? I currently have a 1440p Samsung S27B970D PLS monitor and the colour quality on this is outstanding.
> 
> So im a bit sceptical on getting a 4k monitor atm, that 4k monitor color wont match that of my current samsung pls monitor.


Samsung U28D590. Great monitor, just too dense for me (Win 8.1 or Win 7). The wife loves it on her [email protected] rig.


----------



## Jpmboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Aquacomputer are the first to showcase a block for this: http://www.techpowerup.com/200106/aqua-computer-announces-radeon-r9-295x2-full-cover-block.html


----------



## MunneY

Quote:


> Originally Posted by *Jpmboy*


Wellllllll by the looks of that you could easily make that a single slot card :-D


----------



## wermad

Would this beast still run @ 8x 3.0? Essentially splitting it 4x 3.0 for each core?


----------



## Jpmboy

Quote:


> Originally Posted by *wermad*
> 
> Would this beast still run @ 8x 3.0? Essentially splitting it 4x 3.0 for each core?


8x? it's 3.0 x16 bro.


----------



## wermad

Quote:


> Originally Posted by *Jpmboy*
> 
> 8x? it's 3.0 x16 bro.


??? Don't understand, but keep in mind 3.0 is ~twice as fast as 2.0, so technically x4 3.0 ~ x8 2.0. I'm helping a buddy piece together a new rig and he's going with a mATX Z87 board plus a sound card. I know a 290X will run at x4 3.0, but I want to make sure he has no issues with a 295x2 running at x8 3.0.
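The "3.0 is ~twice as fast as 2.0" rule of thumb falls straight out of the signaling rates and line encodings, and it's easy to verify. A small sketch of the per-lane math:

```python
# Per-lane PCIe throughput: gen 2 signals at 5 GT/s with 8b/10b encoding,
# gen 3 at 8 GT/s with 128b/130b -- just under twice the usable bandwidth,
# which is why x8 3.0 is roughly equivalent to x16 2.0.

def lane_mb_s(gen):
    """Usable one-direction bandwidth per lane, in MB/s."""
    if gen == 2:
        return 5e9 * (8 / 10) / 8 / 1e6     # 5 GT/s, 8b/10b -> 500 MB/s
    if gen == 3:
        return 8e9 * (128 / 130) / 8 / 1e6  # 8 GT/s, 128b/130b -> ~985 MB/s
    raise ValueError("unsupported PCIe generation")

x8_gen3 = 8 * lane_mb_s(3)
x16_gen2 = 16 * lane_mb_s(2)
print(f"x8 3.0: {x8_gen3:.0f} MB/s vs x16 2.0: {x16_gen2:.0f} MB/s")
```

So an x8 3.0 slot gives up only a couple of percent versus x16 2.0, which matches Jpmboy's observation below that dropping lanes barely moves throughput in games.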


----------



## Jpmboy

Quote:


> Originally Posted by *wermad*
> 
> ??? Don't understand but keep in mind 3.0 is ~ twice as fast as 2.0. so technically 4x 3.0 ~ 8x 2.0. I'm helping a buddy piece together a new rig and he's going w/ matx Z87 w/ a sound card. I know 290X will run at 4x 3.0 but I wanna make sure he has no issues w/ a 295x2 running 8x 3.0.


no worries, I've taken 3 SLI kingpins (1400/7800) down to 3.0 x8 and 2.0 x16.... on a 4K monitor, very little difference in throughput. He can use the concurrentbandwidth test (there's one around for AMD somewhere) to verify the PCIE channels.


----------



## wermad

Quote:


> Originally Posted by *Jpmboy*
> 
> no worries, I've taken 3 SLI kingpins (1400/7800) down to 3.0 x8 and 2.0 x16.... on a 4K monitor, very little difference in throughput. He can use the concurrentbandwidth test (there's one around for AMD somewhere) to verify the PCIE channels.


He's more concerned about running a two-GPU single card at x8 3.0. I know you can run a 290X at x4 3.0, but Nvidia doesn't allow this. Also, these cards need as much bandwidth as possible, since the CrossFire communication runs over the PCIe bus now. Looking for anyone who can confirm running x8 3.0 without issues on a 295x2







.


----------



## Joa3d43

Quote:


> Originally Posted by *wermad*
> 
> He's more concerned about running a two core single card on 8x 3.0. I know you can run a 290x at 4x 3.0 but Nvidia doesn't allow this. Also, these guys need as much bandwidth as possible as the crossfire communication runs through the pcie 3.0 now. Looking for anyone who can confirm running 8x 3.0 without issues on a 295x2
> 
> 
> 
> 
> 
> 
> 
> .


...the card's PLX chip handles the PCIe lane traffic onboard the 295x2 PCB, so if the mobo slot is PCIe 3 x16, then you will get 2x PCIe 3 x16, courtesy of the PLX; if the slot is PCIe 3 x8, then you will get 2x PCIe 3 x8, etc.


----------



## wermad

Quote:


> Originally Posted by *Joa3d43*
> 
> ...the card's PLX chip handles the PCIe lane traffic onboard the 295x2 PCB, so if the mobo slot is PCIe 3 x16, then you will get 2x PCIe 3 x16, courtesy of the PLX; if the slot is PCIe 3 x8, then you will get 2x PCIe 3 x8, etc.


Thanks


----------



## Jpmboy

Quote:


> Originally Posted by *wermad*
> 
> He's more concerned about running a two core single card on 8x 3.0. I know you can run a 290x at 4x 3.0 but Nvidia doesn't allow this. Also, these guys need as much bandwidth as possible as the crossfire communication runs through the pcie 3.0 now. Looking for anyone who can confirm running 8x 3.0 without issues on a 295x2
> 
> 
> 
> 
> 
> 
> 
> .


so you are saying that if i add another card this bandwidth will drop to 2.0...? nah.



anyway - Jo had the answer.


----------



## wermad

Quote:


> Originally Posted by *Jpmboy*
> 
> so you are saying that if i add another card this bandwidth will drop to 2.0...? nah.
> 
> anyway - Jo had the answer.


Dude, you're all over the place - it's really hard to comprehend the logic in your responses.

The original question was whether a 295x2 can run at PCIe 3.0 @ x8 speed. Apparently it can, since the PLX chip adds more lanes (and more latency, btw). My friend is going with a small form factor (SFF) build and doesn't want a huge mobo. If I eventually move on to a couple of these bad boys, my board supports 3.0 natively with AMD and I have two x16 slots available.

I think your mobo does support PCIe 3.0, but not your CPU (I had a 2700K running off a Sniper 3 Z77). If you get an IB CPU, i.e. a 3770K, it will switch to 3.0. Other than that, for single-GPU cards like your Titans, x8 2.0 should have little effect tbh.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Dude, you're all over the place its really hard to comprehend your logic in your responses.
> 
> Original question was can a 295x2 run at pcie 3.0 @ 8x speed. Apparently you can since the PlX chip adds more lanes (and more latency btw). My friend is going w/ a small form factor (sff) build and doesn't want a huge mb. If I eventually move on to a couple of these bad boys, my board supports 3.0 natively with Amd and i have two 16x slots available.
> 
> I think your mb does support it (pcie 3.0) but no your cpu (I had a 2700K running off a Sniper 3 Z77). If you get an IB cpu, ie 3770k, it will switch to 3.0. Other then that, for single core gpu's like your Ti's, 8x 2.0 should have little affect tbh.


He'd be running it (or them) on Park Banch i'd say, Rampage IV BE.


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> He'd be running it (or them) on Park Banch i'd say, Rampage IV BE.


In that case, 3.0 all the way with this setup


----------



## axiumone

Tom's posted a 295x2 crossfire review.

http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808.html

Not so great. Hopefully some good drivers will get here sooner rather than later.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Tom's posted a 295x2 crossfire review.
> 
> http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808.html
> 
> Not so great. Hopefully some good drivers will get here sooner rather than later.


They are using older drivers - I have unreleased ones that made my quad-fire fps jump from 53 to the 70s.


----------



## wermad

Running a 1300w psu, good to know you really won't need 1500w, like quad 290x's









Quote:


> Originally Posted by *axiumone*
> 
> Tom's posted a 295x2 crossfire review.
> 
> http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808.html
> 
> Not so great. Hopefully some good drivers will get here sooner rather than later.


Did you get your pair already?


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> They are using older drivers, I have unreleased ones that made my quad fps jump from 53 to 70s.


I'm glad to hear that.
Quote:


> Originally Posted by *wermad*
> 
> Running a 1300w psu, good to know you really won't need 1500w, like quad 290x's
> 
> 
> 
> 
> 
> 
> 
> 
> Did you get your pair already?


Shipped from Cali today. Hopefully by thurs or Fri.


----------



## wermad

Awesome







. Hopefully this gets those Eizo monitors up and running properly









If not, splurge on triple 4k (Samsung







).


----------



## Vashanime04

Doh. I keep seeing you guys with bigger PSUs... makes me nervous, maybe I should upgrade my Smart M 850. My 295x2 should be here tomorrow.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Vashanime04*
> 
> Doh. I keep seeing you guys with bigger Psu's.. Makes me nervous maybe I should upgrade my Smart M 850. My 295x2 should be tomorrow.


Your PSU should be fine. Just run Heaven on a loop for about an hour and check whether the cables get really hot. Reading the specs, it should be fine.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Vashanime04*
> 
> Doh. I keep seeing you guys with bigger Psu's.. Makes me nervous maybe I should upgrade my Smart M 850. My 295x2 should be tomorrow.


Whats in the rest of your rig?

Fill out the rig builder and throw it in your sig, makes it a bit easier


----------



## Vashanime04

@Sgt Bilko It should now be in my sig. I replaced quite a bit - I used to have a V8 cooler (way too big) and an i7 Bloomfield at 2.6GHz. But I think the mobo and CPU change will be best.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Vashanime04*
> 
> @Sgt Bilko It should now be in my sig. I replaced quite a bit - I used to have a V8 cooler (way too big) and an i7 Bloomfield at 2.6GHz. But I think the mobo and CPU change will be best.


Ahh ok then, looking good









You shouldn't have any issues at all


----------



## Vashanime04

Unless I grab another to crossfire. So tempting XD


----------



## Sgt Bilko

Quote:


> Originally Posted by *Vashanime04*
> 
> Unless I grab another to crossfire. So tempting XD


Then just grab a EVGA 1300 G2 along with it


----------



## King4x4

Anybody tried crossfiring a 295x2 with a 290x? Is it doable?


----------



## wermad

It should be


----------



## Devotii

http://www.overclockers.co.uk/productlist.php?groupid=701&catid=56&subid=1515

in stock for UK £1100


----------



## axiumone

Quote:


> Originally Posted by *King4x4*
> 
> Anybody tried crossfiring a 295x2 with a 290x? Is it doable?


Yeah, you can do it for sure.


----------



## Vashanime04

Heck ya it came in today. Just waiting for my new mobo/CPU to come in.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Vashanime04*
> 
> 
> Heck ya it came in today. Just waiting for my new mobo/CPU to come in.


Added to the roster!


----------



## Vashanime04

My only disappointment is that I wish it came in the metal case.. On another note It'd be hilarious to photoshop it in different locations lol like the commercial. XD


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Vashanime04*
> 
> My only disappointment is that I wish it came in the metal case.. On another note It'd be hilarious to photoshop it in different locations lol like the commercial. XD


How about that performance though 

Post your FireStrike Extreme scores!


----------



## Jpmboy

one should arrive here tomorrow. Just gotta figure out how to deal with that AIO cooler until a waterblock is out.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> one should arrive here tomorrow. Just gotta figure out how to deal with that AIO cooler until a waterblock is out.


Slap two gentle typhoons on it! Haha.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Slap two gentle typhoons on it! Haha.


ugh.. want to hook into this:


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> ugh.. want to hook into this:


Sweet mother of god...


----------



## The Mac

Holy Jebus!

Does that thing come with a RAD suit?


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Sweet mother of god...


Quote:


> Originally Posted by *The Mac*
> 
> Holy Jebus!
> Does that thing come with a RAD suit?


Ugly... I know. Rat rod 'puter.


----------



## Cool Mike

One ugly Rig.









I like it, I think?

Looks like it could make moonshine.


----------



## NavDigitalStorm

It definitely needs someone with an acquired taste haha.


----------



## Jpmboy

...is in the eye of the beholder:


----------



## Cool Mike

A sweet overclocking rig for sure.


----------



## King4x4

How's the overclocking on the 295X2? Can you get 1200+ stable, or is it nearly impossible at the moment without a waterblock?

oh wait...... two 290Xs suck about 800 watts at 1200MHz.... oh lord... Dem 2x8pins are gonna fry


----------



## Jpmboy

Quote:


> Originally Posted by *King4x4*
> 
> Hows the overclocking on 295x2? Can get 1200+ stable or is it nearly impossible at the moment without a waterblock?
> 
> oh wait...... 2x290x suck about 800watts at 1200mhz.... oh lord... *Dem 2x8pins* are gonna fry


This may be the limitation on this dual-GPU card. I should be testing one this weekend if I get time. But Joad was able to make 7990s really "sing"...


----------



## Jpmboy

Hey - anyone on 4K with this card? I just installed it and the new beta driver.... not showing 4K, whereas my R9 290X did???


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey - anyone on 4K with this card? I just installed it and the new beta driver.... not showing 4K, whereas my R9 290X did???


All my tests were on 4K. What do you mean not showing 4K? You could try wiping all the AMD drivers from your system and doing a clean install.

Also, you are using a mini DP port right? Not DVI?

EDIT: also, can you provide proof of ownership so I can add you to the roster?


----------



## Joa3d43

Quote:


> Originally Posted by *Jpmboy*
> 
> this may be the limitation on this dual gpu card. I should be testing one this weekend if I get time. But Joad was able to make 7990's really "sing"...


...tx, I am watching this 295x2 thread w/ great interest. It took quite a while though before I had full control of all voltages (GPU 1.4v, VRAM 1.7v) on the 7990s. Fortunately, they were the last ones the VB7 BIOS editor worked with re GPU VRM (new-gen Hawaii like my 290X Lightning is a no-go on that - but w/ LN2 BIOS is ok anyways)...and the 7990s have Volterra multiphase VRMs which are known to be quite flexible...in turn allowed for some 'real' PT increases and some semi-lunatic custom TDP settings.

...apart from cooling the monster 295x2 down (ie via some of the aquacomputer blocks I have seen for it), key will be getting control of the 295x2's VRM in order to get some serious oc-ing (not that they're 'weak' by any stretch of the imagination in stock trim anyways







)...wonder if future updates of MSI AB will allow at least 1.3v ?!?


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> All my tests were on 4K. What do you mean not showing 4K? You could try wiping all the AMD drivers from your system and doing a clean install.
> Also, you are using a mini DP port right? Not DVI?
> EDIT: also, can you provide proof of ownership so I can add you to the roster?


I have to use the DP to HDMI adapter; the bench rig is 4K30Hz. This may be the issue - it's not reading my EDID correctly. What is very strange is that my R9 290X was fine with it. What driver are you using.. latest WHQL? Will post a pic if I can get the thing to work right.

Quote:


> Originally Posted by *Joa3d43*
> 
> ...tx, I am watching this 295x2 thread w/ great interest. It took quite a while though before I had full control of all voltages (GPU 1.4v, VRAM 1.7v) on the 7990s. Fortunately, they were the last ones the VB7 BIOS editor worked with re GPU VRM (new gen Hawaii like my 290X Lightning is a no-go on that - but w/ LN2 BIOS is ok anyways)...and the 7990s have Volterra multiphase VRMs which are known to be quite flexible...in turn allowed for some 'real' PT increases and some semi-lunatic TDP. custom settings.
> 
> ...apart from cooling the monster 295x2 down (ie via some of the aquacomputer blocks I have seen for it), key will be getting control of the 295x2's VRM in order to get some serious oc-ing (not that they're 'weak' by any stretch of the imagination in stock trim anyways
> 
> 
> 
> 
> 
> 
> 
> )...wonder if future updates of MSI AB will allow at least 1.3v ?!?


I am waiting for a real WB (not this fischer-price nonsense







) Thus far, not a good experience... but it's early!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> I have to use the DP to HDMI adapter; the bench rig is 4K30Hz. This may be the issue - it's not reading my EDID correctly. What is very strange is that my R9 290X was fine with it. What driver are you using.. latest WHQL? Will post a pic if I can get the thing to work right.
> I am waiting for a real WB (not this fischer-price nonsense
> 
> 
> 
> 
> 
> 
> 
> ) Thus far, not a good experience... but it's early!


Yup that adapter is likely the issue.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Yup that adapter is likely the issue.


Switched to an active DP to DL-DVI adapter and it cleared it up... but still the card only allows me to select 1080p (not even 120Hz). Very frustrating! Is the included DP to HDMI adapter active or passive? Anyone know?

here's proof:



Sad that AMD borked up the resolution EDID. The R9 290X was fine with 4K via HDMI. Need to see how to get this to work... else a discounted 295X2 will appear in the Marketplace









taking the Samsung back from the wife is NOT going to go over well at all.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> Switched to an active DP to DL-DVI adapter and it cleared it up... but still the card only allows me to select 1080p (not even 120Hz). Very frustrating! Is the included DP to HDMI adapter active or passive? Anyone know?
> 
> here's proof:
> 
> 
> 
> Sad that AMD borked up the resolution EDID. The R9 290X was fine with 4K via HDMI. Need to see how to get this to work... else a discounted 295X2 will appear in the Marketplace
> 
> 
> 
> 
> 
> 
> 
> 
> 
> taking the Samsung back from the wife is NOT going to go over well at all.


Which model? (ASUS, sapphire?)


----------



## Joa3d43

Quote:


> Originally Posted by *Jpmboy*
> 
> -snip -
> 
> *taking the Samsung back from the wife* is NOT going to go over well at all.


...have you given any thought as to how you're going to fit that nifty bench setup of yours into the DOGHOUSE


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Which model? (ASUS, sapphire?)


PowerColor. Hey - what's the DIP switch towards the rear/top of the PCB??

Quote:


> Originally Posted by *Joa3d43*
> 
> ...have you given any thought as to how you're going to fit that nifty bench setup of yours into the DOGHOUSE


so you must be married...








he won't mind:


----------



## NavDigitalStorm

I believe that's the "uber" switch like on the R9 290X.

Also, added to the roster!


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I believe that's the "uber" switch like on the R9 290X.
> 
> Also, added to the roster!


have you flipped it?


----------



## axiumone

Reviewers said it's just a dual-BIOS switch. Unlike the 290X, which has silent and uber BIOS modes, on the 295X2 both BIOS modes are the same.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Reviewers said it's just a dual bios switch. Unlike the 290x that has silent and uber bios modes, on the 295x2 both bios modes are the same.


BIOS modding here we come!


----------



## ColeriaX

Has anyone replaced the stock fans and done a before/after comparison? I have some SP120s and I'm wondering if I should stick them on.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> Has anyone replaced the stock fans and done a before/after compare? I have some sp120s and I'm wondering if I should stick them on


SP120s will definitely give an improvement. I slapped them on our review unit but didn't get a chance to get numbers before we shipped it out.


----------



## ColeriaX

Sad face. Oh well, even if it only improves the delta a degree or two, it's worth it to me. Hoping EK is working on something for a more permanent solution.


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> Sad face. Oh well even if it only improves Delta a degree or two it's worth it to me. *Hoping EK is working on something for a more permanent solution*.


^^This!

best I could do w/o artifacts;

http://www.3dmark.com/3dm/2945570


----------



## The Mac

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> SP120s will give an improvement definitely. I slapped them on our review unit but didn't get a chance to get numbers before we shipped it out.


Whom did you ship the unit to, so we can keep an eye out for the review?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *The Mac*
> 
> Whom did you ship the unit to, so we can keep an eye out for the review?


I can't say anything yet but I'll post it here as soon as it comes out.


----------



## The Mac

Thanks


----------



## Jpmboy




----------



## Sgt Bilko

Quote:


> Originally Posted by *Jpmboy*


I gotta say, it's looking pretty sexy









Makes me wish I could afford one of these bad boys. Price here is $1900.


----------



## ColeriaX

UPS just called, went and put everything in the house. Will post pics after work


----------



## Jpmboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I gotta say, It's looking pretty sexy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Makes me wish i could afford one of these bad boys, Price here is $1900


$1900 is outrageous.. wait, $1500 is outrageous too.


----------



## The Mac

That's AU$, not US$.

But US$1760 is still a lot. Do you guys have a VAT?


----------



## Sgt Bilko

Quote:


> Originally Posted by *The Mac*
> 
> that AU$, not US$
> 
> but still $1760 US$ is a lot. Do you guys have a vat?


$1500 USD is the MSRP


----------



## ColeriaX

More pics to come later...


----------



## The Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> $1500 USD is the msrp


I am aware; however, your AU$1900 = US$1760, and that's an awful lot over MSRP.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> 
> 
> More pics to come later...


Added to the roster


----------



## Sgt Bilko

Quote:


> Originally Posted by *The Mac*
> 
> I am aware; however, your AU$1900 = US$1760, and that's an awful lot over MSRP.


not including import, distributor or seller fees.


----------



## The Mac

$260? That's outrageous...


----------



## gerardfraser

Must not look at this thread anymore.
Waiting to see some more benchmarks with these cards.
$2000 Canadian for the card shipped with tax, but the card looks awesome.
Damn, I could buy 6 used R9 290 cards for less than one R9 295X2.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *gerardfraser*
> 
> Must not look at this thread anymore.
> Waiting to see some more benchmarks with these cards.
> 2000 Canadian for the card shipped with tax but the card looks awesome.
> Damn I can buy 6 R9 290 cards used and still be cheaper than one R9 295X2.


What other benchmarks are you looking for?


----------



## axiumone

Argh... the wait is killing me. UPS is being super slow. Looks like I'm getting mine on monday.


----------



## Cool Mike

Looks like a fun build!


----------



## NavDigitalStorm

Please post plenty of pictures!


----------



## Vashanime04

Damn work.. I'd be done if they didn't call.







Anyway I'm just about done. Just gotta put in the H80i for the CPU and it'll be up and playing games. XD


----------



## gerardfraser

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> What other benchmarks are you looking for?


Well, thanks for asking. I sold my NVIDIA 780 Ti SLI setup to run R9 290s in Crossfire (which is great).
Some games I would like to see benched, if possible. Thanks:

Witcher 2 EE at 1080p or higher with ubersampling.
Thief 2014 with Crossfire working at 1080p or higher (Mantle/DX11 does not matter).
Metro Last Light at 1080p or higher, PhysX disabled.
Serious Sam 3, fully maxed at 1080p or higher.


----------



## ColeriaX

Well, it's not perfect (or even close), but it's gonna get the job done for now. Willing to hear suggestions about aesthetics, routing, etc. if you guys wanna chime in. I'm just about to start benching. Unfortunately, I don't have my new monitor yet, so for any 1080p requests I can honor, I'll give it a shot. Enjoy









P.S. Don't ask to see what's behind the other panel. Lol, that's a task for another day


----------



## gerardfraser

Quote:


> Originally Posted by *ColeriaX*
> 
> 
> 
> Well, its not perfect (or even close), but its gonna get the job done for now. Willing to hear suggestions about aesthetics, routing etc.. if you guys wanna chime in. I'm just about to start benching. Unfortunately, I don't have my new monitor yet so any requests for 1080p that i can honor I'll give it a shot. Enjoy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. Don't ask to see whats behind the other panel Lol, thats a task for another day


How about these games:

Witcher 2 EE at 1080p or higher with ubersampling.
Thief 2014 with Crossfire working at 1080p or higher (Mantle/DX11 does not matter).
Metro Last Light at 1080p or higher, PhysX disabled.
Serious Sam 3, fully maxed at 1080p or higher.


----------



## bencher

Quote:


> Originally Posted by *Jpmboy*
> 
> $1900 is outrageous.. wait, $1500 is outrageous too.


Says the person with 2 Titans and water blocks.


----------



## Vashanime04

Quote:


> Originally Posted by *ColeriaX*
> 
> 
> 
> Well, its not perfect (or even close), but its gonna get the job done for now. Willing to hear suggestions about aesthetics, routing etc.. if you guys wanna chime in. I'm just about to start benching. Unfortunately, I don't have my new monitor yet so any requests for 1080p that i can honor I'll give it a shot. Enjoy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. Don't ask to see whats behind the other panel Lol, thats a task for another day


What case is that? The build looks really clean in there.


----------



## ColeriaX

Quote:


> Originally Posted by *Vashanime04*
> 
> What case is that? It looks clean in that case.


It's the Corsair 540. I really dig this case.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> Its the Corsair 540. I really dig this case.


It's definitely a nice case... just too wide for my taste


----------



## pompss

Guys, I have a 295X2 from a friend for testing.
The manual says my cables need to be capable of 28A in order to work.
My power supply is rated at 25A, and I don't know if the cables can handle 28A.
Do I have to change PSUs?
I have the Seasonic SS-850KM.


----------



## Cool Mike

The Seasonic is an excellent PSU. They tend to be underrated, meaning you can squeeze a small amount more out of them.

For testing purposes with no overclocking you will be OK. Disclaimer: per AMD, you need 28A+ available for each 8-pin connector.
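That 28A figure is easier to sanity-check in watts. A rough sketch of the arithmetic (the ~500W board power and 75W slot contribution below are ballpark assumptions from AMD's published guidance and the PCIe spec, not measured values):

```python
# Rough power budget for an R9 295X2 on a 12V rail.
# Assumptions: each 8-pin feed must supply 28 A at 12 V per AMD's
# guidance, the PCIe slot contributes up to 75 W, and the card's
# board power is roughly 500 W at stock clocks.
AMPS_PER_8PIN = 28
RAIL_VOLTS = 12
SLOT_WATTS = 75
BOARD_POWER_W = 500  # approximate stock board power

def connector_watts(amps=AMPS_PER_8PIN, volts=RAIL_VOLTS):
    """Watts one 8-pin connector must be able to deliver."""
    return amps * volts

def available_watts(n_connectors=2):
    """Power available to the card if both feeds meet spec."""
    return n_connectors * connector_watts() + SLOT_WATTS

print(connector_watts())                  # 336 W per 8-pin
print(available_watts())                  # 747 W total
print(available_watts() - BOARD_POWER_W)  # ~247 W of margin at stock
```

So a quality 850W unit has the wattage on paper; the question is whether its 12V rail (or the individual rails feeding each cable) can actually sustain 28A per connector without tripping OCP.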


----------



## pompss

I get 24-35 fps in Thief at 4K, all max quality, in the benchmark.
I don't know, maybe the 295X2 is not good for this game, but 24 fps??


----------



## NABBO

Quote:


> Originally Posted by *pompss*
> 
> I get 24-35 fps in Thief at 4K, all max quality, in the benchmark.
> I don't know, maybe the 295X2 is not good for this game, but 24 fps??


It also seems to have problems in this test:

http://abload.de/image.php?img=immagine666bqkqf.png


----------



## Jpmboy

Quote:


> Originally Posted by *NABBO*
> 
> Also in this test seems to have problems
> 
> http://abload.de/image.php?img=immagine666bqkqf.png


The CFX driver is not optimized for those yet. Go into CCC and select the 1x1 mode for Crossfire. See if it helps.


----------



## pompss

I can only see one GPU running;
the other GPU is idle.


----------



## pompss

Cleaned up the drivers and reinstalled.
The second GPU is still not working.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> clean up the driver and reinstall.
> still the second gpu is not working


Is this just in Thief?


----------



## pompss

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Is this just in Thief?


In all games:
Valley, Crysis 3, Tomb Raider, etc.

Maybe the card has some issues, or my Seasonic SS-850KM is not powerful enough.
Anyone using this card with a similar PSU??


----------



## pompss

With GPU-Z's render test I can see both GPUs working, but not in games.


----------



## NavDigitalStorm

Is there a crossfire option in the Catalyst control center? Just a thought.


----------



## pompss

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Is there a crossfire option in the Catalyst control center? Just a thought.


Yes, I checked the box. Nothing changed; the second GPU is still not working.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> yes i check the box nothing change the second gpu is not working


That's really odd. Try restarting the computer after you tick the option.

If anything, I'm guessing it's not getting enough power.


----------



## Yuri_RP

Hello,

I'm currently considering buying this card. I'm planning to pair it with 3 x QNIX QX2710 1440p monitors, which only have a DVI-D DL port. I want to ask which active adapter I should use. I believe it has to be an active Mini DisplayPort to DVI-D Dual Link adapter. I'm currently browsing eBay (buying from Amazon would mean very high shipping costs, so I'm avoiding it); it would really be great if someone could point out suitable adapters on eBay.
I live in Indonesia and it's kinda hard to find such adapters here. I wouldn't want the card and monitors sitting around just because I don't have the correct adapters.

Thanks for your response.

Regards,

Yuri


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello,
> 
> I'm currently considering buying this Card. I'm planning to pair it with 3 x QNIX QX2710 1440p which only had DVI-D DL port. I want to ask which Active Adapter should I use? I believe it has to be Active Mini DisplayPort to DVI-D Dual Link Adapter. Currently browsing on eBay (since buying from Amazon would be at very great shipping cost, I'm avoiding it) would really be great if someone could point the Adapters on eBay.
> Since I live in Indonesia and it's kinda hard to find such adapters. Wouldn't want to have the card and the monitor sitting around just because I don't have the correct Adapters.
> 
> Thanks for your response.
> 
> Regards,
> 
> Yuri


This might be what you want.

http://www.evga.com/products/Product.aspx?pn=200-DP-1301-L1


----------



## Yuri_RP

How would it work?

R9 295X2 : 4 Mini DisplayPort + 1 DVI-D Dual Link
3 x QNIX : 3 DVI-D Dual Link

What is the setup to make it work in Eyefinity? What I'm currently planning to do is buy 3 x Active MiniDP to DVI-D DL Adapters then connect each monitor separately.

If there is any other way, I would really appreciate the info.

Thank you.


----------



## King4x4

Don't do any amd card with an active adapter... only had pain with them and eyefinity.


----------



## NavDigitalStorm

Oh shoot, maybe I read what you need wrong. Yeah, you're better off buying three separate DisplayPort 1.2 mini DP to full DP cables.


----------



## Yuri_RP

Quote:


> Originally Posted by *King4x4*
> 
> Don't do any amd card with an active adapter... only had pain with them and eyefinity.


Then what do you suggest? Does it mean that I can't use my 3 x QNIX for Eyefinity with R9 295X2?


----------



## Yuri_RP

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Oh shoot maybe i read what you need wrong. yeah, you're better off buying three separate DisplayPort 1.2 min dp to full dp cables.


Okay, will do this. But I can't find any on eBay. What I do find is Accell and StarTech on Amazon. Guess I will have to buy from Amazon after all. Shipping from the US to Indonesia is really expensive though.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Yuri_RP*
> 
> Okay, will do this. But I can't find any on eBay. What I do find is Accell and StarTech on Amazon. Guess I will have to buy it from Amazon after all. Shpping from US to Indonesia is really expensive though.


This is the cable that I used on my ASUS 1440p. http://www.monoprice.com/Product?c_id=102&cp_id=10246&cs_id=1024603&p_id=9474&seq=1&format=2


----------



## Yuri_RP

Dang! Amazon won't do, they won't ship to Indonesia.







Had to get it from eBay eventually.
Quote:


> Originally Posted by *NavDigitalStorm*
> 
> This is the cable that I used on my ASUS 1440p. http://www.monoprice.com/Product?c_id=102&cp_id=10246&cs_id=1024603&p_id=9474&seq=1&format=2


Yes, but I need the MiniDP to DVI-D version of that cable, and one that can be shipped to Indonesia. Does Monoprice ship overseas?


----------



## King4x4

Quote:


> Originally Posted by *Yuri_RP*
> 
> Then what do you suggest? Does it mean that I can't use my 3 x QNIX for Eyefinity with R9 295X2?


You can sell two of those screens and buy two DP Versions.

I will not trust Eyefinity with an active adapter whatsoever... Lost so many hairs it's not funny anymore


----------



## Yuri_RP

Quote:


> Originally Posted by *King4x4*
> 
> You can sell two of those screens and buy two DP Versions.
> 
> I will not trust Eyefinity with an active adapter what so ever... Lost so many hairs it's not funny anymore


But the QNIX DP version can't/won't be overclockable to 96/120Hz. The DVI-D version uses a Samsung PLS panel, while the DP version uses an AUO AH-VA panel. That's why I use the DVI-D DL version.

Would you care to explain your problems and your solution to them, for my future reference?


----------



## King4x4

The only way you can run three QNIX screens OCed is to go NVIDIA Surround.

The adapters only allow an OC to 70Hz. They will never run over 70Hz at 1440p, just forget it. I've bought every adapter under the sun, and the best allows a 330MHz pixel clock.
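That 330MHz ceiling lines up with basic pixel-clock arithmetic. A rough sketch (the blanking overheads below are CVT-reduced-blanking-style assumptions, so real monitor timings will differ somewhat):

```python
# Rough pixel-clock estimate for a display mode, assuming
# CVT-reduced-blanking-style overhead (~160 px horizontal,
# ~60 lines vertical). Real panel timings vary.
def pixel_clock_mhz(width, height, refresh_hz, h_blank=160, v_blank=60):
    h_total = width + h_blank
    v_total = height + v_blank
    return h_total * v_total * refresh_hz / 1e6

ADAPTER_LIMIT_MHZ = 330  # best adapter King4x4 found

for hz in (60, 70, 96, 120):
    clk = pixel_clock_mhz(2560, 1440, hz)
    verdict = "fits" if clk <= ADAPTER_LIMIT_MHZ else "exceeds adapter limit"
    print(f"1440p @ {hz} Hz ~ {clk:.0f} MHz ({verdict})")
```

1440p at 60-70Hz lands under ~290MHz and squeaks through, while 96Hz and 120Hz need roughly 390-490MHz, well past what any of those adapters can push, which matches the "never over 70Hz" experience.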


----------



## shadow85

Hey guys, I'm thinking of getting one of these bad boyz soon, but not sure if I should upgrade my PSU at the same time, and to what.

My current specs:
Thermaltake Toughpower 750W (6 years old but still running strong)
MSI Gaming GD65
i5-4670K @ 4.0GHz
2x8GB G.Skill Trident X @ 2333
Intel 330 SSD 120GB
Raptor 150GB
2TB SATA
EVGA SC ACX GTX 780

Will a good quality 850W PSU suffice, or should I get at least 1000W?


----------



## ColeriaX

Quote:


> Originally Posted by *shadow85*
> 
> Hey guy'S I'm thinking of getting one of these bad boyz soon, but not sure if I should upgrade my PSU at the same time anf to what?
> 
> My current specs:
> Thermaltake Toughpower 750W (6 years old but still running strong)
> MSI-Gaming gd65
> I5-4670k @ 4.0GHz
> 2x8GB Gskil Trident X @ 2333
> Intel 330 SSD 120gb
> Raptor 150GB
> 2TB sata
> Evga SC ACX GTX 780
> 
> Will a good quality 850w PSU suffice or should i get atleast 1000W?


----------



## ColeriaX

OK, so I screwed up that last one; oh well. My advice is to get at least an 850 watt PSU. I bought the EVGA SuperNOVA 1300 watt. Got a great deal on it, and they also have a 1000 watt version. While benching, my UPS reports about a 700 watt load, BTW.


----------



## Jpmboy

Quote:


> Originally Posted by *bencher*
> 
> Says the person with 2 Titans and water blocks.


3 Kingpins, 2 Classifieds, 2 Titans, a 290X, a 295X2... and a partridge in a pear tree.


----------



## Jpmboy

Quote:


> Originally Posted by *King4x4*
> 
> Don't do any amd card with an active adapter... *only had pain with them and eyefinity*.


I believe it. I can confirm: I have a half dozen adapters and none will work at 4K30 with this card's DP ports. Never had this problem before (290Xs and 7 NV cards: 3 KPE, 2 Ti Classies, 2 Titans).

CRU (AMD custom resolution and pixel clock utility) will not correct the issue. It seems as though the card/driver cannot read the monitor's EDID through any adapter. I even made a "monitor.inf" with kingpins, saved it with Monitor Asset Manager, but still no EDID. So, my 290X reads my native rez as 4K, NV cards read it, but this 295X2 does not. Stuck at 1080p on a native 4K panel.
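For anyone wanting to check whether the EDID is actually surviving the adapter, the blob is simple enough to inspect by hand once you have a raw dump (e.g. from Monitor Asset Manager, or `/sys/class/drm/*/edid` on Linux). A minimal sketch of the relevant layout; the sample bytes below are synthetic, not dumped from a real panel:

```python
# Minimal EDID sanity check: verify the 8-byte header and pull the
# native resolution out of the first detailed timing descriptor
# (bytes 54-71 of the 128-byte base block).
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def native_mode(edid: bytes):
    """Return (width, height) from the first detailed timing descriptor."""
    if edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID block")
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

# Synthetic 128-byte block encoding a 3840x2160 native mode:
edid = bytearray(128)
edid[:8] = EDID_HEADER
edid[54:72] = bytes([0x04, 0x74,        # pixel clock, 10 kHz units
                     0x00, 0x20, 0xF2,  # h_active=3840, h_blanking
                     0x70, 0x58, 0x80,  # v_active=2160, v_blanking
                     0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
print(native_mode(bytes(edid)))  # (3840, 2160)
```

If a dump taken through the adapter fails the header check or reports a 1080p native mode, the adapter is mangling (or simply not forwarding) the EDID, which would explain the card being stuck at 1080p.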


----------



## Jpmboy

Quote:


> Originally Posted by *shadow85*
> 
> Hey guy'S I'm thinking of getting one of these bad boyz soon, but not sure if I should upgrade my PSU at the same time anf to what?
> 
> My current specs:
> Thermaltake Toughpower 750W (6 years old but still running strong)
> MSI-Gaming gd65
> I5-4670k @ 4.0GHz
> 2x8GB Gskil Trident X @ 2333
> Intel 330 SSD 120gb
> Raptor 150GB
> 2TB sata
> Evga SC ACX GTX 780
> 
> Will a good quality 850w PSU suffice or should i get atleast 1000W?


If you can afford a SINGLE RAIL 1000W PSU, that's the best choice. If you do get the 850W, make sure it is a single-rail design with OCP.


----------



## pompss

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> That's really odd. Try restarting the computer when you tick the option.
> 
> If anything, im guessing it's not getting enough power.


I don't know if power is really the problem here.
When I right-click on the desktop to open or create a new file, the Windows loading icon shows up and it takes forever for the menu to appear.
There's something wrong with this card.


----------



## Yuri_RP

Quote:


> Originally Posted by *King4x4*
> 
> The only way you can run three Qnix screens OCed is to go nvidia surround.
> 
> The Adapters only allow OC to 70mhz. They will never run over 70hz under 1440p just forget it. Bought every adapter under the sky and the best will allow 330mhz pixel clock.


So, if I don't OC it and just stay at 60Hz, will it work in Eyefinity?
Quote:


> Originally Posted by *Jpmboy*
> 
> I believe it. I can confirm: I have a half dozen adapters and none will work 4k30 with this card's DP ports. never had this problem before (290x's and 7 NV cards: 3 KPE, 2 Ti Classies, 2 titans).
> 
> CRU (AMD custom rez and pixel clock utility) will not correct the isssue. It seems as tho the card/driver cannot read the monitor's edid thru any adapter. I even made a "monitor.inf" with kingpins, saved it with Monitor Asset Manager, but still no edid. So, my 290X reads my native rez as 4K, NV cards read it, but this 295x2 does not. stuck at 1080P on a native 4K panel.


Wha?







I already ordered 3 MiniDP to DVI-D DL active adapters.







Is there really no solution to make it work? Future driver patch from AMD maybe?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> I dont know if the power is really the problem here.
> When i click on the desktop on the right button of the mouse to open or create a new file the windows load icon shows up and its taking for ever.for the menu to appear.
> there something wrong with this card.


Either that or there is a driver conflict.


----------



## Jpmboy

Quote:


> Originally Posted by *Yuri_RP*
> 
> So, if I don't OCit and just stay at 60Hz will it work in Eyefinity?
> Wha?
> 
> 
> 
> 
> 
> 
> 
> I already ordered 3 MiniDP to DVI-D DL Active Adapter.
> 
> 
> 
> 
> 
> 
> 
> Is there really no solution to make it work? Future driver patch from AMD maybe?


my issue may be restricted to this monitor (4K30Hz)


----------



## pompss

OK, there was a driver conflict and I fixed it.
But the second GPU is still not working.
My Seasonic SS-850 should be able to run two 290Xs in Crossfire,
so it should handle the 295X2 without problems.
I reset my CPU clock to stock, and even with default CPU settings the second GPU is still not working.
In GPU-Z the second GPU sits at its default 300MHz while gaming, and the other GPU is at 1000MHz.
Anyway, is there any chat support for AMD products, or a phone number to call?
Because before I buy a new PSU, I want to be sure the card is not defective.
I found a list of compatible PSUs and my Seasonic is not on the list.
http://www.pc-specs.com/gpu/ATI/R-200_Series/Radeon_R9_295X2/2101/Compatible_PSUs


----------



## ColeriaX

Here is the best Heaven score I could manage. Interestingly, this run was at 1110/1495 and scored better than the few runs I did at 1119/1525. Max temp was 64C with the SP120 on there. I think it can manage better once we get proper fan control. Not sure if it would run correctly if I just ran it off a 12V cable, since it is tied into the pump.


----------



## Jpmboy

yeah, best I could do too:

Not very impressive. For comparison, SLI 780 Tis do 146FPS and Titans do 136FPS on the same rig.
We need a voltage unlock and waterblocks for this card.


----------



## Cool Mike

ColeriaX, what brand of memory does your 295x2 have? At this point everyone has Hynix. Most are hitting 1600+ overclock on the memory. I'm at 1650 stable.


----------



## ColeriaX

Quote:


> Originally Posted by *Cool Mike*
> 
> ColeriaX, what brand of memory does your 295x2 have? At this point everyone has Hynix. Most are hitting 1600+ overclock on the memory. I'm at 1650 stable.


Not sure; how do I check without removing the block?

Nvm.. GPU-Z tells me it's Hynix


----------



## Cool Mike

Yes! We need the voltage unlocked. 30-50mV and I can hit 1100 core stable running 4K maxed. Currently at 1090 core.


----------



## Cool Mike

You are right. GPUZ tells the truth. You have Hynix.


----------



## ColeriaX

Good call on the memory; I guess I didn't push hard enough. Benching Heaven at 1600 as we speak. Seems fine.

Edit: Didn't make a difference at all... 1110/1600. Maybe bottlenecked by the PCI-E bus speed?


----------



## NavDigitalStorm

Try something other than Heaven. Heaven is highly biased towards NVIDIA cards.


----------



## ColeriaX

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Try something other than Heaven. Heaven is highly biased towards NVIDIA cards.


I'll give 3DMark and Valley a shot with those clocks.


----------



## ColeriaX

Weird, even with the GPU clocks at 1105/1650 I still scored worse than I did earlier in the day at 1105/1495. Something's not right.


----------



## Cool Mike

Is your 295 in a PCIe 3.0 X16 slot?


----------



## ColeriaX

Quote:


> Originally Posted by *Cool Mike*
> 
> Is your 295 in a PCIe 3.0 X16 slot?


No, I am behind the curve on chipsets. I was a P67 early adopter and it only supports PCIe 2.0 at x16 (single card) or x8/x8. But is 2.0 x16 really saturated by this card? Maybe someone can chime in on that. My plan is to upgrade to X99 when it comes out to round out this build.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> Weird, even with the GPU clocks at 1105/1650 I still scored worse than I did earlier in the day at 1105/1495. Something's not right.



The card could be throttling itself down.


----------



## ColeriaX

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> The card could be throttling itself down.


I've been logging the clocks; no throttling at all. However, I have noticed some interesting usage stats. A lot of the time GPU usage is not at 100%.
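If anyone wants to eyeball their sensor logs the same way, here's a rough sketch (the sample values and the `(clock, usage)` layout are made up; adapt it to whatever columns your logger actually writes):

```python
# Quick sanity check on a clock/usage log: flag throttle dips and average GPU usage.

def analyze_log(samples, target_clock=1105, tolerance=15):
    """samples: list of (core_mhz, usage_pct) tuples from a sensor log."""
    throttled = [s for s in samples if s[0] < target_clock - tolerance]
    avg_usage = sum(u for _, u in samples) / len(samples)
    return len(throttled), avg_usage

# hypothetical samples: clocks hold steady but usage sags
samples = [(1105, 99), (1104, 97), (1105, 71), (1105, 68), (1103, 99)]
dips, avg = analyze_log(samples)
print(f"throttle dips: {dips}, avg usage: {avg:.0f}%")
```

Usage well below 100% with steady clocks points at a CPU, driver, or PCIe limit rather than thermal throttling.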


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> I've been logging the clocks; no throttling at all. However, I have noticed some interesting usage stats. A lot of the time GPU usage is not at 100%.


Hmm, AMD might still be working on optimizing the drivers. That, or the benchmarking utilities probably need to be updated. No idea, to be honest.


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> Weird, even with the GPU clocks at 1105/1650 I still scored worse than I did earlier in the day at 1105/1495. Something's not right.


Let's hope AMD improves upon 14.4. Yeah, I can run 1650 memory in Firestrike and FS Extreme (http://www.3dmark.com/3dm/2962347) but have to back off on the GPU clock; voltage control would let me attempt to address that. I wouldn't say that Heaven is "biased towards Nvidia"... the R9 series is doing very well in that benchmark vs NV. Valley may be more so, but is it the benchmark or the drivers? I think it's the AMD vs NV drivers.
ColeriaX, the higher mem clock scored lower? Try backing off on the GPU clock just a tad.

I've been pushing this card quite hard: good scores in Futuremark benches; Heaven is okay, not great (limited OC capability right now; it was the same when the 290X came out). With a waterblock and voltage unlock it should get better, maybe close to CFX 290s or 290Xs. The waterblocked 7970s I sold a few months ago took time to perform at their best. That was a great AMD card!


----------



## ColeriaX

Quote:


> Originally Posted by *Jpmboy*
> 
> let's hope amd does improve upon 14.4. Yeah - I can run 1650 memory in firestrike and FS extreme (http://www.3dmark.com/3dm/2962347) but have to back off on the gpu clock. voltage control would let me attempt to address that. I wouldn't say that Heaven is "biased towards Nvidia" ... the R9 series is doing very well in that benchmark vs NV. Valley may be more so, but is it the benchmark or the drivers? I think it's the AMD vs NV drivers.
> ColeriaX - the higher mem clock scored lower? try backing off on the gpu clock just a tad.
> 
> I've been pushing this card quite hard, good scores in futuremark benches, Heaven is okay, not great (limited OC capability right now, was the same when the 290X came out). with a waterblock and voltage unlock, it should get better, maybe close to cfx 290s or 290x. The waterblocked 7970's I sold a few months ago took time to perform at their best. That was a great AMD card!!


I'll give that a shot, but from what I can tell increasing my mem clock does nothing lol. My high score for FS was 16213 and that was at 1104/1495. I'll try toning the core clock down and see what it does.


----------



## Cool Mike

1085 core and 1650 Memory is the sweet spot for me.


----------



## ColeriaX

1108/1700 just scored 16396, so there's some improvement.


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> 1108/1700 just scored 16396, so there's some improvement.


Nice run!

In FSE I had to back down on the GPU a little (1085) but it held 1625 mem. Is there a way to have CCC show the GPU clock in MHz vs %? (That's why I'm using Afterburner.)
http://www.3dmark.com/3dm/2962555


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> 1085 core and 1650 Memory is the sweet spot for me.


Post some benchmarks... it helps others know what this card can do.
http://www.overclock.net/t/1443196/firestrike-extreme-top-30
http://www.overclock.net/t/1464813/3d-mark-11-extreme-top-30
http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


----------



## ColeriaX

Quote:


> Originally Posted by *Jpmboy*
> 
> Nice run!
> 
> In FSE I had to back down on the GPU a little (1085) but it held 1625 mem. Is there a way to have CCC show the GPU clock in MHz vs %? (That's why I'm using Afterburner.)
> http://www.3dmark.com/3dm/2962555


Afterburner caused some serious issues for me, so now I'm only using CCC to modify clocks until we get proper support. I just update the clocks in CCC and open GPU-Z to confirm them.


----------



## shadow85

Hey, is everyone here who is OC'ing their 295X2 and benchmarking using custom cooling setups, or the standard factory cooling?


----------



## axiumone

So far everyone is on factory cooling. There hasn't been a waterblock available for purchase yet.


----------



## NavDigitalStorm

The coolest you could probably get right now is a push/pull setup and a secondary fan blowing air on the backplate.


----------



## King4x4




----------



## Jpmboy

Received this today from EK... looks like they are leaving this market to Aquacomputer.


----------



## Sgt Bilko

Surely they wouldn't?

Well here's hoping the Aquacomputer block is pretty damn good then......


----------



## Jpmboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Surely they wouldn't?
> Well here's hoping the Aquacomputer block is pretty damn good then......


AQ usually does a good job! (it's like 160 euro !! )


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jpmboy*
> 
> AQ usually does a good job! (it's like 160 euro !! )


Well it will have to be good now won't it?


----------



## axiumone

Quote:


> Originally Posted by *Jpmboy*
> 
> received this today from EK... looks like they are leaving this market to Aquacomputer.


Oh, wow. I'm so glad that I decided to leave the stock cooler on this time around. With EK dropping hints like crazy all over the place that a full block is coming, I would be very upset at this outcome.


----------



## NavDigitalStorm

NOO WHY EK! That AQ waterblock doesn't go with any of my color schemes.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> NOO WHY EK! That AQ waterblock doesn't go with any of my color schemes.


AQ will have copper, nickel plate with clear or smoke-black tops. I had 2 of their blocks on 7970s and they were beautiful!


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Oh, wow. I'm so glad that I decided to leave the stock cooler on this time around. With EK dropping hints like crazy all over the place that a full block is coming, I would be very upset at this outcome.


well, can't take it off yet anyway...


----------



## axiumone

I am a happy man today! It's going to be a long work day today.


----------



## ColeriaX

Quote:


> Originally Posted by *axiumone*
> 
> I am a happy man today! It's going to be a long work day today.


I guess 2 really is better than 1. Grats man!


----------



## Cool Mike

Your graphics score looks fine. What's bringing you down is the Physics score.

For example, I am running a 4930K at 4.5GHz. My Physics score is around 16,800. Total score with my rig is 18K.


----------



## ColeriaX

Quote:


> Originally Posted by *Cool Mike*
> 
> Your graphics score looks fine. Whats bringing you down is the Physics score.
> 
> For example I am running a 4930K at 4.5Ghz. My Physics score is around 16,800. Total score with My rig is 18K.


I stayed up til 4am last night troubleshooting. Found out that the P8P67 is notorious for the PCI-E slots going bad, which would also explain why it reads as PCIe 1.1 @ x4. Going to buy a Z97 board and a 4770K and call it a day. What a nightmare Panther Point turned out to be as an early adopter.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> I am a happy man today! It's going to be a long work day today.


Approved and added to the roster!


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> Your graphics score looks fine. Whats bringing you down is the Physics score.
> 
> For example I am running a 4930K at 4.5Ghz. My Physics score is around 16,800. *Total score with My rig is 18K*.


I am sure that 18K run was with Tess off in CCC. Last week the driver state was not detected by FM SysInfo, and I had Tess off show as undetected. Run that with 14.4 today; it will report the Tess state to Futuremark SysInfo now. Anyway, mid-17K is good for a two-GPU card!
4960X @ 4.7GHz, 2133 [email protected]
Tess on

Tess off


----------



## Cool Mike

A Z97 will be great. Look forward to seeing your new build.


----------



## Cool Mike

Thanks for the heads up. I will look into that tonight. I am running the 14.4 beta currently.

I tried the approved 14.4, but I'm getting a severe error at the start of 3DMark Firestrike. Reloading the 14.4 beta makes the problem go away.


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> Thanks for the heads up. I will look into that tonight. I am running the 14.4 beta currently.
> 
> I tried the approved 14.4, but getting a severe error at the start of 3Dmark Firestrike. Reloaded 14.4 beta and problem goes away.


You never know with drivers. IDK, 14.4 WHQL has been good so far, but I've only tried BF4 and some benchmarks.


----------



## Cool Mike

Guessing 14.4 whql is working for you with Firestrike?


----------



## Jpmboy

Okay, wife is out for a while, kids too... grabbed the Samsung 4K I gave her and hooked it up with a miniDP-to-DP adapter. 4K60 working. Can't figure out why my 50" 4K30 panel will not cooperate with this video card; 290X, Titans, 780 Ti Classys and Kingpins all were fine. I'm convinced it's the miniDP-to-HDMI adapter that's borking the mix. Anyone know if I need an active or passive miniDP-to-HDMI? Even a dual-link DVI-to-HDMI adapter did not work.


----------



## The Mac

Quote:


> Originally Posted by *ColeriaX*
> 
> I stayed up til 4am last night troubleshooting. Found out that the P8P67 is notorious for the PCI-E slots going bad, and would also explain why it reads it as PCI E 1.1 @ x4. Going to buy a Z97 board and a 4770k and call it a day. What a nightmare panther point turned out to be as an early adopter.


Cougar Point....

My Deluxe is trucking right along, not a single problem with it.


----------



## ColeriaX

I wish I could say the same. Oh well, new hardware!


----------



## The Mac

did you get the B3?


----------



## ColeriaX

Quote:


> Originally Posted by *The Mac*
> 
> did you get the B3?


B2


----------



## Cool Mike

Have you guys seen the news on Nvidia's Titan Z ($3K card)? POSTPONED!

The 295X2 caught them with their pants down! HOT OCN DISCUSSION GOING ON NOW.

http://www.overclock.net/t/1485895/wccf-nvidia-gtx-titan-z-postponed-to-a-later-date


----------



## Jpmboy

They'll bring out the Z @ $2000 with 5% more performance vs the X2. I hope.


----------



## NavDigitalStorm

That thread is hilarious!


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> That thread is hilarious!


I agree. Went waaay off topic. Almost got shut down.


----------



## axiumone

Soooo. I'm not sure what to do. The last good driver for 4-way 290 crossfire was 13.12 WHQL. I can't install 13.12 anymore because it doesn't see the 295X2 as a compatible device. 14.4 WHQL is absolutely atrocious in 4-way crossfire. BF4 and Titanfall are pretty much unplayable due to flickering and other graphical oddities. Leaving only one card running provides better results, but the frame rates are undesirable at 5400x1920. I hope we get some better drivers soon.

Edit - A fresh Windows install seems to have alleviated some of the issues, even though I was doing clean driver installs. Although, I still have to have vsync enabled in order to get rid of some serious screen tearing across all 5 monitors. This wasn't an issue with 4-way crossfire 290s.


----------



## Vashanime04

Hoping for a higher score but not too bad. Way better than what I had. XD
http://www.3dmark.com/fs/2077064


----------



## Paul17041993

Would have thought this would just fold into the 290/X owners club, but given what a beast of a reference card it is, I guess it makes sense...


----------



## wermad

Quote:


> Originally Posted by *Jpmboy*
> 
> received this today from EK... looks like they are leaving this market to Aquacomputer.


Looks like there's still some hope?
Quote:


> Originally Posted by *derickwm*
> 
> Apologies for the delayed response. We are indeed in the process of finishing up the 295X2 block and should be available soon.


----------



## Jpmboy

Quote:


> Originally Posted by *Vashanime04*
> 
> Hoping for a higher score but not too bad. Way better than what I had. XD
> http://www.3dmark.com/fs/2077064


Your graphics score is very good! Physics on a 4-core will always lower the overall score (unless you get that 4770K to like 6GHz).
Quote:


> Originally Posted by *wermad*
> 
> Looks like there's still some hope?


WTH, geeze, I hope so! I have 5 EK blocks running right now, but wouldn't be disappointed with Aquacomputer's stuff. Their 7970 waterblocks were simply beautiful (and very functional). I wouldn't want this to be a single-source consumer option.

EDIT: oops, checked email and... I think this is worth a laugh. But good news nonetheless.


----------



## MrTOOSHORT

So there will be a block from EK for the 295x2:



*http://www.ekwb.com/news/484/19/New-water-cooling-gear-in-the-works-May-2014/*

Seen some folk wondering about it, so I posted it here to clear things up.


----------



## Jpmboy

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> So there will be a block from EK for the 295x2:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> *http://www.ekwb.com/news/484/19/New-water-cooling-gear-in-the-works-May-2014/*
> 
> 
> 
> Seen some folk wondering about it, so I posted it here to clear things up.












First email I got from EK was an unambiguous NO! Second was a "yes, we will have a 295X2 block". I think someone just got sent to the mailroom for penance.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> First email I got from EK was an unambiguous NO! Second was a "yes, we will have a 295X2 block". I think someone just got sent to the mailroom for penance.


Looks like I'll be contacting my EK rep.


----------



## axiumone

Nav, when you installed the 295 in the 450D, did you notice the top card sagging a little? I'll take some pics of my setup soon, but my top card sags in the Rampage IV Black. It's not really a concern short term, but long term I think it may disfigure the PCIe slot. It happened to my previous motherboards with lighter cards.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Nav, when you installed the 295 in the 450D, did you notice the top card sagging a little? I'll take some pics of my set up soon, but my top card sags in the rampage iv black. It's not really a concern short term, but long term, I think it may disfigure the pcie slot. It happened to my previous motherboards and lighter cards.


I didn't notice it sagging; I'll let you know when I get it back.


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I didn't notice it sagging, i'll let you know when I get it back,


Alrighty. Thanks!

The cards are monsters though! The potential is incredible. I just hope that some really good drivers come sooner rather than later.

Here is my result with a mild overclock: 1050 core, 1450 mem. 15800 score in Firestrike Extreme.
http://www.3dmark.com/fs/2079115

Also, temps are pretty good. I switched the fans to two Corsair SP120s per rad. Temps are in the 50s under load.


----------



## Signia70

Just got one of these yesterday. Haven't used it much other than BF4. Running 5760x1080 at Ultra and was getting between 60 and 70+ FPS. I had no idea that the memory was shared between the two! I was using a GTX 690 and the memory wasn't unified. Memory use for BF4 was > 5GB during my session.


----------



## wermad

Thanks to Akira749:
Quote:


> Originally Posted by *akira749*
> 
> During next month, EK will release their new universal GPU solution. It's called EK-Thermosphere. If you can wait a few weeks this might be a better solution for your GPU.
> If EK decide to not release the block for the 295X2, you could take a look at Koolance for a different option over the AC block...and should be around 180$.
> Koolance with fullcover water block for R9 295X2




MSRP: $179.99 USD (pretty typical for Koolance). It's a nice-looking block and will match my acetal/nickel blocks (CPU, PCH, VRM & RAM).

Btw, looks like EK is a go on their 295X2 blocks.


----------



## Jpmboy

Quote:


> Originally Posted by *wermad*
> 
> Thanks to Akira749:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> MSRP: $179.99 USD (pretty typical for Koolance). Its a nice looking block. Will match my acetal nickel blocks (cpu, pch, vrm, & ram).
> Btw, looks like EK is a go on their 295X2 blocks.


Thx!

I use a lot of Koolance stuff, QDCs, the 380i... but never one of their VGA waterblocks.


----------



## shadow85

Quote:


> Originally Posted by *Jpmboy*
> 
> Thx!
> 
> I use a lot of Koolance stuff, QDCs, the 380i... but never one of their VGA waterblocks.


I have never done water cooling before. Does a block like this mean I can use it on a 295X2 on its own, or do I need a complete water-cooling loop?


----------



## wermad

Quote:


> Originally Posted by *shadow85*
> 
> I have never done water cooling before. Does a block like this mean I can use it on a 295X2 on its own, or do I need a complete water-cooling loop?


The block is designed for a full custom water loop. You'll need a pump, rads, fans, fittings, tubing, etc. It can get a bit expensive, but it will drop your temps low. There are pieced-together custom kits that save you the hassle of searching.


----------



## wermad

Hello guys, we could really use your help to keep the red team ahead. The green team is making a comeback and we need more points to stave them off:

http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing

Just a little FYI:

Windows 8 is OK for Firestrike (Win7 as well). Catzilla and HWBot Heaven require Windows 7 only. A few submissions have been denied because Catzilla and Heaven were not run on Windows 7.

You can submit single GPU, crossfire, 3-way, and 4-way crossfire scores as long as it's the same team (i.e. red/AMD only or green/Nvidia only).

Submissions close @ 11:59 PST (Pacific Standard Time).

Thanks

-wermad


----------



## levism99

I have the AMD R9 295X2 and two 780s. What do you want me to do?


----------



## Roikyou

Thinking about jumping the Nvidia ship after going with the Samsung 590 and all the SLI issues atm. I like the idea of one card and I like the Koolance block. One thing I didn't catch: as I'm running a Corsair AX850, does this card need a 1500W PSU? What's the consensus on the best PSU for this card? Thanks


----------



## Paul17041993

Quote:


> Originally Posted by *levism99*
> 
> I have the AMD R9 295X2 and two 780's what do you want me to do ?


fold?


----------



## wermad

Quote:


> Originally Posted by *Roikyou*
> 
> Thinking about jumping Nvidia ship since going with the Samsung 590 and all the sli issues atm. I like the idea of one card and I like the Koolance block. One thing I didn't catch as I'm running corsair ax850, this card needs a 1500 psu? Whats the census of the best psu for this case? Thanks


For a single card, a 1kW unit is fine. For two cards (effectively quad-fire), ~1.3kW or more is recommended. Reviews of a single card show it pulling ~500W by itself.
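As a rough sketch of where those numbers land (the ~500W card draw is from the reviews mentioned; the rest-of-system draw and the headroom factor here are my own assumptions):

```python
# Back-of-envelope PSU sizing for 295X2 builds.
# card_w ~500 W per 295X2 (review figure); system_w and headroom are assumptions.

def recommended_psu_watts(num_cards, card_w=500, system_w=250, headroom=0.25):
    """Total draw plus headroom, rounded up to the next 50 W step."""
    total = num_cards * card_w + system_w
    needed = total * (1 + headroom)
    return int(-(-needed // 50) * 50)  # ceiling to a 50 W step

print(recommended_psu_watts(1))  # 950  -> a 1 kW unit fits
print(recommended_psu_watts(2))  # 1600 -> well above an AX850
```

Tweak the assumptions to taste; the point is that an 850W unit leaves almost no headroom for even one card once the CPU is overclocked.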


----------



## levism99

Quote:


> Originally Posted by *Paul17041993*
> 
> fold?


It was for this!

Quote:

> Originally Posted by *wermad*
> 
> Hello guys, we could really use your help to keep the red team ahead. The green team is making a comeback and we need more points to stave them off:
> 
> http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing
> 
> Just a little FYI:
> 
> Windows 8 is OK for Firestrike (Win7 as well). Catzilla and HWBot Heaven require Windows 7 only. A few submissions have been denied because Catzilla and Heaven were not run on Windows 7.
> 
> You can submit single GPU, crossfire, 3-way, and 4-way crossfire scores as long as it's the same team (i.e. red/AMD only or green/Nvidia only).
> 
> Submissions close @ 11:59 PST (Pacific Standard Time).


----------



## Roikyou

So many mixed reviews about this card. What I care about is it working as well as or better than 780 Tis in SLI. If this card is equal or greater, I'd be more than happy with it. I would have to upgrade the power supply though.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Roikyou*
> 
> So many mixed reviews about this card. What I would care about it working the same or better as 780 ti's in SLI. If this card would be equal two or greater, I'd be more than happy with it. I would have to upgrade the power supply though.


http://www.digitalstormonline.com/unlocked/4-way-quad-crossfire-amd-r9-295x2-benchmarks-at-4k-idnum228/


----------



## Paul17041993

Quote:


> Originally Posted by *Roikyou*
> 
> So many mixed reviews about this card. What I would care about it working the same or better as 780 ti's in SLI. If this card would be equal two or greater, I'd be more than happy with it. I would have to upgrade the power supply though.


It's virtually identical to two 780 Tis in SLI, even in power draw; it's just one card with a good stock cooling solution, and due to Hawaii's design it will particularly pull ahead in 4K scenarios, especially when overclocked slightly.

The bridge it uses also boosts performance over standard setups, as it allows direct communication between the cores a little faster than most motherboards (a full 16 PCIe 3.0 lanes between them).

The question with these lies more between your budget and brand preference. I chose AMD as both fare better for me; I tend to get annoyed by Nvidia's drivers too much...


----------



## ColeriaX

Guys, this thing seriously sags down in my PCI-E slot. What is a solution that will not look ghetto (e.g. no zip ties or twisty ties or anything like that)?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ColeriaX*
> 
> Guys, this thing seriously sags down in my PCI-E slot. What is a solution that will not look ghetto (e.g. no zip ties or twisty ties or anything like that)?


You could cut plexiglass that mounts to two PCIE expansion slots to help hold it up.


----------



## NavDigitalStorm

Another idea would be to zip-tie the PCIe cables upwards so there is force pulling the card up.


----------



## Majin SSJ Eric

Quote:


> Originally Posted by *ColeriaX*
> 
> Guys, this thing seriously sags down in my PCI-E slot. What is a solution that will not look ghetto (e.g. no zip ties or twisty ties or anything like that)?


This is the number ONE reason why I bought my Silverstone TJ11 case. I never have to worry about any card sagging in my rig!


----------



## Roikyou

So, I'm going to go ahead and do it, jumping ship. Any thoughts on whether a Corsair AX850 would power one R9 295X2 with a 4770K, an SSD, a 1TB Raptor, a 500GB HDD, two pumps, and 11 fans? Ordered the Koolance water block also, so this is going under water... If the power requirements are a little sketchy, I've always liked Corsair power supplies; thought about the 1200i or RM1000 from Corsair, could upgrade. Thanks for the advice everyone.


----------



## Majin SSJ Eric

Whatcha doing with that 780Ti Classy???

Lol, just kidding!

The 295X2 is a beast of a card and with a real water block you are going to have some serious performance outta that thing! Hopefully somebody will come up with full voltage control for it soon and we can see some real bench numbers from guys like you! As far as the PSU is concerned, I admit I know very little about the subject, but I'd personally feel a lot better with something bigger than 850W for that setup. I'm pretty sure the AX850 would handle it, but you might as well get something you won't ever have to worry about in the future; at least that's my general thinking on PSUs. I'd go with an EVGA like this personally: http://www.newegg.com/Product/Product.aspx?Item=N82E16817438013


----------



## wermad

Quote:


> Originally Posted by *Roikyou*
> 
> So, I'm going to go ahead and do it, jumping ship. Any thoughts if a Corsair AX850 would power one R9 295X2? with 4770k, ssd, raptor 1tb, 500 hd, two pumps for water and 11fans. Ordered the Koolance water block also, so this is going under water... If the power requirements are little sketchy, always liked corsair power supplies, thought about 1200i or rm1000 from corsair, could upgrade. Thanks for the advice everyone.


At stock GPU clocks and a modest CPU OC you can probably pull it off, but I would recommend jumping to a 1kW unit. If you're interested, look into the Cooler Master V1000: same cables as the AX, fully modular, and it got a great review from JG. It's basically a Seasonic with a CM badge and better cables. I have two of these powering my current rig.


----------



## Jpmboy

Well, I grew tired of waiting for EK and Aquacomputer to get their blocks launched. Just ordered the Koolance:


----------



## ColeriaX

Jelly JP...so jelly


----------



## Majin SSJ Eric

Well this ought to get interesting SOON! Can't wait to see some real OC numbers from that 295X2 JP!


----------



## wermad

Looks like the left ports may line up for tri- and quad-fire with a 290/290X. Very tempting, since 290s are selling very cheap these days:


----------



## Jpmboy

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Well this ought to get interesting SOON! Can't wait to see some real OC numbers from that 295X2 JP!


Problem is, there's no voltage unlock yet. I really want to compare to the SLI Titans I'm currently running on this 2700K rig @ 4K/60. I'm a little worried, since on the bench rig (with the AIO and a 4960X) the 295X2 managed 55fps, while the SLI Titans on this rig do 64fps @ 4K/60 with only 1050 clocks.

That tri-fire system might be interesting, but you'd do better with separate cards if you have the 3.0 x16 slots.

Just verified the 4K Titans in Valley. The block should get here early next week...

Ignore the clocks in OHM; those were BF4 gaming. Valley was run at 1058 on the GPU. Will post WC 295X2 results ASAP!


----------



## wermad

My main interest in the 295X2 is the four DisplayPorts to run 5x1 Eyefinity, so I really don't need two 295X2s tbh. I can crossfire two more Hawaiis for less money than buying two 295X2s.


----------



## NavDigitalStorm

A 295X2 and a R9 290X in crossfire seems like a good idea.


----------



## ColeriaX

It would seem like a good idea and an even better value, but I don't think it would turn out nearly as well as two of the same card. Crossfire drivers are just not quite there yet, though they have significantly improved since my Xfire 6950s.


----------



## wermad

Quote:


> Originally Posted by *ColeriaX*
> 
> It would seem like a good idea and an even better value, but I don't think it would turn out nearly as well as 2 of the same card. Crossfire drivers are just not quite there yet, but have significantly improved since my Xfire 6950's.


You can crossfire a 295X2 with a 290X and/or 290; same core group. That's one thing AMD has over Nvidia, and it's why you could xfire 7970s with 7950s and 7990s.

I know a few people running quad 290X and 290s, so the drivers can do 3/4-way. Also, there's a member benching 4K Eyefinity with quad 290Xs.

290s are selling used for ~$300 each, and I've heard rumors the market is so saturated they're going for less than $300 on eBay. I really only need the DPs of the 295X2 and the other two cores for quad-fire; they can be the cheap 290s for now.


----------



## ColeriaX

Quote:


> Originally Posted by *wermad*
> 
> You can crossfire 295x2 with 290x and/or 290. Same core group. One thing Amd has over Nvidia. That's why you could xfire 7970s with 7950s and 7990s.
> 
> I know a few ppl running quad 290x and 290s so the drivers can do 3/4 way. Also, there's a member benching 4k Eyefinity with quad 290Xs.
> 
> 290s are selling used for ~$300 each and I've heard rumors its so saturated they're going for less then $300 on ebay. I really only need the dp of the 295x2 and the other two cores for quad fire, they can be the cheap 290s for now.


Sure you can. However, until we see some scaling and benchmarks with the 295X2 Xfired with another hawaii card I'd be hesitant to recommend it. Just my 2 cents


----------



## wermad

Quote:


> Originally Posted by *ColeriaX*
> 
> Sure you can. However, until we see some scaling and benchmarks with the 295X2 Xfired with another hawaii card I'd be hesitant to recommend it. Just my 2 cents


How can it not scale? It's essentially two 290X cores on a single PCB. There are quite a few 3- and 4-way crossfire reviews and setups out there already to give us a good understanding of scaling. You're not gonna see much benefit on a 1080p monitor, for example, but for those pushing extreme resolutions you will see some gains. It's not like AMD handicaps dual-GPU cards to prevent scaling with the single-card variant. The only concern I would have is for those running PCIe 2.0, like your setup.

From the few reviews out and my personal experience, Nvidia tends to have lackluster scaling on their top-tier cards once you push more than two.
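For anyone comparing review numbers, scaling efficiency is just measured FPS divided by ideal N-GPU FPS (the FPS values below are placeholders, not benchmark results):

```python
# Scaling efficiency: what fraction of perfect N-GPU scaling was achieved.

def scaling_efficiency(single_fps, multi_fps, num_gpus):
    """Measured multi-GPU FPS over the ideal (single FPS x GPU count)."""
    return multi_fps / (single_fps * num_gpus)

# placeholder numbers for illustration only
print(f"{scaling_efficiency(45, 85, 2):.0%}")   # two GPUs  -> 94%
print(f"{scaling_efficiency(45, 118, 3):.0%}")  # three GPUs -> 87%
```

It's the drop in this number going from 2-way to 3/4-way (and from 3.0 x16 to 2.0 slots) that the reviews are really measuring.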


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> Sure you can. However, until we see some scaling and benchmarks with the 295X2 Xfired with another hawaii card I'd be hesitant to recommend it. Just my 2 cents


----------



## ColeriaX

I'm not saying it will not scale; I am saying I would bet that it would not scale like it would if you used the same cards. To think that it would just be PnP-perfect with a 295X2 and a 290/290X is being overly optimistic IMO.


----------



## NavDigitalStorm

I'll be testing it this Monday and let you know how it scales.


----------



## ColeriaX

Sweet, looking forward to the results Nav.


----------



## wermad

Quote:


> Originally Posted by *ColeriaX*
> 
> I'm not saying it will not scale, I am saying I would bet that it would not scale like it would if you used the same cards. To think that it would just be PnP perfect with a 295X2 and a 290/290X is being over optimistic IMO.


7990 + 7970 and 7990 + 7970 + 7970 have been done a few times already. AMD knows this game; same thing with the 6990/6970/6950. AMD knows people can crossfire this card with a 290/290X or two more. Why limit the crossfire capability when the card is capable of 4-way crossfire? Why give AMD customers an excuse to switch sides? Whether the scaling is perfect or not, my point is it should be fairly close, if not the same as its single-card crossfire setups.

Well, as Nav said, he'll get you some numbers. Based on prior generations, I expect it to fall close to its single-card equivalent setups, as I've said.

edit:

Here is a review I found of 295X2 + 290X:

http://www.tweaktown.com/articles/6265/amd-radeon-r9-295x2-8gb-and-r9-290x-4gb-video-cards-in-crossfirex/index3.html


----------



## Jpmboy

Quote:


> Originally Posted by *wermad*
> 
> 7990 + 7970 and 7990 + 7970 + 7970 has been done a few times already. Amd knows this game. Same thing with 6990/6970/6950. Amd knows ppl can crossfire this card with a 290/290x or two more. Why limit the crossfire capability since it is capable of 4-way crossfire? Why give amd customers an excuse to switch sides? Whether the scaling is perfect or not, my point is it should be fairly close if not the same as its single card crossfire setups.
> Well, as Nav said, he'll get you some #s. Based on prior generations, I do expect to fall close to its single card equivalent setups as I've said.
> edit:
> Here is a review I found 295x2 +290x:
> http://www.tweaktown.com/articles/6265/amd-radeon-r9-295x2-8gb-and-r9-290x-4gb-video-cards-in-crossfirex/index3.html


Unless I missed it, they did not compare to 3x 290 or 3x 290X? If so, typical "[t]weak town review".

"Trust, but verify".


----------



## wermad

Quote:


> Originally Posted by *Jpmboy*
> 
> unless i missed it, they did not compare to 3 290s or 3 290x. ? If so, "[t]weak town review"
> 
> "trust, but verify".


Just added it to at least give us a preview of half the equation, for those who are not so optimistic









edit: @ Nav, I hope this can be done if you don't mind









295x2 vs 2x 290X
295x2 + 290X vs 3x 290X
295x2 + 290X + 290X vs 4x 290X
2x 295x2 vs 4x 290X


----------



## ColeriaX

Quote:


> Originally Posted by *wermad*
> 
> just added to at least give us a preview of half the equation for those who are not so optimistic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: @ Nav, I hope this can done if you don't mind
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 295x2 vs 2x 290X
> 295x2 + 290X vs 3x 290X
> 295x2 + 290x + 290X vs 4x 290X
> 2x 295x2 vs 4x 290X


+1 please


----------



## NavDigitalStorm

If you guys don't mind messaging me on Monday with a detailed request, I'll try my best to get it done.


----------



## wermad

^^^Will do









If anyone missed the quad review:

http://www.pcper.com/reviews/Graphics-Cards/Radeon-R9-295X2-CrossFire-4K-Quad-Hawaii-GPU-Powerhouse

They pulled ~1200W at the wall


----------



## NavDigitalStorm

Quote:


> Originally Posted by *wermad*
> 
> ^^^Will do
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If anyone missed the quad review:
> 
> http://www.pcper.com/reviews/Graphics-Cards/Radeon-R9-295X2-CrossFire-4K-Quad-Hawaii-GPU-Powerhouse
> 
> They pulled ~1200w at the wall


My Kill A Watt read 1350W peak
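Worth remembering when sizing a PSU off a Kill A Watt: the wall reading includes PSU conversion losses, so the actual DC load the PSU delivers is lower. A rough sketch (the 90% efficiency figure is an assumption; use your PSU's rated efficiency at that load):

```python
# Convert a wall-meter reading into the approximate DC load on the PSU.
# Wall power = DC load / efficiency, so DC load = wall power * efficiency.

def dc_load_watts(wall_watts, psu_efficiency=0.90):
    """Approximate DC load given a wall reading and PSU efficiency."""
    return wall_watts * psu_efficiency

print(round(dc_load_watts(1350)))  # 1215
```

So a 1350W wall peak is roughly a 1200W DC load, which is why 1500W units are the sensible floor for quad Hawaii.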


----------



## EliteReplay

checking on you guys


----------



## Majin SSJ Eric

Lol, my own personal OCD just would not allow me to run two different cards in my setup. If I wanted to CF with a 295X2 I'd have to get another one to match. Silly I know, and normal people may indeed be far better off grabbing a single GPU Hawaii card or two to CF with, but I just personally couldn't stand that in my rig...


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Lol, my own personal OCD just would not allow me to run two different cards in my setup. If I wanted to CF with a 295X2 I'd have to get another one to match. Silly I know, and normal people may indeed be far better off grabbing a single GPU Hawaii card or two to CF with, but I just personally couldn't stand that in my rig...


If you WB them they would look almost the same haha.


----------



## Majin SSJ Eric

Yeah but I would KNOW they were different! I'm a sick, sick man...


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Yeah but I would KNOW they were different! I'm a sick, sick man...


It's okay, I have 100+ GTX Titan Black reference coolers lying around me; I'm a sick man too.


----------



## Majin SSJ Eric

Lol, I wish I had the means to be that far off the deep end!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Lol, I wish I had the means to be that far off the deep end!


Take some of mine before I drown...


----------



## Roaches

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> It's okay, I Have 100+ GTX Titan Black reference coolers laying around me, i'm a sick man too.


What do you guys actually do with those reference coolers after pulling them off the PCBs to watercool your builds at DigitalStorm?


----------



## Majin SSJ Eric

It sounds like they just sit there!


----------



## wermad

Use them on a single 290X







XD


----------



## Roaches

IDS HABBIDING!!



http://videocardz.com/50430/powercolor-teasing-radeon-r9-295x2-devil13

A new Devil 13!!


----------



## wermad

Is it a custom 295x2 or a 290X2? Or both?


----------



## Sgt Bilko

Looks like a triple-slot air cooler; guess it's going to be cheaper than AMD's ref design though.


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> Is it a custom 295x2 or a 290X2? Or both?


I've got a feeling we're gonna hear more at Computex! I'd hold my breath until then!







Hopefully we get both models, as the 290X2 might be the best bang for the buck if it's equal to or less than $1000 compared to the 295X2









Devil Editions are pretty much the only Radeons that give me erections; they're the best-looking non-reference cards on the AMD front


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Looks like a triple slot air cooler, guess it's going to be cheaper than AMD's ref design though.


Their custom design for the 7970X2 didn't have the multiple DisplayPorts of the reference 7990, so it was prone to the nasty screen tearing in Eyefinity.

If this is a 295X2, it should hopefully retain those video outputs. If it's a custom one, it may look more like the 290X reference outputs. Well, the only way to know is to see it in the flesh


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Their custom design for the 7970x2 didn't have the multi displayports of the reference 7990. So it was proned to the nasty screen tearing in Eyefinity.
> 
> If this is a 295x2, it should retain, hopefully, those video outputs. If its a custom one, it may look more like the 290X reference output. Well, only way to know is to see it in the flesh


I'd be guessing this is a ref 295X2 board but with the Devil cooler on it (revamped, obviously), so it should retain the mini DPs of the ref design, I'd think.

I'm kinda hoping this is a 290X2 though; I don't think an air-only cooler will be able to outshine the ref-design 295X2.


----------



## NavDigitalStorm

SWEET!


----------



## shadow85

I was just wondering, has anyone seen the Sapphire 295X2 OC? Here in AUS it costs $100 more, and it only increases each core clock to 1030MHz and the memory to 5200MHz.

Is this worth the extra $100? Or could I just get the standard 295X2 and OC it that much myself?

Is there any hardware difference between the Sapphire 295X2 OC and the non-OC?

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2293&leg=0


----------



## Sgt Bilko

Quote:


> Originally Posted by *shadow85*
> 
> I was just wondering, has anyone seen the sapphire 295x2 OC, here in AUS it costs $100 more and it only increased each core clock to 1030 and memory to 5200.
> 
> Is this worth the extra 100 quid? Or could I just get the standard 295x2 and oc it that much?
> 
> Is there any hardware difference to the sapphire 295x2 oc vs the non oc?
> 
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2293&leg=0


I wouldn't imagine it's worth the extra cash; you could OC it yourself to that level or even just flash the OC BIOS onto the card (not sure how flashing works on dual-GPU cards though)


----------



## ColeriaX

Quote:


> Originally Posted by *wermad*
> 
> 7990 + 7970 and 7990 + 7970 + 7970 has been done a few times already. Amd knows this game. Same thing with 6990/6970/6950. Amd knows ppl can crossfire this card with a 290/290x or two more. Why limit the crossfire capability since it is capable of 4-way crossfire? Why give amd customers an excuse to switch sides? Whether the scaling is perfect or not, my point is it should be fairly close if not the same as its single card crossfire setups.
> 
> Well, as Nav said, he'll get you some #s. Based on prior generations, I do expect to fall close to its single card equivalent setups as I've said.
> 
> edit:
> 
> Here is a review I found 295x2 +290x:
> 
> http://www.tweaktown.com/articles/6265/amd-radeon-r9-295x2-8gb-and-r9-290x-4gb-video-cards-in-crossfirex/index3.html


I thought there would be some driver issues... and there were. +rep for the article wermad, thanks.


----------



## shadow85

Anyone got any benchmarks of the sapphire oc vs the non oc?

And temp difference under load?


----------



## Jpmboy

Hey OP... where's the " R9 295x2 Owners Club Member " banner?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey OP... where's the " R9 295x2 Owners Club Member " banner?


Oh you mean the code for that?


----------



## Jpmboy

copy from my sig if you want.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> copy from my sig if you want.


Done


----------



## NavDigitalStorm

Here is the first custom R9 295X2.


----------



## shadow85

Where did you get the pic? What brand? Did they really scrap the hybrid cooler for fans?? The hybrid was already very good, yeah?


----------



## wermad

Looks like the Devil... the middle pic is the big giveaway with the print: "Devil 13"


----------



## ColeriaX

Anyone get a full cover block on their card yet?


----------



## Jpmboy

A Koolance block should arrive this week... UPS has it for Friday.

Hopefully earlier.


----------



## ColeriaX

Quote:


> Originally Posted by *Jpmboy*
> 
> A koolance block should arrive this week... ups has it for Friday
> 
> 
> 
> 
> 
> 
> 
> Hopfully earlier.


Nice man. I'm going to be ordering the parts for my loop soon. Is there a general consensus on the best high-FPI 30mm-thick 240 rads? I saw another awesome build using ST30s, however he is only running his fans at 5V... I run all mine at 12V,

so max FPI and cooling performance is all I care about, given the amount of watts I'll have to be dissipating


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> Nice man. I'm going to be ordering the parts for my loop soon. Is there a general consensus for the best high FPI 30mm thick 240 rads? I saw another awesome build with them using ST30's however he is only running his fans at 5V...I run all my at 12
> 
> 
> 
> 
> 
> 
> 
> so max FPI and cooling performance is all I care about with the amount of watts Ill have to be dissipating


try: http://martinsliquidlab.org/


----------



## blue1512

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Here is the first custom R9 295X2.


This card has an 8+8+8+8 power setup.
Definitely a better choice than the ref once a block is put on it.


----------



## Jpmboy

Quote:


> Originally Posted by *blue1512*
> 
> This card has 8+8+8+8 power setup
> Definitely a better choice than the ref when a block is put on it.


Price??


----------



## wermad

Quote:


> Originally Posted by *blue1512*
> 
> This card has 8+8+8+8 power setup
> Definitely a better choice than the ref when a block is put on it.


It's a custom design and a very small-quantity boutique card, so the likelihood of a block is very slim. The only reason its predecessor got a block was the absence of a reference 7990 at the time of the Devil 7970X2 launch, so EK was happy to sell one to owners.


----------



## Yuri_RP

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I wouldn't imagine it's worth the extra cash, you could OC it yourself to that level or even just flash the OC bios onto the card (not sure how flashing works on Dual GPU cards though)


Hello, I'm also planning on ordering Sapphire R9 295X2 OC Edition. Aside from the Limited Edition Metal Box, is there any significant advantage over the normal one?

Can we just OC the Ref to the same level as the OC Edition?
Has anyone tried this or can confirm this?


----------



## Paul17041993

Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello, I'm also planning on ordering Sapphire R9 295X2 OC Edition. Aside from the Limited Edition Metal Box, is there any significant advantage over the normal one?
> 
> Can we just OC the Ref to the same level as the OC Edition?
> Has anyone tried this or can confirm this?


The OC edition just looks to be a very mild bump, so yeah, you can (no guarantees though). Most 290Xs go to 1100MHz on the core without extra voltage; a memory OC, however, usually needs a little bump in the core voltage.


----------



## shadow85

The mem clock is +200MHz on the OC edition; is this a noticeably better increase in performance?


----------



## Paul17041993

Quote:


> Originally Posted by *shadow85*
> 
> The mem clock is +200 mhz more on the oc edition, is this a noticeable better increase in performance?


Unless you're on a 4K screen or higher (e.g. 1440p Eyefinity), no, not really. The memory on Hawaii is already very fast at stock; OCing it only really helps bench numbers for the most part, but 4K and Eyefinity setups can gain a few FPS in heavy games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> the OC edition just looks to be a very mild bump, so yea you can, no guarantees though, but most 290Xs go to 1100 on the core without extra voltage, memory OC however usually needs a little bump in the core voltage.


It's an extra 12MHz on the core and 50MHz on the memory; I've no doubt a vanilla 295X2 could do that on stock voltage, worst case add +10mV
Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello, I'm also planning on ordering Sapphire R9 295X2 OC Edition. Aside from the Limited Edition Metal Box, is there any significant advantage over the normal one?
> 
> Can we just OC the Ref to the same level as the OC Edition?
> Has anyone tried this or can confirm this?


I honestly can't see one. XFX did the same thing with the reference-design 290/X's: everything is the same, just a small factory overclock
Quote:


> Originally Posted by *shadow85*
> 
> The mem clock is +200 mhz more on the oc edition, is this a noticeable better increase in performance?


It's 5200MHz effective memory, which means 1300MHz actual, vs 5000MHz effective = 1250MHz actual.

It's a very minor bump, and as Paul said, only Eyefinity or 4K setups would benefit from it
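For anyone wondering where those two numbers come from: GDDR5 is quad-pumped, so the advertised "effective" rate is 4x the actual memory clock. A tiny converter (the function name is mine):

```python
# GDDR5 transfers four data words per memory-clock cycle,
# so marketing "effective MHz" = 4 x actual clock.

def gddr5_actual_mhz(effective_mhz):
    """Actual GDDR5 memory clock from the quoted effective rate."""
    return effective_mhz / 4

print(gddr5_actual_mhz(5200))  # 1300.0  (Sapphire OC edition)
print(gddr5_actual_mhz(5000))  # 1250.0  (reference 295X2)
```

Which is also why the "+200MHz" on the box is really only a +50MHz bump on the actual clock.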


----------



## Jpmboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's an extra 12Mhz on the core and 50Mhz on the memory, I've no doubt a vanilla 295x2 could do that on stock voltage, *worst case add +10mV*
> I honestly can't see one, XFX did the same thing with the Reference design 290/x's everything is the same just a small factory overclock
> it's 5200Mhz effective memory which means 1300Mhz vs 5000Mhz effective = 1250Mhz
> It's a very minor bump and as paul said only eyefinity or 4k setups would benefit from it


I'd love to know how to add mV to the core or memory.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jpmboy*
> 
> I'd love to know how to add mV to the core or memory.


295x2 should have voltage support soon









took a couple of weeks for the 290/x's to get it as well.


----------



## Jpmboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 295x2 should have voltage support soon
> 
> 
> 
> 
> 
> 
> 
> 
> 
> took a couple of weeks for the 290/x's to get it as well.


I hope so!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jpmboy*
> 
> I hope so!


As do I









Can't wait to see the numbers you guys get outta these things


----------



## shadow85

I have never overclocked a GPU before. Is 10mV a lot, or a small amount?

So pretty much the Sapphire OC edition will be exactly the same as the non-OC edition, except its BIOS will have it pre-overclocked?

So we could just do the same OC on the normal edition and it will have exactly the same performance, noise, power draw and temps as the OC edition?


----------



## The Mac

I'm pretty sure the Sapphire has upgraded components as well.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *The Mac*
> 
> im pretty sure the sapphy has upgraded componants as well.


It's just a nicer box and higher factory clock I believe.


----------



## axiumone

Yeah, I really, really doubt that any of the components were changed or improved for such a low factory overclock.


----------



## NavDigitalStorm

Working on 295X2 + 290X Hybrid right now.


----------



## shadow85

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, I really, really doubt that any of the components were changed or improved for a such a low factory overclock.


Yeah, I find it unreal to charge a $100 premium for a small factory overclock if no other components were changed.

At least with EVGA they only charged around $20 more for the SC ACX edition of my last graphics card, and they put a better cooler on it too.


----------



## levism99

Just got my card


----------



## NavDigitalStorm

Quote:


> Originally Posted by *levism99*
> 
> Just got my card


Added to the roster!


----------



## NavDigitalStorm

http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/


----------



## wermad

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/


Awesome! Any possibility of 4k to directly compare to the 295x2 crossfire and 295x2 reviews?


----------



## Torvi

How would it work against the GTX Titan? Prices are very close; on uk.pcpartpicker.com the Titan is actually cheaper. I would like to see a comparison between those two.


----------



## Paul17041993

Quote:


> Originally Posted by *shadow85*
> 
> I have never ovrrclocked a GPU before. Is 10mV alot, or a small amount?
> 
> So pretty much the Sapphire OC edition will be exactly the same as the nonOC edition, except its BIOS will have it pre overclocked?
> 
> So we could just do the same OC on the normal edition and they will have exactly same performance, noise, power draw and temps as the OC edition?


10mV is basically nothing. I've bumped my 290X by +200mV to get as high as 1200/1600 "stable"; however it gets very hard to cool at that voltage, so I'm waiting for a water loop before setting it there or higher.

The main thing with Hawaii is to watch PCB and VRM temperatures, and to watch power draw if you don't want it pulling too much. The example OC above (1200/1600) pulls ~350W vs only ~180W at stock; watercooling, however, drops the power draw by a significant amount.
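To get a feel for why a +200mV bump balloons the power draw like that, here's a rough sketch using the classic dynamic-power rule of thumb (P scales with f x V^2). The stock voltage and wattage below are illustrative assumptions, not measurements, and leakage at high temperatures adds even more on top:

```python
# Rough dynamic-power scaling for a CMOS chip: P ~ f * V^2.
# Ignores leakage, which grows with temperature on Hawaii.

def scaled_power(p0, f0, v0, f1, v1):
    """Estimate power at new clock f1 and voltage v1 from a baseline."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Hypothetical baseline: 250W at 1000MHz / 1.20V, pushed to 1200MHz / 1.40V
print(round(scaled_power(250, 1000, 1.20, 1200, 1.40)))  # 408
```

Even this leakage-free estimate shows why watercooling (which cuts temps, and with them leakage) measurably drops the draw at the same clocks.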


----------



## NavDigitalStorm

Quote:


> Originally Posted by *wermad*
> 
> Awesome! Any possibility of 4k to directly compare to the 295x2 crossfire and 295x2 reviews?


When I get a chance, definitely.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/


nice work and review. Great numbers from the hybrid tri-fire!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> nice work and review. Great numbers from the hybrid tri-fire!


Thanks, I'll be updating the OP with all the benchmark tests I've done.


----------



## ColeriaX

So I know this may not be a huge deal, but I wasn't happy with my Firestrike score. After a few hours of tweaking I cracked 17k with my 2600K at 5.1GHz







295X2 needs voltage control!!!!


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> So I know this may not be a huge deal, but I wasn't happy with my Firestrike score. After a few hours of tweaking I cracked 17k with my 2600k at 5.1 Ghz
> 
> 
> 
> 
> 
> 
> 
> 2*95X2 needs voltage control!!!!
> *


Great score!! yup, we need an unlock!!


----------



## ssiperko

Quote:


> Originally Posted by *ColeriaX*
> 
> So I know this may not be a huge deal, but I wasn't happy with my Firestrike score. After a few hours of tweaking I cracked 17k with my 2600k at 5.1 Ghz
> 
> 
> 
> 
> 
> 
> 
> 295X2 needs voltage control!!!!


Well heck man ..... makes me feel good about my system!











2x R9 290's.









SS


----------



## wermad

Quote:


> Originally Posted by *ssiperko*
> 
> Well heck man ..... makes me feel good about my system!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2x R9 290's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SS


290s are beastly







. Extremely difficult fighting the temptation when they're going for ~$260 on ebay







. Sweet score btw


----------



## Majin SSJ Eric

Are they seriously going for $260???


----------



## ColeriaX

Quote:


> Originally Posted by *ssiperko*
> 
> Well heck man ..... makes me feel good about my system!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2x R9 290's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SS


Wow that's an insane score







What 290's are those? Are they unlocked to X's? What were your clocks for that run?


----------



## Jpmboy

Quote:


> Originally Posted by *ColeriaX*
> 
> Wow that's an insane score
> 
> 
> 
> 
> 
> 
> 
> What 290's are those? Are they unlocked to X's? What were your clocks for that run?


Once I flashed my 290X to the ASUS unlocked BIOS, it was a very fast card (Hotrod717 now owns it)... unlock these 295X2s and they will fly. Routine 1700MHz memory at stock volts!


----------



## hotrod717

Quote:


> Originally Posted by *blue1512*
> 
> This card has 8+8+8+8 power setup
> Definitely a better choice than the ref when a block is put on it.


Have to guess it's TUL's design (PowerColor/Club3D). Wonder if it's their Devil???









^On cue and didn't even know it. Lol, taking a backseat to my Lightning at the moment.







One beautiful thing about AMD dual-GPU cards: they can actually OC pretty well. Don't think the 295X2 will be any different. Remember when a 6990 at 1000+MHz was a big deal? Just 4 years ago.


----------



## Majin SSJ Eric

Well, I did find a 290 on eBay going for $285, but it's the stupid reference cooler. How do you think the reference 290 would do with folding and limited benching? I'm really wanting to retire my 580 Lightning to the wall, but I don't want the thing running at 95C constantly and being all loud. The 580 Lightning generally runs around 73C and is not too loud during folding...


----------



## Jpmboy

Buy a cheap 290 "convertible" and hybrid CFX with that 295X2! Although I think the 290(X) vram will be a bit of an anchor.


----------



## gerardfraser

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Well I did find a 290 on Ebay going for $285 but its the stupid reference cooler. How do you think the reference 290 would do with folding and limited benching. I'm really wanting to retire my 580 Lightning to the wall but I don't want to have the thing running 95C constantly and being all loud. The 580 Lightning generally runs around 73C and is not too loud during folding...


R9 290 CrossFire run, one reference and one non-reference GPU. For the price, the cheap cards are a no-brainer.
Also, not all reference cards are loud or hit max temps.

http://www.3dmark.com/3dm/2833578

Here is what temperatures look like with small overclock on the cards.

Crysis 3 runs so cards are working hard.

*R9 290 Crossfire Custom Fan Profile*


*R9 290 Crossfire Auto Fan profile*


----------



## Majin SSJ Eric

That's a fantastic FS score! It beat the graphics score I got with my Titans posted in my sig! Didn't realize the 290s were so strong in FS.


----------



## wermad

Yup, $250 + $11 shipping. There's a few sitting under $300 with Buy It Now. Keep an eye out for the auction ones.

Saw a 295X2 go for $1350... but seeing 290s sitting under $300 makes it a tough choice.


----------



## Majin SSJ Eric

Wonder how much stronger a folder the 290 is than my 580 Lightning? Or in gaming?


----------



## gerardfraser

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> That's a fantastic FS score! Beat the Graphics score I got with my Titans posted in my sig! Didn't realize the 290's were so strong in FS?


Nvidia and AMD both have some good cards out there. The R9 290 CrossFire also beat my 780 Ti SLI on the same machine. Gaming-wise they trade blows; turn off the FPS counter in games and you could not tell the difference.
I would love to buy a 295X2 and tri-fire the card, but for me it does not make sense.
I'd love to see more tri-fire benches from real users.


----------



## ssiperko

Quote:


> Originally Posted by *ColeriaX*
> 
> Wow that's an insane score
> 
> 
> 
> 
> 
> 
> 
> What 290's are those? Are they unlocked to X's? What were your clocks for that run?


One HIS and one Sapphire. I wish they were both HIS, as that card will do 1225/1625 all day... neither will unlock, but oh well.
That run was 1175/1500.

The physics scores REALLY bump the overall.

If I could get 5.0 outta this 4770K, I bet I could hit 17600 or better.

If I could score another gem like this HIS, I'd be pulling 17600 at 4.8, I bet.









My best single run was 1250/1700 at 4.9 with my HIS.











SS


----------



## Cool Mike

Hope to see voltage control soon. MSI is selling their version of the reference 295X2 on Newegg. Maybe a new beta Afterburner is coming.


----------



## Yuri_RP

Hello,

I currently have 2 x PowerColor R9 290 PCs+ in crossfire. I want to ask if it is worth it to upgrade to R9 295X2? Been pondering it this past week and already got my eye on it.

I would like your opinion on this. I'm not overclocking them right now (never done it before).

Also, if it is worth buying the R9 295X2, I have a few options in my country: MSI for USD 1610, PowerColor for USD 1630, Gigabyte for USD 1670 and the Sapphire OC Edition for USD 1760. The only stock currently ready is the MSI; the others must be ordered in first.

Is there a specific "better" brand? Or I can just go along with any of them?

Sorry for maybe asking too much, wouldn't want to regret it later, as this is going to be my priciest VGA yet.

Thank you.


----------



## gerardfraser

Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello,
> 
> I currently have 2 x PowerColor R9 290 PCs+ in crossfire. I want to ask if it is worth it to upgrade to R9 295X2? Been pondering it this past week and already got my eye on it.
> 
> I would like your opinion on this. I'm not overclocking them right now (never done it before).
> 
> Also, if it is worth buying R9 295X2, I have few options in my country. MSI for USD 1610, PowerColor for USD 1630, Gigabyte for USD 1670 and the Sapphire OC Edition for USD 1760. The currently ready stock is only the MSI, the other must indent first.
> 
> Is there a specific "better" brand? Or I can just go along with any of them?
> 
> Sorry for maybe asking too much, wouldn't want to regret it later, as this is going to be my priciest VGA yet.
> 
> Thank you.


No, not worth buying the R9 295X2. You already have great cards. Also, look back one page: my R9 290 cards kick ass. No need to buy this card.

Reason being, you can buy four R9 290X cards for cheaper than one R9 295X2 card. Just saying, that's all.
Gigabyte GV-R929XOC-4GD R9 290X - 408 Canadian or 372 US


----------



## Yuri_RP

In my country (Indonesia), non-reference R9 290X cards cost upwards of USD 625 and are very rare to find, while the reference card coolers are rubbish.

So, what about performance? Do 2x R9 290X in CrossFire have the same performance as a single R9 295X2?


----------



## King4x4

If they are non-reference cooled and clocked to 1018MHz? Definitely!


----------



## Yuri_RP

Okay, so what do you suggest?

Should I stay with my 2x PowerColor R9 290 PCS+, switch to 2x non-reference R9 290X, or switch to this R9 295X2?

I'm primarily using it for gaming, and I'm planning to use 3 QNIX for Eyefinity.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Yuri_RP*
> 
> Okay, so what do you suggest?
> 
> Should I stay with my 2 x PowerColor R9 290 PCS+, or should I switch to 2 x Non Reference R9 290X, or should I switch to this R9 295x2?
> 
> I'm primarily using it for gaming, and I'm planning to use 3 QNIX for Eyefinity.


Personally, I would just stick with what you have and wait till the next refresh.


----------



## Yuri_RP

Hello,

I guess I will still buy the R9 295X2. So now the question is, which brand should I choose? Currently considering the cheapest and in-stock MSI, with a 2-year warranty. I don't know about the other brands' warranties though. The card hasn't officially gone on sale in Indonesia yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello,
> 
> I guess I will still buy the R9 295X2. So now the questions is, which brand should I choose? Currently considering the cheapest and ready stock MSI, 2 years warranty. Don't know about the other brand warranty though. The card haven't officially sold in Indonesia yet.


you really have no reason to sidegrade to a 295x2.

Considering what you are running now it's just a waste of money.


----------



## Yuri_RP

Quote:


> Originally Posted by *Sgt Bilko*
> 
> you really have no reason to sidegrade to a 295x2.
> 
> Considering what you are running now it's just a waste of money.


Yes. But I just can't resist it.








Bought the MSI R9 295X2, will arrive in 3 days tops. Will share later.

The 2x PowerColor R9 290 PCS+ practically blocked all but one of my PCIe slots. They are huge. I was barely able to put the ASUS Xonar Phoebus there; it grazes the backplate of the 2nd PCS+. Afraid it might get scratched or get too hot.








Would like to be able to clear some room, maybe put a USB 3.0 card or maybe some Wireless adapter.









Thanks for all the input.

I have one more question.
As I am using a HAF X case, and also a H100i closed-loop liquid cooler, what airflow configuration is best for the case?
I'm currently planning on mounting the R9 295X2 cooler as an exhaust at the back, and the H100i as an exhaust at the top.
Intake would be from the front and side fans. I thought that because the heat would be at the radiator, it would be best to push it out instead of taking it in.

Any suggestions?

Edit: the H100i rad will be in a push/pull config; I may make the 295X2 rad push/pull also.


----------



## Cool Mike

One key advantage you will have with the 295x2 is less heat in your case and more room in your case. Most of the heat will be exhausted outside via the radiator. Cooler gpu temps also.


----------



## Yuri_RP

Quote:


> Originally Posted by *Cool Mike*
> 
> One key advantage you will have with the 295x2 is less heat in your case and more room in your case. Most of the heat will be exhausted outside via the radiator. Cooler gpu temps also.


Yes, I already boxed up my PowerColor R9 290 PCS+ cards. Gonna sell them.








Can't wait for the MSI R9 295X2 to arrive..

Some QC Pic from the seller..


----------



## lowgun

Quote:


> Originally Posted by *wermad*
> 
> Yup, $250 + $11 shipping. Theres a few sitting under $300 with buy it now. Keep an eye or for the auction ones.
> 
> Saw a 295x2 go $1350... but seeing 290s sitting under $300 is a tough choice.


I got that $1350 295x2











When I saw one that cheap on eBay, and local no less, I figured I had to do it. I've been running dual 780s in SLI; gonna see if it's worth my time to switch over from Nvidia.


----------



## Yuri_RP

Quote:


> Originally Posted by *lowgun*
> 
> I got that $1350 295x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I saw one for that cheap on eBay, and local no-less, figured I had to do it. I've been running dual 780's in SLI, gonna see if its worth my time to switch over from Nvidia.


Wow, wish I had that price in my country..


----------



## Roikyou

Already tore mine apart, going to plumb it tomorrow when I can dedicate the day to run acrylic to it.


----------



## axiumone

Nice, is it heavier or lighter with a full block?


----------



## Roikyou

Quote:


> Originally Posted by *axiumone*
> 
> Nice, is it heavier or lighter with a full block?


Close to the same weight but a little heavier. I've found Koolance products to be very well made, very solid.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *lowgun*
> 
> I got that $1350 295x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I saw one for that cheap on eBay, and local no-less, figured I had to do it. I've been running dual 780's in SLI, gonna see if its worth my time to switch over from Nvidia.


Quote:


> Originally Posted by *Roikyou*
> 
> Already tore mine apart, going to plumb it tomorrow when I can dedicate the day to run acrylic to it.


Added both to roster!


----------



## Elmy

Quote:


> Originally Posted by *Roikyou*
> 
> Close to the same weight but a little heavier. I've found Koolance products to be very well made, very solid.


Seeing as you're from Oregon... you ever go to PDXLAN?


----------



## Roikyou

Quote:


> Originally Posted by *Elmy*
> 
> Seeing as you're from Oregon... you ever go to PDXLAN?


I've heard of it and seen it. A group I've gamed with a few times (though not recently) goes, but I've yet to make time to go up there. I've got a CaseLabs TH10, so I can't see dragging that two hours to Portland and then trying to pack it in the back of a Ford Focus hatchback. I'd like to go though...


----------



## EliteReplay

Quote:


> Originally Posted by *Roikyou*
> 
> Already tore mine apart, going to plumb it tomorrow when I can dedicate the day to run acrylic to it.
> 
> 
> Spoiler: Warning: Spoiler!


just


----------



## Elmy

Quote:


> Originally Posted by *Roikyou*
> 
> I've heard of it and seen it. A group I've gamed with a few times (though not recently) goes, but I've yet to make time to go up there. I've got a CaseLabs TH10, so I can't see dragging that two hours to Portland and then trying to pack it in the back of a Ford Focus hatchback. I'd like to go though...


I live in Seattle and I drag my big ass computer up there along with 5 monitors 3 times a year LoL.... It's a lot of fun, you should do it. You can hang out with me if you don't know anyone else going.

Here is a video I took of PDXLAN last July.


----------



## Elmy

http://s1126.photobucket.com/user/Elmnator/media/20140506_162836.jpg.html

This just came in the mail today. WOOT!!!

Can I be in the Club? 

2nd one should be here in the next couple weeks


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> This just came in the mail today. WOOT!!!
> 
> Can I be in the Club?
> 
> 2nd one should be here in the next couple weeks


Which model?


----------



## Elmy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Which model?


AMD


----------



## ssiperko

Quote:


> Originally Posted by *Elmy*
> 
> AMD



















because at this point it doesn't REALLY matter









SS


----------



## ssiperko

Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello,
> 
> I currently have 2 x PowerColor R9 290 PCs+ in crossfire. I want to ask if it is worth it to upgrade to R9 295X2? Been pondering it this past week and already got my eye on it.
> 
> I would like your opinion on this. I'm not overclocking them right now (never done it before).
> 
> Also, if it is worth buying R9 295X2, I have few options in my country. MSI for USD 1610, PowerColor for USD 1630, Gigabyte for USD 1670 and the Sapphire OC Edition for USD 1760. The currently ready stock is only the MSI, the other must indent first.
> 
> Is there a specific "better" brand? Or I can just go along with any of them?
> 
> Sorry for maybe asking too much, wouldn't want to regret it later, as this is going to be my priciest VGA yet.
> 
> Thank you.


Look at my scores (like they really matter in the real world) and make a tough choice ..... sell yer stuff (at a loss now) and spend another $800 (+/-) for the coolness or not.








Me being old and wishing to pass fiscal responsibility on to today's youth, I suggest you keep the (overkill) setup you have and just revel in your own economic genius.









SS


----------



## ssiperko

Quote:


> Originally Posted by *Cool Mike*
> 
> One key advantage you will have with the 295x2 is less heat in your case and more room in your case. Most of the heat will be exhausted outside via the radiator. Cooler gpu temps also.


For those of us in CF mode you can add A LOT of cooling to yer system for $800 + .................. just sayin!









SS


----------



## ssiperko

Quote:


> Originally Posted by *Sgt Bilko*
> 
> you really have no reason to sidegrade to a 295x2.
> 
> Considering what you are running now it's just a waste of money.


Some peoples children have less carnal capacity than that from which they waz bread.









SS


----------



## NavDigitalStorm

@Elmy Added


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> Already tore mine apart, going to plumb it tomorrow when I can dedicate the day to run acrylic to it.
> 
> 
> Spoiler: Warning: Spoiler!


very jelly - my koolance block won't get here until friday









which T-pads did you use?


----------



## Jpmboy

Quote:


> Originally Posted by *ssiperko*
> 
> Some peoples children have less carnal capacity than that from which they waz bread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SS


Gotta laugh at your sig line.


----------



## Roikyou

Quote:


> Originally Posted by *Jpmboy*
> 
> very jelly - my koolance block won't get here until friday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> which T-pads did you use?


Stock pads from Koolance. They come with 1mm and .7 with plenty to spare and thermal paste in a disposable baggy... Surprised when I saw it.


----------



## Majin SSJ Eric

Starting to get a little jelly of all you guys in this club...


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Starting to get a little jelly of all you guys in this club...


You should join


----------



## Majin SSJ Eric

I don't think I'd replace my Titans with this card but I am definitely eyeing the R9 290/290X as a replacement for my soon-to-be-retired GTX 580 Lightning in my folding rig. I really like to have rigs with both Nvidia and AMD GPU's in them just to cover as many bases as possible. If the 295X2 were priced at a bit of a discount to two 290X's I'd definitely consider it...


----------



## Sgt Bilko

Update on the EK Block........(Soon)


----------



## shadow85

Seeing all these water blocks makes me wanna switch over to a water setup, never done it, but it just looks kool!


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> Stock pads from Koolance. They come with 1mm and .7 with plenty to spare and thermal paste in a disposable baggy... Surprised when I saw it.


Ugh, 0.7mm, probably the only Fuji extremes I do not have! 1mm for vrms, right?


----------



## Jpmboy

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> I don't think I'd *replace my Titans with this card* but I am definitely eyeing the R9 290/290X as a replacement for my soon-to-be-retired GTX 580 Lightning in my folding rig. I really like to have rigs with both Nvidia and AMD GPU's in them just to cover as many bases as possible. If the 295X2 were priced at a bit of a discount to two 290X's I'd definitely consider it...


That's my concern too, Eric. So far the 295x2 (even at 1100/1700) pushes significantly lower fps than sli titans @ 4K rez in my hands. But that's comparing unlocked titans to a stock-volt 295x2. At bone stock with the OEM bios, the 295x2 is faster, but not so once you load a mod bios (svl7 or skyn3t) and unlock the NCP vrms.


----------



## Yuri_RP

Will there be unlocked bios for 295x2?


----------



## Jpmboy

Quote:


> Originally Posted by *Yuri_RP*
> 
> Will there be unlocked bios for 295x2?


without it, better off buying 290s or 290x's


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Yuri_RP*
> 
> Will there be unlocked bios for 295x2?


Only a matter of time I suppose.


----------



## Roikyou

Quote:


> Originally Posted by *Jpmboy*
> 
> Ugh, 0.7mm, probably the only Fuji extremes I do not have! 1mm for vrms, right?


Hope this helps, the ones with the boxes around the definitions are the 1mm, including the two strips in the middle, and the rest are .7mm



Side notes: the back plate goes right back on, so no additional back plate needed, as you probably already saw; uses the same thermal pads, pretty straightforward.


----------



## Elmy

Been getting some high frametimes (over 20ms constant) and some super bad stuttering.... Only tested for about 20 mins in BF4 and 20 mins in Titanfall (they both have the same problem). Have some more testing tonight to try to figure out what's going on.

JFYI

295X2 tri-fired with 290X
Sabertooth Z87
4770K
5 Asus VG248QE's running @ 5400X1920 @ 144Hz
using 4 mini-displayport to standard displayport cables and one Dual-Link DVI
NZXT Hale V2 1200 Watt PSU


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> Hope this helps, the ones with the boxes around the definition's are the 1mm, including the two strips in the middle and the rest are .7mm
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Side notes, the back plate goes right back on, so no additional back plate need as you probably already saw, uses same thermal pads, pretty straight forward.


Thanks! The vrms use 1mm, so I'm good there. Gotta order some 0.7 Fuji... or just use the Koolance T-pads.
+1


----------



## Redeemer

Quote:


> Originally Posted by *EliteReplay*
> 
> yeah, you know what people do when they see something about AMD... they just get full of hate.


That's because AMD actually has amazing technology and the haterz know it

PS I am a 780TI owner


----------



## Jpmboy

Interesting... FrozenCPU sells only one 0.7mm T-pad, the Koolance, and it is 1.5 W/mK. Fuji Extreme is 11 W/mK, and their Ultra is 17 W/mK. The stock Koolance pads are pretty poor if the thermal conductivity data is correct.


----------



## Roikyou

So, got the card in. Tried Dark Souls 2, since Nvidia wasn't able to render the 3840 x 2160. Good news: the 295 ran it, no problem, other than under that load the card starts making this buzzing noise. I've confirmed that the card maxes at 50c under load and the water doesn't get any hotter than 30 as per inline temps. I tried this with my AX850 Corsair (yes, probably not enough wattage), so I figured since I had the Cooler Master V1000 sitting here, go ahead and give it a try: same noise under heavy load. I talked with XFX support, no known issues as they don't have one to test and it's too new... but the tech said he's heard coil whine in the past with previous versions. Anyone heard of this? My case is pretty dead silent, so it's not like you're going to hear the fan on the card, because there isn't one.



I honestly didn't like the Cooler master because of the video card cables, don't know why they split them, who would split them on a high end card?



Yes, it's a crazy water work conglomeration with a blue tint since I didn't get all the previous UV blue out, my fault but after it ran for a bit, the color evened out. Going back to Corsair with the 1200i as they've been my favorite for years, thought I would try something different but...

Any ideas or thoughts on the card noise would be appreciated.


----------



## Majin SSJ Eric

Not much you can do about coil whine I'm afraid. Sometimes it goes away with usage...


----------



## NavDigitalStorm

I guess coil-whine really is luck of the draw. All three of the cards I had have no issues.


----------



## lowgun

I know this is silly, but does anyone have a picture of their 295x2 CFX'd with a reference 290x? I'm thinking of buying a 290x to pair with mine, and would just like to see what it looks like in a case.


----------



## lowgun

Also, I need everyone's opinion. I've got a mildly overclocked 4820k, my 295x2, and possibly a 290x to add. Will my 1200w Corsair AX1200 be enough juice?


----------



## The Mac

Should be. One 295x2 and a decent CPU overclock should be around 750w; adding a 290x should put you around 1000w.

that should leave you some room to OC the cards
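Those numbers are easy to sanity-check with a back-of-the-envelope sum (a sketch only — the wattages below are assumed ballpark TDPs, not measured draw):

```python
# Rough system power budget (sketch -- figures are assumed TDPs,
# not measured values for any specific rig).
def system_draw(gpu_watts, cpu_watts=150, rest_watts=100):
    """Worst-case GPU sum plus CPU plus a flat budget for board/drives/fans."""
    return sum(gpu_watts) + cpu_watts + rest_watts

base = system_draw([500])          # one 295x2 (~500 W board power)
trifire = system_draw([500, 250])  # plus a 290x (~250 W)
print(base, trifire)  # 750 1000 -- both inside an AX1200 with OC headroom
```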


----------



## Roikyou

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Not much you can do about coil whine I'm afraid. Sometimes it goes away with usage...


Thanks for the input everyone. Hoping it goes away also, but so far it's of course a better performer than my 780 ti classy, and I have had no 2160 resolution issues so far (unlike what I had with the 780 ti classy).


----------



## shadow85

Does a 295x2 and a 290x in CF give you a total of 12GB ram, or would it still be 8GB?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *shadow85*
> 
> Does a 295x2 and a 290x in CF give you a total of 12GB ram, or would it still be 8GB?


Technically, each GPU still only has 4.
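In AFR crossfire every GPU renders from its own full copy of the textures and buffers, so the usable pool is the smallest per-GPU amount, not the sum — a trivial sketch:

```python
# Crossfire (alternate-frame rendering) mirrors frame data on every GPU,
# so effective VRAM is the smallest per-GPU pool, not the total.
def effective_vram(per_gpu_gb):
    return min(per_gpu_gb)

# 295x2 (two 4 GB GPUs) trifired with a 4 GB 290x:
print(effective_vram([4, 4, 4]))  # 4, not 12
```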


----------



## shadow85

I think I'm going to wait till the Samsung UD970 4k monitor comes out before I get a 295x2. By then there should be some more variety of 295x2's available.

No point in getting one now if I am not going to play games at 4k or benchmark.


----------



## Roikyou

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Not much you can do about coil whine I'm afraid. Sometimes it goes away with usage...


So the noise under heavy load is starting to subside. Going to wire up the 1200i corsair this evening and done for now... Thanks again for the input everyone.

Oh, in case you want to see the 295x2 next to a 780 ti classy with EK waterblock. Going to give you the quick run down from start to end...


----------



## Majin SSJ Eric

Those are a couple of sweet cards!!!


----------



## Jpmboy

Got the Koolance block today... hope I wasn't too distracted by the ranger-penguins game











Seems to be a well-constructed block; I only wish Koolance had milled it a little "closer" so it did not require 1mm T-pads.


----------



## hotrod717

Quote:


> Originally Posted by *Jpmboy*
> 
> Got the Koolance block today... hope I wasn't too distracted by the ranger-penguins game
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> seems to be a well constructed block, I only wish Koolance had milled it a little "closer" so it did not require 1mm T-pads.


Nice looking block! Is that blk/blk or blk/SS? That scene makes me want to pull apart my wb and reinstall it. Lol. Exactly what my coffee table looks like when I'm done too.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Jpmboy*
> 
> Got the Koolance block today... hope I wasn't too distracted by the ranger-penguins game
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> seems to be a well constructed block, I only wish Koolance had milled it a little "closer" so it did not require 1mm T-pads.


Welcome to the DARK SIDE! now up that voltage!


----------



## Majin SSJ Eric

Well??? Results????


----------



## Jpmboy

Quote:


> Originally Posted by *hotrod717*
> 
> Nice looking block! Is that blk/blk or blk/ ss? That scene makes me want to pull part my wb and reinstall it. Lol Exactly what my coffee table looks like when I'm done too.


It's black/SS. And powercolor was good enough not to cover any screws with stickers.








Quote:


> Originally Posted by *DeadlyDNA*
> 
> Welcome to the DARK SIDE! now up that voltage!
> 
> 
> Spoiler: Warning: Spoiler!


"come into the light...." would love to up the mV... *HOW????*
Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Well??? Results????


gonna be a day before I open this TJ09, drain, and pull the sli titans. I need to do some stuff with my wife's tax PC (yeah, I took the sammy 4k60 back and gave her my HP30zrw). I set it up as a 5 disk raid 10 (4+1 spare) and need to reconfigure it to a raid 5. Not hard, just want a full B/U before the event. That's what happens when someone can google for "information": she's convinced raid 5 has better data security than 10 (or raid 0+1 as it actually is).
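For anyone weighing the same choice, the tradeoff is easy to put in numbers (a sketch with generic n-disk arrays — it ignores rebuild time, which is raid 5's real weak spot):

```python
# Usable capacity (TB) and guaranteed fault tolerance for n equal disks.
# Sketch only: real-world comparisons also hinge on rebuild time and URE risk.
def raid5(n, disk_tb):
    # one disk's worth of parity; survives any single failure
    return (n - 1) * disk_tb, 1

def raid10(n, disk_tb):
    # mirrored pairs; guaranteed to survive 1 failure,
    # up to n//2 if the losses land in different pairs
    return (n // 2) * disk_tb, 1

print(raid5(4, 1), raid10(4, 1))  # raid 5 gives more space, not more safety
```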


----------



## Roikyou

Quote:


> Originally Posted by *Jpmboy*
> 
> It's black/SS. And powercolor was good enough not to cover any screws with stickers.
> 
> 
> 
> 
> 
> 
> 
> 
> "come into the light...." would love to up the mV... *HOW????*
> gonna be a day before I open this TJ09, drain and pull the sli titans. . I need to do some stuff with my wife's tax PC (yeah, I took the sammy 4k60 back and gave her my HP30zrw). I set it up as a 5 disk raid 10 (4+1 spare) and need to reconfigure it to a raid 5. Not hard, just want a full B/U before the event. That's what happens when someone can google for "information": she's convinced raid 5 has better data security than 10 (or raid0+1 as it actually is).


So you're going from SLI Titans to the 295X2? It is nice having the horsepower in one card. Got mine buried in white light and water...


----------



## axiumone

Quick cellphone pic of how it all turned out.



Now the wait begins for a driver version that actually lets me use both of these cards in games with eyefinity. As of right now 14.4 whql has major issues that can only be overcome by disabling the bottom card. Benchmarks work great though... go figure.

Edit - also on a little side note. Pretty disappointed with the fan filters in the corsair 450D, they hardly stop any dust getting through.


----------



## The Mac

It's rather surprising to me the number of people on here who have 3 grand to drop on these.

I would have thought maybe one or two people, and everyone else drooling on their keyboards

lol


----------



## Jpmboy

Okay, got it in (while the raid 10 to 5 was running). Daaum, I wish I had shot the temp of these vrms when the air cooler was on. 1079/1400 clocks on Valley and they hit 80C by IR thermo?? Crap, I don't want to pull this out to check the (Koolance) T-pads, but did use QDCs just in case.








Certainly throws much less heat than sli titans, and only a few fps lower at 4K on the same rig (again, fully unlocked titans, so not quite a fair comparison yet).


----------



## levism99

looks good


----------



## Elmy

Been getting some high frametimes (over 20ms constant) and some super bad stuttering.... Only tested for about 20 mins in BF4 and 20 mins in Titanfall (they both have the same problem). Have some more testing tonight to try to figure out what's going on.
Quote:


> Originally Posted by *axiumone*
> 
> Quick cellphone pic of how it all turned out.
> 
> 
> 
> Now the wait begins for a driver version that actually lets me use both of these cards in games with eyefinity. As of right now 14.4 whql has major issues that can only be overcome by disabling the bottom card. Benchmarks work great though... go figure.
> 
> Edit - also on a little side note. Pretty disappointed with the fan filters in the corsair 450D, they hardly stop any dust getting through.


That is pretty awesome axiumone!!!! Very clean looking setup you got there.

Here is a picture of mine thrown together. Me and Axiumone are on the same page.... with 5X1 and anything more than one 295X2 we are getting crazy frametimes and stuttering, but benchmarking works fine. I am running a 295X2 with a 290X and have to turn off the 290X to play any games..... I am running 5 Asus VG248QE's @ 144Hz and Axiumone is running 5 Eizo's @ 120Hz. It seems to be only a 5X1 problem from what I have seen, unless someone here can speak up differently.... I am impressed with the temps of the 295X2 from the factory. I don't get over 60c gaming with this card and it can barely be heard....

http://s1126.photobucket.com/user/Elmnator/media/20140506_193329.jpg.html


----------



## DeadlyDNA

Quote:


> Originally Posted by *Elmy*
> 
> Been getting some high frametimes over 20ms constant and some superbad stuttering.... Only tested for about 20 mins in BF4 and 20 mins in Titanfall.( They both are have the same problem Have some more testing tonight to try to figure out whats going on.
> That is pretty awesome axiumone!!!! Very clean looking setup you got there.
> 
> Here is a picture of mine thrown together. Me and Axiumone are on the same page.... with 5X1 and anything more than one 295X2 we are getting crazy frametimes and stuttering but benchmarking it works fine. I am running a 295X2 with a 290X and have to turn off the 290X to play any games..... I am running 5 Asus VG248QE's @ 144Hz and Axiumone is running 5 Eizo's @ 120Hz. Its seems to be only a 5X1 problem from what I have seen unless someone here can speak up differently.... I am impressed with the temps of the 295X2 from the factory. I don't get over 60c gaming with this card and it can barely be heard....
> 
> http://s1126.photobucket.com/user/Elmnator/media/20140506_193329.jpg.html


Extremeley nice builds!!!
















14.4whql gave me a lot of issues, but I don't have 295x2's, just 290's.

I made a pic for 14.4 that sums it up well for me, because I am stuck on 13.12whql for now


----------



## axiumone

Quote:


> Originally Posted by *Elmy*
> 
> Been getting some high frametimes over 20ms constant and some superbad stuttering.... Only tested for about 20 mins in BF4 and 20 mins in Titanfall.( They both are have the same problem Have some more testing tonight to try to figure out whats going on.
> That is pretty awesome axiumone!!!! Very clean looking setup you got there.
> 
> Here is a picture of mine thrown together. Me and Axiumone are on the same page.... with 5X1 and anything more than one 295X2 we are getting crazy frametimes and stuttering but benchmarking it works fine. I am running a 295X2 with a 290X and have to turn off the 290X to play any games..... I am running 5 Asus VG248QE's @ 144Hz and Axiumone is running 5 Eizo's @ 120Hz. Its seems to be only a 5X1 problem from what I have seen unless someone here can speak up differently.... I am impressed with the temps of the 295X2 from the factory. I don't get over 60c gaming with this card and it can barely be heard....
> 
> http://s1126.photobucket.com/user/Elmnator/media/20140506_193329.jpg.html


Thanks elmy!








Your's is freaking sweet too. Can't wait to see what you do once you get the blocks for the 295.

I'm very impressed with the temps on the stock cooler. It's actually one of the things that led me to keep the stock cooling this round, instead of going all out on water.

I think it may be a while before we can crossfire with eyefinity working on these cards, just because it's such a limited-appeal setup. I hope I'm wrong and that it gets resolved quickly. In any case, I did submit a few bug reports on AMD's page; hopefully someone reads them.


----------



## Jpmboy

Yeah - I'm liking this 295x2. Played ~2h of BF4 and it's silky smooth @ 4K, FPS never < 70. Just need to figure out a way to cool the vrms better. They get scary hot!


----------



## The Mac

define scary hot...


----------



## lowgun

How are you all tracking the VRM temps anyways?


----------



## Jpmboy

Quote:


> Originally Posted by *The Mac*
> 
> define scary hot...


mid to high 80's on the vrms

gpu temps never higher than 55C
Quote:


> Originally Posted by *lowgun*
> 
> How are you all tracking the VRM temps anyways?


IR thermometer (Fluke)

from guru3D:


----------



## Roikyou

Am I understanding correctly: your thermal pic shows the plate side, not the water block side, so the placement of the Koolance T-pads is going to make the most difference between the chips and the block. That side shows the open-air chips, not even covered by the back plate and only passively cooled, as the hottest point. Does that sound right? And if so, was it their intention that these chips can run that hot? It would be easier for me to understand if we knew the issue before putting the cards back together to take a look. If the T-pads were incorrect, that's possibly the issue, but it's just the two single strips that you cut and place across those chips, and of course they have to be the 1.0T as per the instructions. Wish I had an IR thermometer.


----------



## The Mac

mid to high 80s is pretty normal

over 100 would be scary hot.

lol
.


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> Am I understanding correctly, your thermal pic shows from the plate side, not the water block side, so the placement of the Koolance T pads are going to make the most difference between the chips and block. This side shows the open air chips, not even covered by the back plate and passively cooled as the hottest point. Does that sound right, and if so, was it they're intention that these chips can run that hot? I guess it would be easier for me to understand it more if we knew the issue before we put the cards back together to take a look at it. If the T pads were incorrect, possibly the issue but it's just the two single strands that you cut and placing across those chips and of course, they have to be the 1.0T as per the instructions. Wish I had an IR Thermometer.


Quote:


> Originally Posted by *The Mac*
> 
> mid to high 80s is pretty normal
> over 100 would be scary hot.
> lol
> .


Yes, the backplate side of the vrms. Sure, 80C is within the AOR of those regulators at stock volts... anticipating voltage unlock.


----------



## bencher

Quote:


> Originally Posted by *Jpmboy*
> 
> mid to high 80's on the vrms
> 
> gpu temps never higher than 55C
> IR thermometer (Fluke)
> 
> from guru3D:


You call 80s scary hot? lol


----------



## Jpmboy

Quote:


> Originally Posted by *bencher*
> 
> You call 80s scary hot? lol


Yup, I do when that's happening at 1.2 volts. The AOR - sorry, acceptable operating range - for the NSC or ON Semi vrms tops out 15c higher. How you gonna deal with it at 1.3V? I want to see if I can beat some of my benchmark scores with this card; stock volts don't cut it.


----------



## bencher

Quote:


> Originally Posted by *Jpmboy*
> 
> Yup, i do when thats happening at 1.2 volts. The AOR - sorry - acceptable operating range - for NSC or ON semi vrms tops out 15c higher. How you gonna deal with it at 1.3V? I want to see if i can beat some of my benchmark scores with this card, stock volts don't cut it.


I thought the waterblock you put on it would cool it though.


----------



## Roikyou

Quote:


> Originally Posted by *bencher*
> 
> I thought the waterblock you put on it would cool it though.


Would be my thought also. They know it's an enthusiast card - why would they not at least passively cool the chips, unless they know something we don't? They know people overclock these cards.


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> would be my thought also, they know it's an enthusiast card, why would they not at least passively cool the chips unless they know something we don't, they know people overclock these cards.


The block has a contact milling for the two vrm strips... 1mm T-pads! Koolance should have milled to a bit tighter tolerance. Anyway, I ordered some Fuji Ultra 1mm, see if that helps. Sure, you can run these all day at 80C... but can't get 'em much hotter I suspect. Always best to keep things cool.


----------



## axiumone

Well. I just had a chance to confirm. On my set up with 5x1 eyefinity and crossfire 295's, the drivers are utterly broken.

Single display works perfectly smooth. However, once you enable eyefinity the monitors exhibit some horrible syncing issues in games.

Imagine the screen tearing without vsync, but simultaneously across all 5 screens, and it looks like part of the frame takes noticeably longer to refresh, making everything pretty much unplayable.

I thought for a moment there could be a connection issue, since 4 monitors are connected with DP and 1 is DVI-DL, but leaving only 3 monitors in eyefinity with DP connections doesn't solve the problem.

Single 295 in eyefinity 5x1 works fine and one display with 295 crossfire works fine. I wonder what the issue could be.


----------



## DeadlyDNA

Quote:


> Originally Posted by *axiumone*
> 
> Well. I just had a chance to confirm. On my set up with 5x1 eyefinity and crossfire 295's, the drivers are utterly broken.
> 
> Single display works perfectly smooth. However, once you enable eyefinity the monitors exhibit some horrible syncing issues in games.
> 
> Imagine the screen tearing without vsync, but simultaneously across all 5 screens and it looks like a part of the frame takes much noticeably longer to refresh, making everything pretty much unplayable.
> 
> I though for a moment there could be a connection issue. Since 4 monitors are connected with DP and 1 is DVI DL, but leaving only 3 monitors in eyefinity with DP connections doesn't solve the problem.
> 
> Single 295 in eyefinity 5x1 works fine and one display with 295 crossfire works fine. I wonder what the issue could be.


If you're referring to 14.4whql, yes, I had nothing but trouble with them. I think most people using a single display are fine, just us eyefinity and crossfire folks, and yeah, we're the small bunch, so who knows what's next.


----------



## levism99


Quote:


> Originally Posted by *DeadlyDNA*
> 
> if your referring to 14.4whql yes i had nothing but trouble with them. I think most people using a single display are fine, just us eyefinity folks and crossfire, and yeah were the small bunch so who knows whats next.


Yeah, but we should matter to them, I hope. These drivers are so bad.


----------



## lowgun

Quote:


> Originally Posted by *axiumone*
> 
> Quick cellphone pic of how it all turned out.
> 
> 
> 
> Now the wait begins for a driver version that actually lets me use both of these cards in games with eyefinity. As of right now 14.4 whql has major issues that can only be overcome by disabling the bottom card. Benchmarks work great though... go figure.
> 
> Edit - also on a little side note. Pretty disappointed with the fan filters in the corsair 450D, they hardly stop any dust getting through.


Must. Not. Buy. Second 295x2......

hrunnggggggg


----------



## Skinnered

Wow axiumone, talk about a nicely organised case







A beautiful powerhouse.

One question: are there any guys with two 295X2's on a Z87 chipset with the two PCIe 3.0 slots running x8/x8? I'm also planning on two 295x2's soon and wonder if there's enough bandwidth. I know the two R9 290X cores communicate via a PLX(?) chip on board the PCB, but still...

I also thought about trifire with a R9 290X, but are the differences in bios not causing any latency issues?


----------



## Roikyou

My understanding is the new 97 chipset motherboards have increased bandwidth, with multiple lanes at PCIe 3.0, plus the new SATA Express and M.2 support at 10gbs. Makes me think about a new motherboard and second card...


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roikyou*
> 
> Understanding the new 97 chipset motherboards have increased the bandwidth with multiple lanes at PCIe 3.0. Plus the new sata express and m.2 support at 10gbs. Makes me think about new motherboard and second card...


Does the 295x2 even need more pcie 3.0 bandwidth?


----------



## Roikyou

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Does the 295x2 even need more pcie 3.0 bandwidth?


My thought behind this was that on the 87 chipset it's 16x and 8x. New boards are supposed to support 16x/16x; that's where I was going with that. Will it make a great difference? Doubt it, but you never know.
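Whether x8 vs x16 matters comes down to link throughput; a rough calc from the standard encoding overheads (nothing card-specific assumed here):

```python
# Approximate PCIe throughput per link after encoding overhead.
# gen2: 5 GT/s with 8b/10b encoding; gen3: 8 GT/s with 128b/130b.
def pcie_gb_s(lanes, gen):
    bits_per_lane = {2: 5e9 * 8 / 10, 3: 8e9 * 128 / 130}[gen]
    return lanes * bits_per_lane / 8 / 1e9  # bits/s -> GB/s

print(round(pcie_gb_s(8, 3), 1), round(pcie_gb_s(16, 3), 1))  # 7.9 15.8
```

So even gen3 x8 gives roughly 8 GB/s per card, which is why the x8/x8 vs x16/x16 difference rarely shows up in games.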


----------



## WaveRider69

Does the OS treat this card as crossfire or a single card?

Meaning since it's mostly only games that benefit from crossfire, would having this card benefit 3D apps that don't utilize crossfire?


----------



## WaveRider69

Quote:


> Originally Posted by *Skinnered*
> 
> Wow axiumone , talk about a nice organised case
> 
> 
> 
> 
> 
> 
> 
> A beautiful powerhouse.
> 
> One question: are there any guys running two 295X2s on a Z87 chipset with two PCIe 3.0 slots at x8? I'm also planning on two 295X2s soon and wonder if there's enough bandwidth. I know the two 290X cores communicate via a PLX chip on the PCB, but still...
> 
> I also thought about tri-fire with an R9 290X, but won't the differences in BIOS cause latency issues?


Depends how windows reads the card. You'll need 12.6 drivers or earlier for Windows 7 to see 5 cards and Win8 to see 6 cards. And the older drivers take a performance hit with these new cards. On Linux you'd be good to go. But that doesn't mean much for a gaming rig.


----------



## Roikyou

Quote:


> Originally Posted by *WaveRider69*
> 
> Does the OS treat this card as crossfire or a single card?
> 
> Meaning since it's mostly only games that benefit from crossfire, would having this card benefit 3D apps that don't utilize crossfire?


New to the AMD side of the world but it allows me to enable crossfire... Not sure what else you have to do.


----------



## WaveRider69

^ Not what I was asking at all. Thanks though.


----------



## axiumone

Quote:


> Originally Posted by *WaveRider69*
> 
> Does the OS treat this card as crossfire or a single card?
> 
> Meaning since it's mostly only games that benefit from crossfire, would having this card benefit 3D apps that don't utilize crossfire?


Windows sees it as two separate cards.

Not sure I can answer the second part.


----------



## Jpmboy

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Does the 295x2 even need more pcie 3.0 bandwidth?


possibly with two or more cards. the CFX traffic is routed to the PCI bus - no bridge.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Windows sees it as two separate cards.
> 
> Not sure I can answer the second part.


no it would not provide a benefit if the software can't use crossfire. think of it as 2x290x


----------



## Roikyou

Device manager shows as two AMD Radeon R9 200 Series display adapters. Checked the drivers 14.1 and 14.4 Catalyst Version.


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> Device manager shows as two AMD Radeon R9 200 Series display adapters. Checked the drivers 14.1 and 14.4 Catalyst Version.


Device manager had better show two GPUs! You paid for two...


----------



## lowgun

It was darn tempting, but I did not buy a second 295x2. I just bought myself a 290x to pair with it, and I'm trading one of my old 780s for a 4930k to replace my 4820k. 50% boosts all around!


----------



## Elmy

I am loving this 295X2 .... butter smooth @ 144Hz on my 5X1


----------



## Roikyou

Quote:


> Originally Posted by *Jpmboy*
> 
> Device manager better show two gpus! you paid for two...


Goes back to the question "how does the OS see this card". Don't think the OS would see it as crossfire as that's software level.


----------



## fireedo

count me in

Well, I decided to change sides and go with this beast.

Overall I'm satisfied, but I just found a problem with 3DMark 11 and 2013: I can't get my result (error code 15). Otherwise, it's great!


----------



## rakesh27

Guys,

I've joined the bandwagon: I'm a proud owner of a PowerColor 295X2. What a beast! I was going to get the Sapphire model but changed my mind, as I've had a bad experience with their cards...

I thought my Enermax Galaxy 1000W PSU wouldn't be able to handle this card, but the PSU did me proud. The only thing now is I need to get rid of my 3-slot VTX3D 7990 (another awesome card).

Any takers....


----------



## NavDigitalStorm

Quote:


> Originally Posted by *fireedo*
> 
> count me in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well I decide to change side, going to this beast
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well overall I feel satisfied but just found a problem with 3DMark2011 and 2013, I cant get my result error code : 15, else it greaaat


Accepted!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> Ive joined the band wagon, im a proud owner of a PowerColor 295x2, what beast. I was gonna get the Sapphire model, changed my mind as ive had a bad experience with there cards...
> 
> I thought my Enermax Galaxy 1000Watt PSU wouldnt be able to handle this card, the PSU done me proud, only thing now is need to get rid of my 3slot VTX3D 7990 (another awesome card).
> 
> Any takers....


Can you post a picture to verify and then tag me?


----------



## Jpmboy

Quote:


> Originally Posted by *fireedo*
> 
> count me in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well I decide to change side, going to this beast
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well overall I feel satisfied but just found a problem with 3DMark2011 and 2013, I cant get my result error code : 15, else it greaaat


It seems to be a Futuremark problem? I have the same error code for 3DMark 11, but Fire Strike is okay...


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> Goes back to the question "how does the OS see this card". Don't think the OS would see it as crossfire as that's software level.


CCC "sees" a 295X2 as CrossFire (you just can't disable it). The on-board PLX chip manages GPU-to-GPU traffic. You can, however, set various CFX AFR profiles in CCC.
I don't think the OS knows CFX from SLI; it's an AMD (or NV) driver-level thing running in the OS. Yes?


----------



## lowgun

Is Fire Strike/Fire Strike Extreme biased toward Nvidia? I'm getting a lower score with my 295X2 than with my dual 780s, and I know the 295X2 is more powerful...

Also, no biggie, but @NavDigitalStorm, my 295X2 is a Sapphire; it's listed as MSI in the first post.


----------



## Majin SSJ Eric

This card would be perfect for an air cooled build. Well, air cooled as in this paired with an H100 or something. I could definitely see something like this in my folding rig but I can't spend that kind of money on it right now...


----------



## shadow85

I gotta stop reading this thread, tempting me to get one of these sooner than later after seeing all these juicy pics.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *shadow85*
> 
> I gotta stop reading this thread, tempting me to get one of these sooner than later after seeing all these juicy pics.


Getting 95+ FPS on BF4 at 1440p maxed out is great!


----------



## Someone09

Quote:


> Originally Posted by *shadow85*
> 
> I gotta stop reading this thread, tempting me to get one of these sooner than later after seeing all these juicy pics.


I know the feeling.
Luckily they are still too expensive for me. But I am really scared about the time the first 295x2 will be available for 1000-1100...
Not sure I will be able to hold back.


----------



## King4x4

Just put two of my 290xs for sale.... Might be joining this club soonish... need them DP ports.


----------



## Yuri_RP

Hello, my MSI R9 295X2 arrived last weekend; I haven't had the chance to upload it until today. Here it is.


Spoiler: Pictures


Unboxing
https://imageshack.com/i/nbqgcrj

Installed (rig not yet completed)
https://imageshack.com/i/nfarozj

The Phoebus finally got some room; my twin PCS+ used to blow hot air on it and touch the backplate.
https://imageshack.com/i/najv46j

Gonna pair it with these. Still waiting for the active adapter, though. Hopefully there will be new drivers to make it work fine.
(Mind the tag, it's from a local forum; too lazy to reshoot.)
https://imageshack.com/i/nhgri6j

Will be selling these boys; anyone want them?
https://imageshack.com/i/n86sszj


Hope to be allowed to join in.


----------



## Yuri_RP

Quote:


> Originally Posted by *fireedo*
> 
> count me in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well I decide to change side, going to this beast
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well overall I feel satisfied but just found a problem with 3DMark2011 and 2013, I cant get my result error code : 15, else it greaaat


Fellow Indonesian..!

Only a few of these are available in Indonesia right now. You must have bought from an MSI Early Bird seller, right?


----------



## 295x2

hey there!

I am using the 295X2 to do calculations. No, I'm not doing mining at all (it's dead on GPU anyway).
I have 3x Sapphire 290X Tri-X and one Sapphire 295X2.

However, I am experiencing overheating with the 295X2 when all the cards work together.
I am desperately trying to get the Sapphire 295X2 OC BIOS, as I hear its fans spin faster, and the clocks are possibly adjusted so I don't need +50 on PowerTune to hold full speed during the tasks I send to the GPU.

Is there anyone in the owners club who has one? If so, could you share a BIOS dump with me?
This is for work; I don't even play games on the computer.

Thanks for any help


----------



## fireedo

Quote:


> Originally Posted by *Yuri_RP*
> 
> Fellow Indonesian..!
> 
> 
> 
> 
> 
> 
> 
> 
> Only few of this available in Indonesia currently. You must be buying from an MSI Early Bird seller right?


Hi,
Yes, maybe; I really don't know about that, but when I bought this it was the only R9 295X2 in Surabaya. I was actually looking for a PowerColor R9 295X2.

Anyway, greetings to a fellow Indonesian, bro!


----------



## ANGELPUNISH3R

Question for those who have one: I'm wondering how loud the fan on the card that cools the VRMs is. Would it be silent at idle? Right now I have a custom loop and all the fans are at 5 volts, so my PC is inaudible from more than a meter away, and even if you're within one meter it's really hard to hear.

I want to buy one of these and put an EK block on it, but as they aren't out yet I would have to use the reference cooler for now, and to my understanding you can't change the fan profiles. I don't care about the fan on the radiator; I can swap that for a better, quieter fan and make it silent. But if I can't change the speed of the fan on the cooler itself and it's noisy, I might just wait to get it until EK releases their blocks.


----------



## NavDigitalStorm

The fan that cools the VRM is pretty darn silent I'd say.


----------



## rakesh27

Heres my PowerColor 295x2


----------



## NavDigitalStorm

Quote:


> Originally Posted by *rakesh27*
> 
> Heres my PowerColor 295x2
> 
> 
> 
> Spoiler: Warning: Spoiler!


Accepted


----------



## ANGELPUNISH3R

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> The fan that cools the VRM is pretty darn silent I'd say.


Thanks mate.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ANGELPUNISH3R*
> 
> Thanks mate.


If anything, it's the fan on the radiator that gets pretty audible during gaming. I put two Corsair SP120s on there and haven't looked back.


----------



## Jpmboy

Quote:


> Originally Posted by *295x2*
> 
> hey there!
> 
> I am using the 295x2 to do calculations. No, i am not doing mining at all. (It's dead on GPU anyway)
> I have 3 * Sapphire 290x Tri X and one Sapphire 295x2.
> 
> However, i am experimenting overheating with the 295x2 when all the cards are working together.
> I am desperately trying to get the Sapphire 295x2 OC BIOS, as i heard the Fans are turning faster. And possibly clock adjusted, so i don't need to use +50 on powertune
> in order to have full speed during the tasks i am sending to the GPU.
> 
> Is there anyone in the owner club owning one ? If so, could they share a BIOS dump with me ?
> This is for work, i don't even play with the computer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for any help


I second this... anyone with a Sapphire OC version? I'd love to try that BIOS. Only the stock-clock BIOSes are available: http://www.techpowerup.com/vgabios/index.php?did=1002-67b9--


----------



## alamox

Quote:


> Originally Posted by *King4x4*
> 
> Just put two of my 290xs for sale.... Might be joining this club soonish... need them DP ports.


If I were you, I'd hold on to one R9 290X and CrossFire it with the R9 295X2; you get the best performance value out of a tri-fire setup.


----------



## DeadlyDNA

Quote:


> Originally Posted by *alamox*
> 
> if i were you i would hold on to 1 r9 290x and crossfire it with the r9 295x2, you get the best performance value out of it in a tri-fire setup


Also, 290Xs and 290s are cheap right now because of the miner exodus.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Yuri_RP*
> 
> Hello, my MSI R9 295X2 arrived last weekend, haven't had the chance to upload it until today. Here it is.
> 
> 
> Spoiler: Pictures
> 
> 
> 
> Unboxing
> 
> 
> Installed (Rig not yet completed)
> 
> 
> The Phoebus finally got some room, my twin PCS+ blows hot air on it and it touches the backplate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna pair it with these. Still waiting for the Active Adapter though. Hopefully there will be new drivers to make it work fine.
> (Mind the tag, it's from local forum, to lazy to reshoot
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> Will be selling these boys, anyone wants it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope to be allowed to join in.


Accepted!


----------



## Yuri_RP

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Accepted!


Thank you..!








Hope to learn much from here.


----------



## NavDigitalStorm

Do you guys think I should start a roster for clock speeds ?


----------



## Jpmboy

Quote:


> Originally Posted by *DeadlyDNA*
> 
> also 290x's and 290s are cheap right now because of miner exodus


abused graphics cards - PETGC


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> abused graphics cards - PETGC


I turned off my mining rigs MONTHS ago. Sorry poor 290Xs


----------



## DeadlyDNA

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I turned off my mining rigs MONTHS ago. Sorry poor 290Xs


I'm contemplating getting 290Xs instead of my 290s; is it worth it to sell the 290s? I see 290Xs on eBay going for less than my 290s cost at retail...


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I turned off my mining rigs MONTHS ago. Sorry poor 290Xs


Smart. GPU mining is dead. Has anyone, when you actually "net out" mining, _*really*_ turned a profit, excluding the valuation of the coin itself? Every time I ask this question, it's like asking a "problem" gambler.
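"Netting out" is just arithmetic once the costs are tracked. A minimal sketch with hypothetical numbers (the function and every figure below are illustrative, not anyone's actual books):

```python
def mining_net_profit(coins_mined: float, sale_price_usd: float,
                      hardware_cost_usd: float, kwh_used: float,
                      usd_per_kwh: float) -> float:
    """Net USD profit: coin revenue at sale price, minus hardware and power."""
    revenue = coins_mined * sale_price_usd
    electricity = kwh_used * usd_per_kwh
    return revenue - hardware_cost_usd - electricity

# e.g. 10 coins sold at $300 each, $1000 of hardware, 2000 kWh at $0.25/kWh:
# mining_net_profit(10, 300, 1000, 2000, 0.25) -> 1500.0
```

The point of the exercise: if you never subtract the hardware and the kWh, you're quoting revenue, not profit.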


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jpmboy*
> 
> smart. gpu mining is dead. Has anyone... when you actually "net out" mining... *really* turned a profit excluding valuation of the coin? Every time i ask this question, it's like asking a "problem" gambler.


Profited about $1500. Not bad for free money, I suppose.


----------



## Cool Mike

Sounds good. I would like to see both the GPU and memory speeds people are squeezing out. Maybe a maxed-out Heaven/Valley score would show reasonable stability to prove a solid overclock.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Sounds good. I would like to see both gpu and memory speeds people are squeezing out. Maybe a maxed heaven valley score would show reasonable stability to prove a solid overclock.


Done. Can everyone post their speeds they have the card currently running?


----------



## ANGELPUNISH3R

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> If anything it's the fan on the radiator that gets pretty audible during gaming. I put two Corsair SP 120s on there and haven't looks back


Yep, that's my plan. From what I see in reviews it runs at about 60 degrees, give or take, under load with the one stock fan. I will run it push/pull with two SP120s at 5 volts until EK releases a water block and I can add it to my loop. Will be picking mine up tomorrow.

Thanks again mate.


----------



## 295x2

Quote:


> Originally Posted by *Jpmboy*
> 
> I second this... anyone with a Sapphire OC version? I'd love to try that bios. Only the stock clocks bios' are available: http://www.techpowerup.com/vgabios/index.php?did=1002-67b9--


I think no one bothered buying that card :-(


----------



## 295x2

What temperature does your card sit at during gaming?
I am experimenting here, and in some calculations my card stays at 73-74°C.

On a 290X that's pretty safe; however, I wonder about the 295X2, as it's a single PCB...


----------



## NavDigitalStorm

Quote:


> Originally Posted by *295x2*
> 
> What's the temperature your card sits at during gaming ?
> I am experimenting here, and in some calculs my card stays at 73/74C.
> 
> on 290x it is pretty safe, however, i wonder about 295x, as it is a single PCB..


GPUs around 63C Max


----------



## Skinnered

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> GPUs around 63C Max


That's a very good temperature.

Do you have some Skyrim ENB (scaling) figures in 4K? I'm planning to get two 295X2s tomorrow and I wonder what to expect in this game when modded into "oblivion". I know DeadlyDNA is busy with it, but I'm dying to know.


----------



## Roikyou

Quote:


> Originally Posted by *295x2*
> 
> What's the temperature your card sits at during gaming ?
> I am experimenting here, and in some calculs my card stays at 73/74C.
> 
> on 290x it is pretty safe, however, i wonder about 295x, as it is a single PCB..


Under water, at the GPU diode per HWiNFO: 50°C under heavy load, around 30-ish idle. I know Jpmboy was seeing high external temps at the VRMs; I have yet to check mine.


----------



## 295x2

Your numbers seem pretty low.
In some tests I saw 68°C. Maybe the card isn't running at 1018 MHz and that's why.
Did you try enabling OverDrive in Catalyst and setting the power target to +50%? This forces the card to run full speed under load instead of the 950-980 MHz range.


----------



## Roikyou

Quote:


> Originally Posted by *295x2*
> 
> Your numbers seem pretty low.
> On some tests, i saw 68C . Maybe the card isn't going at 1018 Mhz and that's why.
> Did you try to enable over drive in Catalyst and put power target +50 ? This will force the card to go full speed under load. not 950/980 ish MHZ


I usually leave HWiNFO running in the background; it gives me min, max, and average. I'm confident it hit full load, but I'll take another look. I wouldn't expect high temps with the water I'm pushing through it either: two Monsta 480s and one UT 360 with two MCP35Xs in series. The 4770K is at stock speed right now too.

This is my first AMD card in a long time, so there's a good chance OverDrive is not enabled, but I'll check it out this evening.

Why is it that when I search for OverDrive, the results reference CPUs, not GPUs?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *295x2*
> 
> Your numbers seem pretty low.
> On some tests, i saw 68C . Maybe the card isn't going at 1018 Mhz and that's why.
> Did you try to enable over drive in Catalyst and put power target +50 ? This will force the card to go full speed under load. not 950/980 ish MHZ


I have push/pull on my setup.


----------



## levism99

Running at 1048; temps at 55°C to 62°C. I keep my room nice and cold.


----------



## Roikyou

Eight fans pulling on the two 480s and three fans pushing on the 360 for me. I just wasn't ready for the 22 fans push/pull would need, plus all the harnesses, but who knows, I might do it. Right now, with the curve I have set up through the Aquaero 6, it's silent and cool.


----------



## 295x2

Ah, OK, I was talking about stock cooling.

I can't compare with games, though; I don't play any.
My room is pretty hot with the 5 GPUs.

However, for what it's worth, just unplugging the stock fan from the radiator and plugging in another fan (to, say, the motherboard or a Molex adapter) removed all the throttling issues I had around 68-69°C.

Now it can go up to 74°C, and when the card is hot it will drop to 950-980 MHz even with the power target set to +50. According to Sapphire tech support, this temperature is not a problem for the card.

Maybe fitting a new waterblock helps quite a lot with the card's power behavior. There's no valid reason why, with just a fan, it should drop to 300 MHz every few seconds when hot.


----------



## Roikyou

I've seen Nvidia cards in the past throttle depending on heat and fan speed, but with water cooling the temps are under control: no throttling and no dealing with fan speeds. My water temp doesn't rise above 30°C under load.


----------



## 295x2

And here I experience more weirdness.
The same fan, moved to pull air from outside and push it through the radiator, seems to do a great job cooling the GPU, yet the card again throttles back to 300 MHz every few seconds around 69°C.

There is something I don't understand, and I am not sure what is causing this throttling. I mean, the GPU isn't even at full speed.
I am officially puzzled.
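One possible explanation for "cooler but still throttling": PowerTune caps the clock on whichever limit trips first, board power or temperature, and a heavy compute load can blow through the power limit even while the GPU runs cool. Here's a toy model of that idea; every threshold and clock step in it is invented for illustration, this is not AMD's actual firmware logic:

```python
def powertune_clock(temp_c: float, power_w: float,
                    base_mhz: int = 1018,
                    temp_limit_c: float = 74.0,
                    power_limit_w: float = 500.0) -> int:
    """Toy model: the effective clock is capped by whichever limit trips.
    All thresholds here are invented for illustration only."""
    clock = base_mhz
    if temp_c >= temp_limit_c:
        clock = min(clock, 950)   # soft thermal throttle step
    if power_w >= power_limit_w:
        clock = min(clock, 300)   # hard power-limit drop
    return clock

# A cool card under a heavy compute load can still hit the power limit:
# powertune_clock(55, 520) -> 300, while powertune_clock(74, 450) -> 950
```

If something like this is going on, better cooling alone wouldn't fix the 300 MHz drops on compute workloads; only raising the power target (or lowering the load's power draw) would.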


----------



## Farmer Boe

Quote:


> Originally Posted by *Skinnered*
> 
> That's a very great temperature
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you have some Skyrim enb (scale)figures in 4K? I'm planning to get two 295x2's tomorrow and I wonder what to expect in this game when modded into "oblivion". I know DeadlyDNA is busy with it, but I'm dying to know


Skyrim doesn't play well with Crossfire enabled. I'd be cautious going forward if you plan on playing a heavily modded Skyrim.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Profitted about $1500. Not bad for free money I suppose.


impressive! $1500 after equipment and power cost is pretty good.


----------



## Cool Mike

My HIS is at 1078 core / 1650 memory. 4K Valley maxed, stable.


----------



## Jpmboy

1080/1650: everyday clocks. BF4 and Crysis stable. Valley run:

jpmboy --- [email protected] R9 295X2 1080/1650 (watercooled)


----------



## NavDigitalStorm

Those are really high mem clocks


----------



## lowgun

I run 1080 core / 1625 memory on both my 295X2 and 290X.

Btw @NavDigitalStorm, I just realized I didn't mention you properly last time. I have a Sapphire 295X2, not an MSI, if you could change it when you get a chance.


----------



## DeadlyDNA

Quote:


> Originally Posted by *lowgun*
> 
> I run 1080 core 1625 memory on both my 295X2 and 290X
> 
> Btw @NavDigitalStorm, I just realized I didn't mention you properly last time. I have a Sapphire 295X2, not a MSI if you could change it when you get a chance


1080 on da core


----------



## 295x2

Want some more weirdness?

If the card runs and stays stable at 74°C, and I use a house fan to blow air into the case, the temp of course goes down, but then it throttles down to 300 MHz.

I tested 3DMark and so on, and the card never gets past 55°C on the hardest stuff, otherwise 50°C.
So it seems this overheating problem only shows up on very intensive compute tasks, which are different from normal gaming.
So far I am very disappointed by the 295X2. For computing, that is.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Those are really high mem clocks


You would think AMD would want to get an unlocked BIOS out to "enthusiasts"... win on Sunday, sell on Monday, you know.


----------



## lowgun

Quote:


> Originally Posted by *Jpmboy*
> 
> You would think AMD would want to get an unlocked bios out to "enthusiasts" ... win on Sunday, sell on Monday you know.


On one hand (and I definitely want this), it would seem obvious to give your customers some unlocking, since the majority of people buying these are "enthusiasts"; on the other hand, I think they may be worried we'll melt the Asetek pumps and hoses if we push them too hard, lol.


----------



## hotrod717

Quote:


> Originally Posted by *Jpmboy*
> 
> You would think AMD would want to get an unlocked bios out to "enthusiasts" ... win on Sunday, sell on Monday you know.


Have you checked compatibility with RubyTool, or manually "coding" AB? It's all about the voltage controller.


----------



## NavDigitalStorm

The PCMag review is out! http://www.pcmag.com/slideshow_viewer/0,3253,l=323653&a=323472&po=1,00.asp


----------



## bencher

Good luck selling $7000 machines.


----------



## Elmy

Quote:


> Originally Posted by *bencher*
> 
> Good luck selling $7000 machines.


Actually, there is no luck in selling $7,000 machines.... Just because you can't afford one, or don't know how to build one cheaper, doesn't mean other people don't know how, or don't have the money to fork over for one of these pre-built machines.

Next time think before you speak....


----------



## Jpmboy

Quote:


> Originally Posted by *lowgun*
> 
> On one hand (and I definitely want this) it would seem obvious to give your customers some unlockedness since majority of people buying these are going to be "enthusiasts", but on the other hand I think they may be worried we will melt the Asetek pumps and hoses if we push them too much, lol.


Nah, they won't melt before other components smoke. I think it's a given that once you unlock the voltage on this card (or any other, for that matter), the stock cooling has to go.


----------



## Jpmboy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> The PCMag review is out! http://www.pcmag.com/slideshow_viewer/0,3253,l=323653&a=323472&po=1,00.asp
> 
> 
> Spoiler: Warning: Spoiler!


Nice job Nav!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *bencher*
> 
> Good luck selling $7000 machines.


That is a top-of-the-line machine; our computers start at $799. If you think no one buys $7,000 machines....

http://instagr.am/p/moF0HUPBDV%2F/


----------



## bencher

Quote:


> Originally Posted by *Elmy*
> 
> Actually there is no luck in selling 7000 machines.... *Just because you can't afford it* or know how to build one cheaper doesn't mean that other people don't know how or have the money to fork over for one of these pre-built machines.
> 
> Next time think before you speak....


LOL!


----------



## axiumone

Quote:


> Originally Posted by *bencher*
> 
> LOL!


I apologize if I seem rude, but I don't think you're on the same page as the rest of this thread. This thread is about 295x2 cards, which are inherently excessive.


----------



## bencher

Quote:


> Originally Posted by *axiumone*
> 
> I apologize if I seem rude, but I don't think you're on the same page as the rest of this thread. This thread is about 295x2 cards, which are inherently excessive.


Maybe you should think before you post...

I was thinking about the economy when I made my post ok?

Thanks


----------



## axiumone

Irrespective of what some may think about the current state of the economy (judging by some of the folks' rigs right here on OCN, it's doing quite well), I think manufacturers consider the prices of their rigs fairly carefully and know how to pick their target audience. It's a business, after all. Thanks.


----------



## ViRuS2k

Damn, this card is fast. Can I jump in? LOL

Just got this card 4 days ago.

Tested the memory overclock at 6000 MHz and she is flying; ran a few Fire Strikes at that memory speed and had no issues at all.
I haven't overclocked the GPU, though, as I'm waiting on the Afterburner update. As for the memory, I haven't gone higher, but why the need? lol. As for my system, I know it's a bit messy, but it's fast.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ViRuS2k*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> dam this card is fast, can i jump in LOL
> 
> 
> 
> 
> 
> 
> 
> just got this card 4 days ago
> 
> 
> 
> 
> 
> 
> 
> 
> tested the memory overclock @6000mhz and she is a blowing ran a few firestrikes @ that memory speed and no issues at all.
> though i havent overclocked gpu as im waiting on afterburner update as for the memory havent gone higher but why the need ? lol as for my system i know its a bit messy but its fast


 Picture doesn't work


----------



## Elmy

Quote:


> Originally Posted by *bencher*
> 
> Maybe you should think before you post...
> 
> I was thinking about the economy when I made my post ok?
> 
> Thanks


I don't know where you live, but the economy where I live is doing crazy good.

So now you're assuming?

You know what happens when you assume, right?


----------



## ViRuS2k

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Picture doesn't work


why does it not work ??? haha but why hehe..


----------



## NavDigitalStorm

Can you try imgur and link it? OCN is saying its broken.


----------



## ViRuS2k

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Can you try imgur and link it? OCN is saying its broken.


Strange, because I can see the image and it shows up on the forum correctly.


----------



## axiumone

For what it's worth, the image doesn't work for me either.


----------



## bencher

Quote:


> Originally Posted by *Elmy*
> 
> I don't know where you live but the economy where I live is going crazy good.
> 
> So now your assuming?
> 
> You know what happens when you assume right?


What exactly did I assume?


----------



## ViRuS2k

ok here ya go reuploaded to imgur


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ViRuS2k*
> 
> ok here ya go reuploaded to imgur
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Works! I'll add you to the roster.


----------



## NavDigitalStorm

Also, what brand and clock speeds?


----------



## ViRuS2k

Thanks!

The card I have is this one; took this picture also, lol.

MSI version, 3-year warranty FTW.

Clock speeds for now: 1018/1600.


----------



## NavDigitalStorm

Stock speeds?


----------



## ViRuS2k

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Stock speeds?


See the post above, mate: stock GPU clock, 1600 memory, for now, until Afterburner gets updated...


----------



## bencher

Quote:


> Originally Posted by *ViRuS2k*
> 
> ok here ya go reuploaded to imgur


That cooler is so close to the card


----------



## Cool Mike

Both beautiful and powerful machines. Congratulations on a job well done!


----------



## Cool Mike

Not sure if you guys are having the same minor problem with CCC? When adjusting the core percentage (which I hate; AMD, please go back to real core frequencies), the applied percentage won't stick. Example: apply 6.2% and the number goes to 6.1%. There doesn't seem to be enough precision in the adjustments. I was having a hard time getting 1080 core to stick after a reboot. I ended up using Asus GPU Tweak, and *now I am at 1080 core after reboots*.
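For what it's worth, the 6.2% → 6.1% behavior looks like simple quantization: if the UI only stores the overclock in 0.1% steps and the driver truncates the resulting clock to whole MHz, some values can never "stick" exactly. This is a toy sketch of that rounding, not AMD's actual code; the step size and truncation rule are assumptions:

```python
BASE_MHZ = 1018  # R9 295X2 reference boost clock

def overdrive_clock(requested_pct: float) -> int:
    """Snap the requested overclock percent to an assumed 0.1% UI step,
    then derive the resulting core clock in whole MHz (toy model)."""
    snapped = round(requested_pct * 10) / 10      # 0.1% granularity
    return int(BASE_MHZ * (1 + snapped / 100))    # truncate to whole MHz

# Hitting exactly 1080 MHz needs 6.0904%, which a 0.1%-step UI can't
# represent: 6.1% lands on 1080 MHz, 6.2% lands on 1081 MHz.
```

Under this model, a percent-based slider can only reach certain clocks, which is why tools that set the frequency directly (like GPU Tweak) sidestep the problem.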


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Not sure you guys are having the same minor problem with CCC? When adjusting the core percentage (which I hate, AMD please go back to real core frequencies) the applied percentage will not stick. Example: Apply 6.2% the number will go to 6.1%. Doesn't seem to have enough precision in the adjustments. I was having a hard time getting 1080 Core to stick after a reboot. I ended up using Asus GPU tweak and *Now I am at 1080 core after reboots*


How much of a boost has that OC given?


----------



## Cool Mike

Not much at all. Something that was just pestering me. 1080 looks better than 1078.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Not much at all. Something that was just pestering me. 1080 looks better than 1078.


Gotcha.

I hate this California heat. For the first time my card broke 70°C.


----------



## fireedo

Seems like my 4770K @ 4.2 GHz is holding me back... so guys, is it worth switching to X79 and a 4930K now?


----------



## Roaches

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Gotcha.
> 
> I hate this California heat. For the first time my card broke 70C.


I know, right! It was so hot that sleeping has been troublesome throughout the week.

At least my 680s didn't break a sweat, hovering around the mid-40s Celsius on PCSX2 emulation gaming...

It's 80 degrees Fahrenheit in my room right now and my cards are resting at 36 degrees Celsius.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *fireedo*
> 
> seems like my 4770K @ 4,2 Ghz hold me down....so guys is it worth for now if I switch to x79 and 4930K?


I'd wait for X99.


----------



## fireedo

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I'd wait for X99.


I know i should wait, grrrr waiting is killing me


----------



## NavDigitalStorm

Quote:


> Originally Posted by *fireedo*
> 
> I know i should wait, grrrr waiting is killing me


Then again, DDR4 prices will be hell. If you can get a cheap X79 setup, I'd do it.


----------



## ANGELPUNISH3R

Here's mine. Sorry for the phone pics and bad photography.

Was originally going to put it into this rig


I have decided to downsize and am selling my Phantom 820. Love the new PC; it takes up about a third of the space and is more powerful for gaming anyway. My Phantom does have a 3930K, but I have to say, with the 4770K I've put in the new rig I haven't really noticed a difference, even for things like rendering.




Haven't had time to play around with overclocking or anything yet, since I had to install Windows and move all my stuff across. But I have run a few quick benchmarks and played a few games to see how it goes, and there is a night-and-day difference in performance between this and my GTX 690.

I will be water cooling the new PC; just going to wait until EK releases their waterblocks. So for right now I've just thrown it all in there, because I will have to take everything out when I water cool it. I've put the rad for the 295X2 in push/pull with SP120 Quiets.

Probably one of the most powerful ITXs in the world right now.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ANGELPUNISH3R*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> Heres mine. Sorry for phone pics and bad photography.
> 
> Was originally going to put it into this rig
> 
> 
> I have decided to down size. Selling my phantom 820. Love the new pc takes up about a third of the space and is more powerful for gaming anyway. My phantom does have a 3930k but i have to say with 4770k i've put in the phenom i havnt really noticed a difference. Even for things like rendering.
> 
> 
> 
> 
> 
> 
> Have yet to have time to play around with overclocking or anything since had to install windows and stuff and move all my stuff across. But have run a few quick benchmarks and played a few games to see how it goes and there is a night and day difference in performance between this and my GTX 690
> 
> I will be water cooling the new pc just going to wait until EK releases their waterblocks. So for rite now i've just thrown it all in there because i will have to take everything out when i water cool it. I've put the rad for the 295x2 in push/pull with SP120 quiets.
> 
> Probably one of the most powerful ITXs in the world right now.


SICK! Accepted. What brand and clock speeds?


----------



## ANGELPUNISH3R

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> SICK! Accepted. What brand and clock speeds?


It's XFX; speeds are stock right now. So the GPU is 1018, and I believe the stock memory speed is 1250, correct me if I'm wrong.

I'll update you once I've overclocked it.


----------



## utnorris

Has anyone taken the single 120mm rad and swapped it for a dual or triple?


----------



## lowgun

Quote:


> Originally Posted by *utnorris*
> 
> Has anyone taken the single 120mm rad and swapped it for a dual or triple?


Do you mean like put a waterblock on it and add it to a custom waterloop? There is no way to change the radiator while continuing to use the stock AIO solution.


----------



## ViRuS2k

Quote:


> Originally Posted by *fireedo*
> 
> seems like my 4770K @ 4,2 Ghz hold me down....so guys is it worth for now if I switch to x79 and 4930K?


Then overclock it to 4.5 GHz.








Though surely that CPU, even at 4.2 GHz, should not be bottlenecking that card; if you're getting a bottleneck, it's from something else.


----------



## Yuri_RP

Hello,

Anyone using this with a QNIX QX2710 DL DVI-D?
I have some questions regarding Overclocking the monitor when using this Card.


----------



## Jpmboy

Quote:


> Originally Posted by *lowgun*
> 
> Do you mean like put a waterblock on it and add it to a custom waterloop? There is no way to change the radiator while continuing to use the stock AIO solution.


Well, you could cut the lines and re-plumb it into a rad/res. It's just easier to put a waterblock on it.


----------



## utnorris

Quote:


> Originally Posted by *lowgun*
> 
> Do you mean like put a waterblock on it and add it to a custom waterloop? There is no way to change the radiator while continuing to use the stock AIO solution.


Quote:


> Originally Posted by *Jpmboy*
> 
> well you could cut the lines and re-plumb it into a rad/res. just easier to put a waterblock on it.


Yes, I mean remove the 120mm rad and insert a dual or triple rad and a res (if necessary). Obviously the AIO pumps/blocks are more than fine for the GPUs; they just need more rad space. And since these only have 2 x 8-pin connectors, in my opinion you are not going to be able to push the cards to the limits where the VRMs would need the water cooling of a full-cover block. Just my opinion, but I am surprised that no one has done it, since so many people mod AIO CPU coolers by adding bigger rads and such.


----------



## 295x2

Quote:


> Originally Posted by *utnorris*
> 
> Yes, I mean remove the 120mm rad and insert a dual or triple rad and res (if necessary). Obviously the AIO pump/blocks are more than fine for the GPU's, just need more rad space and since these only have 2 x 8 pin connectors, in my opinion you are not going to be able to push the cards to their limits where the vrm's need the water cooling from a full cover block. Just my opinion, but I am surprised that no one has done it since so many mod the AIO cpu coolers adding bigger rads and such.


I want to do that.
So, if I were to use a bigger radiator, even a 140mm with a better fan, it would work if I just re-plumbed it, right?
The stock radiator seems to be very cheap and fragile. I am sure you can get much better-cooling radiators than this junk.


----------



## Roikyou

I didn't think the stock radiator/pump combo was that bad. It is kind of funny, in a way, to have a ~$100 cooling solution (comparable to something like the H100) on a card you pay $1500 for, but it was necessary; either a fan and heatsink weren't up to AMD's standards of cooling for this card, or they just wanted to bring it to another level with this all-in-one pump/radiator combo.

But if you want a simple solution, you can go with an H100 and this card: water cooled without all the extra work of separate blocks, pumps, etc.

I talked with XFX: they expect North American owners to fit custom water blocks, and they will honor the warranty if you do. (First thing I checked.)


----------



## 295x2

Quote:


> Originally Posted by *Roikyou*
> 
> I didn't think the stock radiator pump combo wasn't that bad. It is kind of funny in a way to have a 100 dollar cooling solution (comparing to something similar to the h100 or along that lines) for a card when you pay 1500 but necessary as the fan and block wasn't to they're (AMD) standards of cooling for this card or they just wanted to bring it to another level with this all in one pump/radiator combo.
> 
> But if you want a simple solution, you can go with the h100 and this card, water cooled without all the extra work of separate blocks, pumps, etc.
> 
> I talked with XFX, they expect North America to do custom water blocks and honor the warranty if you do. (first thing I checked)


The radiator fell out of my hand, from just 5 cm, against the case, and the metal fins were torn up like I'd hit it with a hammer. I bent them back, but that would not happen with the H80i I have on my CPU; its fins are strong, well spaced, and don't move at all.

1) I was thinking about an H100i. I would just not use the CPU part, and plug the water pipes from the card directly into the radiator, right?

2) Is it possible to unscrew the water pipes from the original radiator, or do they need to be cut to dismantle it? As you went fully custom, you probably know the answer.

Once I get the temps fixed, all my computations will be able to run at full speed, and I will love the card.


----------



## Roikyou

Quote:


> Originally Posted by *295x2*
> 
> The radiator fell of my hand from 5 cms against the case and the metal strips of the radiator were torn like i hit it with a "hammer"..I fixed it back but on h80i for cpu is, that would not happen. The strips are strong and don't move at all, well spaced.
> 
> 1) I was thinking about an h100i. I would just need not to use the CPU part, and plug the water pipes from card directly to the radiator, right ?
> 
> 2) is it possible to unscrew the water pipes from original radiator, or does it need to be cut to be dismantled? As you went total custom, you probably know the answer
> 
> Once i get the temp fixed, all my computation will be able to work full speed, and i will love the card.


Wish I could help. Custom blocks remove the whole pump/block/radiator assembly and replace it with one piece; we reuse the back plate, and that's it. You would need someone who has customized these before. Plus, for warranty, you'll need to put the stock cooler back on and send it in (if that ever happens).


----------



## 295x2

Quote:


> Originally Posted by *Roikyou*
> 
> Wish I could help, custom blocks remove all pump, block and radiator, replace it with one piece, we reuse the back plate, that's it. You would need someone who has customized these before. Plus, for warranty, you'll need to put it back on and send it in. (if that ever happens)


Yes, I figured that for complete custom blocks. However, I am wondering about just ditching the radiator and replacing it with one like you have in the H80. Is that also possible? Although less efficient than full custom blocks.
I never played with waterblocks, and I am a bit scared to do it :-(
I hope it would be enough to cool it down. Computation stresses the card much more than gaming, and I sometimes have it run calculations for 48 hours non-stop.


----------



## Roikyou

Quote:


> Originally Posted by *295x2*
> 
> Yes, i figured for a complete custom blocks. However, i am wondering about just ditching the radiator only and replace it with one like you have in the h80. That is also possible? Altho less efficient than full custom blocks.
> I never played with waterblocks, and i am a bit scared to do it :-(
> I hope it would be enough to cool down. Computation stresses the card much more than gaming. And i have it makes calculs for 48 hours non stop sometimes.


I was just referencing the H100 or H80 for the CPU if you wanted quick and easy water cooling. I actually had one of those for a short period of time; they're enclosed, no maintenance. I'm running two MCP35Xs with two 480s and a 360 with an EK CPU block, so my temps are very manageable, but not quick and easy. My thought is: if you change the 120 to a 140 or 240, how much would you honestly gain? Using the same pumps built into the card, I wouldn't think it would be too much, and you stand a chance of losing your warranty. So really two options: leave it as it is, which is best for warranty, or set up a custom loop; you'll drop temps quite a bit, allowing more headroom.


----------



## 295x2

Quote:


> Originally Posted by *Roikyou*
> 
> I was just referencing the h100 or h80 for the cpu if you wanted the quick and easy water cooling. I actually had one of those for a short period of time, they're enclosed, no maintenance. I'm running two mcp35x with two 480's and a 360 with a EK cpu block, so my temps are very manageable but not quick and easy. My thoughts if you change the 120 to a 140 or 240, how much would you gain honestly, using the same pumps built on the card, I wouldn't think it would be too much and you stand a chance of loosing your warranty. I would think really two options, leave as it is, best for warranty or set up a custom loop, you'll drop temps quite a bit, allowing more headroom.


Well, considering the temps are fine depending on the problem, and only the hardest ones make it hot, I would assume a really good 240 radiator with two fans would dissipate quite a bit more heat and let me reach OK levels. 65 °C max is perfect; I don't need 50 °C.

My i7-4960X is using an H80i and that is good enough for my use. I only use it for some rare problems where GPUs are not useful.

I am more interested in whether a 240 could technically be used on this stock card. I only need to drop a few degrees.

As for the warranty, I don't really care. It is for work. I could buy a replacement instead of bothering with a warranty.

Thanks for the nice discussion
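One rough way to frame the 120-vs-240 question: radiator dissipation scales approximately with core area (and airflow), so doubling the sections roughly doubles capacity at the same fan speed. The numbers below are illustrative assumptions, not measured values for the 295X2's AIO:

```python
# Back-of-the-envelope radiator sizing. The watts-per-section figure is an
# assumed placeholder; real values vary with fin density, thickness, delta-T,
# and fan speed.

def rad_capacity_watts(sections_120mm, watts_per_section=150):
    # Assumed dissipation for one 120mm section at a modest coolant-to-air delta.
    return sections_120mm * watts_per_section

card_heat = 450  # assumed sustained compute load for the card, in watts

for sections in (1, 2):
    cap = rad_capacity_watts(sections)
    print(f"{sections * 120}mm rad: ~{cap} W capacity vs ~{card_heat} W load")
```

Under these assumptions a single 120 is well short of the card's sustained compute load, which is consistent with the stock cooler running near its throttle point under compute but keeping up in most games.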


----------



## ViRuS2k

Can someone check this for me, if you have a 295X2 and Far Cry 3 installed?

Max out your graphics (4x or 8x AA) at a high resolution, then find a place on the map that pushes the cards to nearly 99% GPU usage. Then press the menu button, check out your map and guns etc., and tell me if any of you are experiencing a weird artifact. I am getting horizontal black-line flickering: it seems to be just one black line that flickers at different spots on the screen, but it's always the complete horizontal width. It does not happen all the time; it seems to happen when the GPUs are maxed out. The issue does not happen in-game, only in the game's menus, where you look at your guns and the map.

I have tested Firestrike Extreme and other demanding games like Crysis 3 with no issues at all, but this game seems to have a problem. Could it be that it just needs a Crossfire fix?


----------



## 295x2

My machine joined a node for the Positive Hack Days conference competition, where there are password-cracking challenges.
It's interesting to note that on some of those algorithms the 295X2 performs well and doesn't warm up much at all.
Those 5 GPUs are just awesome for computations of any sort.


----------



## Tobiman

Quote:


> Originally Posted by *ViRuS2k*
> 
> can someone check this for me.
> if you have a 295x2 and farcry 3 installed
> 
> can you max out your graphics 4x or 8x aa only at a high resolution then find a place on the map that makes your cards use nearly 99% gpu usage
> then press the menu button and check out your map and guns ect and tell me if any of you guys are experiencing a weird artifact i am getting horizontal black line flickering seems to be just 1 black line that flickers are different spots on the screen but there allways the complete horizontal width
> if does not happen all the time seems it happens when the gpus are maxed out... the issue does not happen in game seems to only happen in the menus of the game the menus where you look at your guns and map
> 
> i have tested out firestrike extreme and other high end games like crysis 3 and no issues at all but this game seems to have a issue could it be that it just needs a crossfire fix


Yeah far cry 3 is a problem child for crossfire and sli, iirc.


----------



## ViRuS2k

Quote:


> Originally Posted by *Tobiman*
> 
> Yeah far cry 3 is a problem child for crossfire and sli, iirc.


Thanks, that puts my mind at ease lol. Thought I had weird **** going on, but it seems it's only with this game, and only in the game's menus and map screen.


----------



## levism99

Clock at 1048 / 1350


----------



## ViRuS2k

Quote:


> Originally Posted by *levism99*
> 
> 
> 
> Clock at 1048 / 1350


thief







Nah, just messing; same cooler as me lol


----------



## NavDigitalStorm

Quote:


> Originally Posted by *levism99*
> 
> 
> 
> Clock at 1048 / 1350


brand?


----------



## levism99

Sapphire


----------



## levism99

Quote:


> Originally Posted by *ViRuS2k*
> 
> thief
> 
> 
> 
> 
> 
> 
> 
> nah just messing same cooler as me lol


nice choice lol


----------



## NavDigitalStorm

Apparently there is a single-slot WB....


----------



## Someone09

Oh boy, the cheapest 295x2 is now available for 1209€ over here.

Not sure how long I can resist...


----------



## shadow85

Quote:


> Originally Posted by *Someone09*
> 
> Oh boy, the cheapest 295x2 is now available for 1209€ over here.
> 
> Not sure how long I can resist...


What brand? Link?


----------



## Someone09

Club 3D

Hm...am I allowed to post links to the shop?


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Apparently there is a single-slot WB....


I love that. So much power in a single slot! I would only use that in a vertical orientation, though. Mounted horizontally with a single-slot conversion, it would probably rip the PCIe slot right off your motherboard.


----------



## DeadlyDNA

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Apparently there is a single-slot WB....


that's Pure EPIC.


----------



## ViRuS2k

Quote:


> Originally Posted by *DeadlyDNA*
> 
> that's Pure EPIC.


I want to know where you can buy that block.








It would be fantastic to buy that block and then add your own pump and 260mm rad.







You'd get better VRM cooling, and it'd be completely silent.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ViRuS2k*
> 
> i want to know where you can buy that block
> 
> 
> 
> 
> 
> 
> 
> 
> would be fantastic buying that block and then buying adding your own pump and 260mm rad
> 
> 
> 
> 
> 
> 
> 
> would get better vrm cooling and completely silent


I believe it is an EK block.


----------



## ViRuS2k

Yeah, I just found it.









WOW, OK, that's my next block.







I can say that is mine.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ViRuS2k*
> 
> yeah i just found it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> WOW ok thats my next block
> 
> 
> 
> 
> 
> 
> 
> i can say that is mine


Is it for sale?


----------



## ViRuS2k

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Is it for sale?


Says on their Facebook page: coming soon.








let me know if you see it up for sale in the uk







i will do the same heh


----------



## Elmy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Is it for sale?


I am getting 2 of those blocks as soon as they release them  I should have my 2nd 295X2 coming in the next week and a half or so....

Edit: just ordered my AX1500i from Newegg.
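A rough sanity check on why a 1500 W unit fits a pair of these cards; all figures below are assumptions for illustration (about 500 W board power per 295X2, plus headroom for an overclocked CPU and the rest of the system), not measurements:

```python
# Assumed component power draws for a dual-295X2 build, in watts.
parts = {
    "295X2 #1": 500,
    "295X2 #2": 500,
    "CPU (overclocked)": 200,
    "motherboard/drives/fans": 100,
}
total = sum(parts.values())
print(f"estimated load ~{total} W; a 1500 W unit leaves ~{1500 - total} W headroom")
```

Even with generous estimates, the AX1500i keeps a couple hundred watts of margin for transient spikes.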


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> I am getting 2 of those blocks as soon as they release them  I should have my 2nd 295X2 coming in the next week and half or so....
> 
> I just need to get my hands on a AX1500i but they are sold out everywhere :-(


You aren't playing around here, are you?


----------



## ViRuS2k

Quote:


> Originally Posted by *Elmy*
> 
> I am getting 2 of those blocks as soon as they release them  I should have my 2nd 295X2 coming in the next week and half or so....
> 
> I just need to get my hands on a AX1500i but they are sold out everywhere :-(


Corsair AX1500i Digital ATX 80 Plus Titanium Modular Power Supply (CP-9020057-UK)
I see 15+ in stock...
http://www.overclockers.co.uk/showproduct.php?prodid=CA-178-CS&groupid=701&catid=123&subcat=2464


----------



## Jpmboy

Quote:


> Originally Posted by *ViRuS2k*
> 
> says on there facebook page, coming soon
> 
> 
> 
> 
> 
> 
> 
> 
> let me know if you see it up for sale in the uk
> 
> 
> 
> 
> 
> 
> 
> i will do the same heh


They've been saying that for weeks... I couldn't wait, so I picked up a Koolance block. Curious to know what size T-pads the EK block will use on the VRM section; Koolance uses 1mm.








Daaum - I have 4 EK blocks right now (2 Titans, 2 Kingpins) and I think they make the best blocks, with the best milling tolerances. Bit disappointed with the 1mm gap from Koolance.

Not important yet, though - but when voltage control is available, it may make a significant difference if EK goes with a 0.5mm VRM gap.


----------



## ViRuS2k

I would be interested to know what other people's temps are like on their GPUs, because I need the block ASAP. God knows what the VRM temps would be, but I hit

72C tonight on my card, playing Crysis 3 @ 3440x1440, maxed out with 2x multi-GPU AA.
This game completely destroys temps, but my card never throttled at all; I saw 100% GPU on both cores continuously.









But damn, I think tonight is the hottest it's been; then again, the air tonight is really hot....

Is my cooler faulty, or are these the normal temps for this card? I was planning on overclocking it; I guess I can make do with 1650 memory, that's fine, but there's not much headroom on the GPUs. More voltage, when Afterburner gets updated, will push temps over 75C; in the past my previous cards ran all day long at 84C, but I'm guessing these cards will throttle up there.

So, what are your temps?


----------



## Jpmboy

Quote:


> Originally Posted by *ViRuS2k*
> 
> I would be intrested to know what other peoples temps are like on there GPUS
> cause i need the block ASAP god knows what the VRMs temps would be but i HIT
> 
> 72c tonight on my card that was playing crysis 3 @3440x1440 maxed out with 2xmultigpu aa.
> this game completely destroys temps but my card never throttled at all seen 100% gpu on both cores continuasly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but dam think tonight is the hottest its been then again the air tonight is really hot....
> 
> what are you guys temps is my cooler faulty or is this the norm temps of this card i was planning on overclocking it guess i can do with 1650 memory thats fine but the gpus not much headroom
> as with more voltage when afterburner gets updated will push temps over 75c though in the past my previous cards ran all day long at 84c but im guessing these cards will throttle up there
> 
> what are you guys temps...


Before I put the water block on, 70C on the GPUs was pretty easy to hit, and that was on an open bench rig. I put the waterblock on before installing it in this TJ09 case. With water, GPU temps never get above 57C (BF4 or Crysis at 4K resolution). The VRMs were in the 70s, so I put a 90mm fan (wearing rubber feet) blowing on the backside; now they stay in the 60s by IR thermometer. I need to tape a temp sensor from the Aquaduct to them to see what's what in real time.


----------



## PCModderMike

If I needed to upgrade my 690, this would be the card I would upgrade to no doubt. This card + the new EK block = pr0n I need in my rig.









Hmmm maybe I should *make* it time to upgrade.


----------



## ViRuS2k

Quote:


> Originally Posted by *Jpmboy*
> 
> Before I put the water block on, 70C on the GPUs was pretty easy to do, and that was on an open bench rig. I put the waterblock on before installing it in this TJ09 case. With water, gpu temps never get above 57C (BF4 or Crysis at 4K resolution). The VRMs were in the 70s so I put a 90mm fan (wearing rubber feet) blowing on the backside - now with an IR thermometer they stay in the 60s by IR thermometer. I need to tape a temp sensor from the aquaduct to them to see what's what in real time.


Haha, so it's the norm then to have 70-73C temps on the default water cooler.








It's impossible for me to put a fan on the back of the card, as there is no room lol.
Here is a current picture of the card in my system....

----

Speaking of single-slot waterblocks for this card, there is another one here that's being released soon.


Rumor is that it's retailing around 169 euro.

And here is a pic showing my system; my CPU cooler blocks any air cooler from being mounted on the back.


----------



## Jpmboy

Quote:


> Originally Posted by *ViRuS2k*
> 
> haha so its the norm then to have 70-73c temps on the default water cooler
> 
> 
> 
> 
> 
> 
> 
> 
> its impossible for me to put a fan on the back of the card as there is no room lol
> here is a current picture of the card in my system....
> 
> ----
> 
> speaking of 1 slot waterblocks though for this card
> there is another here thats being released soon
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> rumor is that its retailing around 169 euro
> 
> and here is a pic showing my system my cpu cooler blocks any air coolers from being on the back
> 
> 
> Spoiler: Warning: Spoiler!


No rumor - 14-day lead time at AC... for the past two weeks







Surprised Shoggy (the AC rep) hasn't popped into this forum yet. And that's a 2-slot bracket.
Koolance block on and installed.


Spoiler: Warning: Spoiler!


----------



## Elmy

Quote:


> Originally Posted by *ViRuS2k*
> 
> Corsair AX1500i Digital ATX 80 Plus Titanium Modular Power Supply (CP-9020057-UK)
> I see 15+ in stock...
> http://www.overclockers.co.uk/showproduct.php?prodid=CA-178-CS&groupid=701&catid=123&subcat=2464


Found them at Newegg... silly me, I should have looked harder.

Thanks for the link! Rep!


----------



## 295x2

For those interested, I modded my 295X2 today. I added a Coolgate 240mm radiator and a reservoir. No custom block. Temps in my computations dropped by 12C under heavy load.

However, if I max out the power and set min clock = max clock, it throttles at 62C. Guess what is damn hot? The VRMs.









For gaming it should work flawlessly, though. No game pushes the card and memory the way I stress them, especially with enforced clocks. Before, it would go to 74C and hit the target temp even with a better fan.

I will probably not bother with a full block now, as 98% of my problems are gone.

I just need to keep the VRMs a bit colder. I suspect the stock fan on them is just crap. But not easy to fix :/


----------



## NavDigitalStorm

Quote:


> Originally Posted by *295x2*
> 
> For those interested, i modded my 295x2 today. I added a coolgate 240mm radiator and a water tank. No custom block. The temps in my computations dropped by 12C on the heavy load.
> 
> However if if max out power and set min clock = max clock, it throttles at 62C. Guess what is damn hot ? VRMs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For gaming it should work flawlessly though. No game put the card and memory the way i stress it, especially with enforced clocks. Before it would go 74C and reach target temp even with a better fan.
> 
> I will probably not bother with a full block now as 98% of my problems are gone.
> 
> I need to keep VRM just a bit colder. I suspect the stock fan on them is just crap. But not easy to fix :/


PICTURES!!!!


----------



## bencher

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> PICTURES!!!!


----------



## 295x2

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> PICTURES!!!!


Don't pay attention to the random fans and cables. The fans were left over from desperate attempts to cool things down.
I just plugged in the modded version to test it; I drove 6 hours today (thanks to the traffic jams and the city being far away) to get it done, so I couldn't care less how it looks.
Right now it is being tested and adjusted.
The white cable is a riser to the 5th GPU. There are 3 Sapphire 290X Tri-X and now the 295X2 blood-transfusion edition lol.
There are two 1700W PSUs from Lepa.





Cheers, NBZ


----------



## GravemanSam

Any idea when the Aqua Computer blocks will be on sale in the U.S., along with the possible back plates? I've got a nice coupon through FrozenCPU that I've been holding onto until they offer the blocks for sale.


----------



## Jpmboy

Quote:


> Originally Posted by *GravemanSam*
> 
> Any ideas when the aquacomputer blocks will be on sale in the U.S along with possible back plates? I've got a nice coupon through frozen cpu I've been holding onto for them to offer the blocks for sale.


email Shoggy @ aquacomputer.


----------



## GravemanSam

I just sent him a PM on here. Thanks.


----------



## bencher

Quote:


> Originally Posted by *295x2*
> 
> Don't pay attention to random fans and cables. The fans were left for desperate tries to cool things down.
> I just plugged the modded version just to test it, i drove 6 hours today (thanks to jam and far city) to get it done, i couldn't care less of how it looks like.
> Right now, it is being tested, adjusted.
> The white cable is for a riser to the 5th GPU. There are 3 Sapphire 290x Tri X and now, the 295x2 blood transfusion edition lol
> There are two 1700W PSU from Lepa
> 
> 
> 
> 
> 
> Cheers, NBZ


Must have been a lot of work.

Are you a miner?


----------



## Jpmboy

Quote:


> Originally Posted by *295x2*
> 
> Don't pay attention to random fans and cables. The fans were left for desperate tries to cool things down.
> I just plugged the modded version just to test it, i drove 6 hours today (thanks to jam and far city) to get it done, i couldn't care less of how it looks like.
> Right now, it is being tested, adjusted.
> The white cable is for a riser to the 5th GPU. There are 3 Sapphire 290x Tri X and now, the 295x2 blood transfusion edition lol
> There are two 1700W PSU from Lepa
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers, NBZ


Niiiice!!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *295x2*
> 
> Don't pay attention to random fans and cables. The fans were left for desperate tries to cool things down.
> I just plugged the modded version just to test it, i drove 6 hours today (thanks to jam and far city) to get it done, i couldn't care less of how it looks like.
> Right now, it is being tested, adjusted.
> The white cable is for a riser to the 5th GPU. There are 3 Sapphire 290x Tri X and now, the 295x2 blood transfusion edition lol
> There are two 1700W PSU from Lepa
> 
> 
> 
> 
> 
> Cheers, NBZ


Looks good there.

Nice work


----------



## Cool Mike

First, I know I'm crazy and addicted to hardware.

Not sure if anyone has the Sapphire 295X2 OC yet?

I will receive my Sapphire 295X2 OC, with the over-the-top case, this afternoon. I will publish the overclock numbers later on. It will be interesting to see if the OC version will overclock past the typical 1080 core; the only reason it might is better silicon. Stay tuned.


----------



## lowgun

Quote:


> Originally Posted by *Cool Mike*
> 
> First, I know I'm crazy and addicted to hardware.
> 
> Not sure anyone has the Sapphire 295x2 OC yet?
> 
> I will receive my Sapphire 295x2 OC with the over the top case this afternoon. I will publish the overclock numbers later on. It will be interesting to see if the OC version will overclock past the typical 1080 Core. The only reason it may would be better silicon. Stay tuned.


Be sure to post its BIOS! I'd love to see if we can throw it on our non-OC versions.


----------



## axiumone

Mike, please post the bios to your OC card if you have a chance!


----------



## Cool Mike

I will post the BIOS.


----------



## bencher

how are the overclocks on these cards?


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Mike, please post the bios to your OC card if you have a chance!


I third that request. Does the OC version have Hynix memory?


----------



## GravemanSam

Hey guys, I talked to Shoggy about the Aqua Computer water blocks; apparently they broke a special tool they use to make the blocks and can't make any more for a while. Also, the back plates they originally announced, which were supposed to be out by the end of May, have been put on hold for a while, since they don't even have a prototype made yet.







The few blocks they did make are shipping out relatively soon to U.S. distributors, I think.

Shame. I was really looking forward to putting their blocks on both my cards, and I know they have always made back plates that actually do a really good job at cooling and are not just for looks.

Guess I am going to have to go with EK blocks then, as soon as they are up for sale. Any news on EK possibly making a backplate? I just saw the picture of the gold back plate they released for the 780 Ti on their Facebook page, and holy crap is it amazing. Wish they could make something like that for the 295X2.

https://www.facebook.com/EKWaterBlocks/photos/a.279856092068429.76142.182927101761329/669998886387479/?type=1&theater


----------



## Cool Mike

I will answer all your questions soon. The card has been delivered, and I will have it in my hands in about 1.5 hours. The HIS I sold had Hynix and was solid at 1650; it was a great card. Hope I did not make a mistake with the Sapphire 295X2 OC. Hoping for at least a solid 1100 core at 4K, maxed.

The purchase was easier as I used a $75 off coupon code with Newegg Business.


----------



## Someone09

I was looking at the prices for the 295X2 today and - for the first time - saw them dip below 1200€. But only for a few hours.
C´mon...70€ less and I´m probably in.


----------



## NavDigitalStorm

Does anyone know if there is software to change fan speed on the radiator?


----------



## ViRuS2k

Strange issue with PowerPlay on these cards.

Default idle clocks are 300/150.

If you use a monitor at or below 2560x1080, the memory will idle at 150,
but if you go above that resolution, for instance to 3440x1440, the idle memory clock will sit at 1250 constantly, adding 2-3°C to idle temps.

Very weird, though I think it's normal behaviour here, unless anyone else is running that resolution and their idle memory clocks are staying at 150.

I also tried overclocking the card on its default voltages, and with my MSI version I could only get 1080/1700. I have not tried higher memory, but anything over that core speed gives artifacts or flickering black lines in Firestrike.
I would be interested in seeing the BIOSes of the cards that run 1080 by default, as I believe they could have slightly bumped the GPU voltage to hit 1100 without artifacts.
Though I do believe any card under extreme GPU load running its cores over 1100 will hit the 75°C throttle, so core overclocking is fairly pointless on these cards; memory with Hynix chips, like mine, should fly.


----------



## lowgun

Quote:


> Originally Posted by *ViRuS2k*
> 
> Strange issue with PowerPlay on these cards.
> 
> Default idle clocks are 300/150.
> 
> If you use a monitor at or below 2560x1080, the memory will idle at 150,
> but if you go above that resolution, for instance to 3440x1440, the idle memory clock will sit at 1250 constantly, adding 2-3°C to idle temps.
> 
> Very weird, though I think it's normal behaviour here, unless anyone else is running that resolution and their idle memory clocks are staying at 150.
> 
> I also tried overclocking the card on its default voltages, and with my MSI version I could only get 1080/1700. I have not tried higher memory, but anything over that core speed gives artifacts or flickering black lines in Firestrike.
> I would be interested in seeing the BIOSes of the cards that run 1080 by default, as I believe they could have slightly bumped the GPU voltage to hit 1100 without artifacts.
> Though I do believe any card under extreme GPU load running its cores over 1100 will hit the 75°C throttle, so core overclocking is fairly pointless on these cards; memory with Hynix chips, like mine, should fly.


I'm at 5760x1080 120Hz and my memory idles at 150.


----------



## ViRuS2k

Quote:


> Originally Posted by *lowgun*
> 
> I'm at 5760x1080 120Hz and my memory idles at 150.


*** strange.
Could it be because I have 2 monitors plugged in? Though one of them is plugged into the HDMI port on my motherboard and one is plugged into my card.
If I go to my native resolution I get 300/150; if I go above native to 3440x1440, the memory stays at full speed. Hmm.

Weird.


----------



## bencher

Quote:


> Originally Posted by *ViRuS2k*
> 
> Strange issue with PowerPlay on these cards.
> 
> Default idle clocks are 300/150.
> 
> If you use a monitor at or below 2560x1080, the memory will idle at 150,
> but if you go above that resolution, for instance to 3440x1440, the idle memory clock will sit at 1250 constantly, adding 2-3°C to idle temps.
> 
> Very weird, though I think it's normal behaviour here, unless anyone else is running that resolution and their idle memory clocks are staying at 150.
> 
> I also tried overclocking the card on its default voltages, and with my MSI version I could only get 1080/1700. I have not tried higher memory, but anything over that core speed gives artifacts or flickering black lines in Firestrike.
> I would be interested in seeing the BIOSes of the cards that run 1080 by default, as I believe they could have slightly bumped the GPU voltage to hit 1100 without artifacts.
> Though I do believe any card under extreme GPU load running its cores over 1100 will hit the 75°C throttle, so core overclocking is fairly pointless on these cards; memory with Hynix chips, like mine, should fly.


Do you have 2 monitors connected?

This happens with my 290 when 2 monitors are connected.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bencher*
> 
> Do you have 2 monitors connected?
> 
> This happens with my 290 when 2 monitors are connected.


Only happened to me when I started running DVI 1440p (110Hz) and HDMI 1080p (60Hz).

Seems to be an issue with Hawaii GPUs in general, actually.


----------



## bencher

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Only happened to me when i started running DVI 1440p (110hz) and HDMI 1080p (60hz)
> 
> Seems to be an issue with Hawaii GPU's in general actually.


It's not an issue; it happens with my 7970 and 7870 as well.

The cards require more power at idle the higher the resolution, with all the Windows effects enabled.


----------



## lowgun

Quote:


> Originally Posted by *ViRuS2k*
> 
> *** strange.
> Could it be because I have 2 monitors plugged in? Though one of them is plugged into the HDMI port on my motherboard and one is plugged into my card.
> If I go to my native resolution I get 300/150; if I go above native to 3440x1440, the memory stays at full speed. Hmm.
> 
> Weird.


I have 3 monitors plugged in, lol. Although I do have all 3 of mine going to the card and they are all using miniDP-to-DP cables.


----------



## Cool Mike

Hello Everyone,

Received my Sapphire 295x2 OC yesterday.







I have been running stability tests today. At this point it looks like the OC version does indeed overclock better. I am at 1100 core and 1650 memory with no issues, running Valley at 4K ultra with 4x AA and no artifacts. My previous standard-clocked HIS 295X2 would do 1080 core / 1650 memory in the identical system. I have more testing to do before I can say I am 100% stable at 1100 core. I will try to move the clock higher once the card has had a good run-in for a day or two.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Hello Everyone,
> 
> Received my Sapphire 295x2 OC yesterday.
> 
> 
> 
> 
> 
> 
> 
> I have been running stability tests today. At this point it looks like the OC version does indeed over clock better. I am at 1100 core and 1650 memory with no issues. I am running Valley at 4K ultra, X4 Anti with no artifacts. My previous standard clocked HIS 295x2 would do 1080 Core 1650 memory in the same identical system. I have more testing to do before I can say I am 100% stable at 1100 core. I will try to move the clock higher once the card has had a good run in for a day or two.
> 
> 
> Spoiler: Warning: Spoiler!


Updated on the roster.


----------



## Cool Mike

Thanks Nav, Give me a day or two and I will give you my new overclock numbers.


----------



## ViRuS2k

Quote:


> Originally Posted by *Cool Mike*
> 
> Thanks Nav, Give me a day or two and I will give you my new overclock numbers.


dont hope for high clocks on the gpu...
though memory should skyrocket


----------



## ViRuS2k

Quote:


> Originally Posted by *lowgun*
> 
> I have 3 monitors plugged in, lol. Although I do have all 3 of mine going to the card and they are all using miniDP-to-DP cables.


If it's not doing it for you with 3x miniDP-to-DP cables, then the cause for me has to be one of my cables, and that is the miniDP-to-HDMI adapter causing it.

Also, as a side note, I would be interested in people flashing their cards with the OC BIOS to see if they can break 1110 on the GPU.


----------



## 295x2

Quote:


> Originally Posted by *Cool Mike*
> 
> Hello Everyone,
> 
> Received my Sapphire 295x2 OC yesterday.
> 
> 
> 
> 
> 
> 
> 
> I have been running stability tests today. At this point it looks like the OC version does indeed over clock better. I am at 1100 core and 1650 memory with no issues. I am running Valley at 4K ultra, X4 Anti with no artifacts. My previous standard clocked HIS 295x2 would do 1080 Core 1650 memory in the same identical system. I have more testing to do before I can say I am 100% stable at 1100 core. I will try to move the clock higher once the card has had a good run in for a day or two.


Could you please share the BIOS of the OC version? I want to see if it helps with the fans and the VRMs.
I use my Sapphire for computation, and the VRMs get crazy hot. You could dump it with GPU-Z.
You could be my savior! Thanks!


----------



## Cool Mike

Hello 295x2,

Yes, I will; many people want this BIOS. I have never uploaded using GPU-Z. I just tried and got a "BIOS file already exists" error, though it's not in the database.

Any tricks for getting this into TechPowerUp's database?


----------



## Roikyou

Quote:


> Originally Posted by *Cool Mike*
> 
> Hello 295x2,
> 
> Yes, I will, Many people wanting this Bios. I have never uploaded using GPUz. I just tried and get a bios file already exists though its not in the database.
> 
> Any tricks on getting this on Techpowerup's database?


Saving to a file or to the online database? It should let you store locally. Store it locally, and then I'm assuming you can submit it here on the forums?


----------



## Cool Mike

I will try. The forum will not accept .rom as an extension, so I will need to change the extension and users will need to change it back to .rom.

Not sure if this procedure is correct. Maybe I need to use a different extension. Please give me feedback on whether the BIOS works or not.









Sapphire 295X2 OC Bios.

*Here it is: Change .txt to .rom after download. Disclaimer: I am not responsible for any issues related to this bios file.
*

295X2_OC.txt 128k .txt file
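For anyone scripting the download step, the extension swap described above can be done in a couple of lines; a minimal sketch (the filename matches the attachment above, but treat it as an example):

```python
from pathlib import Path

def restore_rom(downloaded: str) -> Path:
    """Rename a BIOS dump that was uploaded as .txt back to .rom."""
    src = Path(downloaded)
    dst = src.with_suffix(".rom")
    src.rename(dst)
    return dst

# restore_rom("295X2_OC.txt") leaves you with 295X2_OC.rom
```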


----------



## 295x2

I can't wait to try it and see if it helps my problems







I may not need PowerTune anymore and can thus go full speed without the VRMs getting so hot.


----------



## 295x2

Quote:


> Originally Posted by *Cool Mike*
> 
> I will try. Will not accept .rom as an extension. I will need to change extension and then users will need to change back to .rom
> 
> Not sure if this procedure is correct. Maybe I need to use a different extension. Please give me feedback/input if this doesn't work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire 295X2 OC Bios.
> 
> *Here it is: Change .txt to .rom after download. Disclaimer: I am not responsible for any issues related to this bios file.
> *
> 
> 295X2_OC.txt 128k .txt file


I love you !
I will try that


----------



## 295x2

Quote:


> Originally Posted by *Cool Mike*
> 
> I will try. Will not accept .rom as an extension. I will need to change extension and then users will need to change back to .rom
> 
> Not sure if this procedure is correct. Maybe I need to use a different extension. Please give me feedback/input if bios works or not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire 295X2 OC Bios.
> 
> *Here it is: Change .txt to .rom after download. Disclaimer: I am not responsible for any issues related to this bios file.
> *
> 
> 295X2_OC.txt 128k .txt file


OK, I flashed it, but you dumped the MASTER card; can you please also dump the slave?
They each have a different BIOS.

Select the second GPU (under GPU-Z); you should see S0 in the last part of the name, versus M0 in the first one.
Please, pretty please, then I am done flashing it.


----------



## DeadlyDNA

Quote:


> Originally Posted by *ViRuS2k*
> 
> yeah i just found it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> WOW ok thats my next block
> 
> 
> 
> 
> 
> 
> 
> i can say that is mine


That is the baddest-a$$ single-slot waterblock/card ever. It would just be epic if the 290/290X were single slot and had a single-slot waterblock.


----------



## Cool Mike

Here is the slave rom.

Let us know your results.









I would appreciate a Rep on this one.









295X2_OC_Slave.txt 128k .txt file


----------



## 295x2

Quote:


> Originally Posted by *Cool Mike*
> 
> Here is the slave rom.
> 
> Let us know your results.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 295X2_OC_Slave.txt 128k .txt file


You totally rule. I will let you know right away








Right now I have one OC card and one non-OC,







so it works, and it is easy as pie to flash.


----------



## 295x2

So I am already back, and my 295X2 Sapphire is now an OC edition.
While I am running some tests, here is how you do it:

1) Read the info here:
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

2) Prepare your USB key the same way; save 295X2_OC.txt as OC.rom and 295X2_OC_Slave.txt as OC2.rom.
Put them on your USB key along with atiflash.exe.

3) Boot from the key to get the command line.

4) atiflash -f -p 0 OC.rom (this flashes the master card)
atiflash -f -p 1 OC2.rom (this flashes the slave card)

Reboot and enjoy.
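Before step 4, it is worth sanity-checking the images. This sketch only assumes the 128k size of the attachments above and the standard 0x55AA signature that legacy PCI option ROMs start with; it is a quick guard against a truncated download, not a guarantee the image is valid for your card:

```python
def looks_like_vbios(path: str, expected_size: int = 128 * 1024) -> bool:
    """Cheap pre-flash check: right size and the legacy option-ROM signature."""
    with open(path, "rb") as f:
        data = f.read()
    # Legacy PCI expansion ROMs begin with the bytes 0x55 0xAA.
    return len(data) == expected_size and data[:2] == b"\x55\xaa"
```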


----------



## 295x2

OK, at power target +50% it still throttles (VRMs too hot); however, at +15% it stays stable at 1010-1015 MHz and thus basically reaches the full speed of the non-OC card without getting too hot (thanks to my mod on the radiator).
This is a big win!







This is the hardest computation scenario that I use.
Thank you very much, Cool Mike!







Sapphire support didn't want to give it to me, saying blah blah, you could brick your card, blah blah.


----------



## ViRuS2k

http://www.techpowerup.com/gpudb/b2914/sapphire-r9-295x2-oc.html

Guess I am going to flash that BIOS, as it gives you 1300 memory and 1030 GPU clocks by default.
Every little helps.







Plus, this BIOS will keep PowerPlay in check.









Pity there is no MSI OC-version BIOS.







lol


----------



## Cool Mike

You're welcome.

I do a lot of benchmarking, and I have found the power target is best set at 20-25%. There are no gains going higher than that in my testing, and the temps are better.


----------



## Elmy

AX1500i showed up today. 2nd 295X2 shipped yesterday. 

Now just need EK waterblocks :-/


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> AX1500i showed up today. 2nd 295X2 shipped yesterday.
> 
> Now just need EK waterblocks :-/


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> I will try. Will not accept .rom as an extension. I will need to change extension and then users will need to change back to .rom
> 
> Not sure if this procedure is correct. Maybe I need to use a different extension. Please give me feedback/input if bios works or not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire 295X2 OC Bios.
> 
> *Here it is: Change .txt to .rom after download. Disclaimer: I am not responsible for any issues related to this bios file.
> *
> 
> 295X2_OC.txt 128k .txt file


Thanks, that should work fine. It's how we've shared 290x and GK110 bios'. You can also zip the file. Ocn will accept a ziped rom file upload.


----------



## Jpmboy

Quote:


> Originally Posted by *295x2*
> 
> Ok, on power target +50 it still throttles (VRMS too hot), however at +15% it stays stable at 1010-1015 MHZ and thus, basically reach full speed of the non OC, without getting too hot (thanks to my mod on the radiator).
> This is a big win!
> 
> 
> 
> 
> 
> 
> 
> this is the hardest scenario of computations that i use
> Thank you very much Cool Mike!
> 
> 
> 
> 
> 
> 
> 
> *Sapphire support didn't want to give it to me, saying blabla you could brick your card blabla
> 
> 
> 
> 
> 
> 
> 
> *


That's why team green loves EVGA !


----------



## DeadlyDNA

Quote:


> Originally Posted by *Jpmboy*
> 
> [/B]
> 
> That's why team green loves EVGA !


EVGA is really awesome. I feel ashamed I didn't use any of their hardware in my latest build :-( In my defense, I still have tons of their equipment.

So I heard someone say that their 1500 W PSU isn't strong enough for 2 of these? Does that sound right? I mean, of course, stock and overclocked within reason.


----------



## axiumone

Quote:


> Originally Posted by *DeadlyDNA*
> 
> EVGA is really awesome. I feel ahsamed i didn't use any of their hardware in my latest build :-( in my defense i got tons of their equipment still.
> 
> So i heard someone say that the 1500watt psu they had isn't strong enough for 2 of these? Does that sound right? I mean of course stock and overclocked within reason.


Hmm, I wonder where you read that. Take a look at my sig rig: no power issues here. The CPU is at 4.5 GHz and the cards are at 1070/1500 with some power to spare.
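As a rough sanity check on the 1500 W question — using the ~500 W board power commonly cited for the 295X2 and guessed CPU/system figures (the per-component numbers are assumptions, not measurements):

```python
def psu_headroom(psu_watts: int, cards: int = 2, card_w: int = 500,
                 cpu_w: int = 250, rest_w: int = 100) -> int:
    """Return the watts left over after a worst-case estimated load."""
    return psu_watts - (cards * card_w + cpu_w + rest_w)

# Two cards plus a heavily overclocked CPU still leave margin on a 1500 W unit:
print(psu_headroom(1500))  # 150
```

With these estimates a 1200 W unit would come up short, which is probably where the "not strong enough" worry comes from.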


----------



## DeadlyDNA

Quote:


> Originally Posted by *axiumone*
> 
> Hmm, I wonder were you read that. Take a look at my sig rig. No power issues here. CPU is at 4.5 and cards are at 1070/1500 with some power to spare.


Someone mentioned it on these forums. I was just curious whether these dual-GPU cards draw more power occupying 2 slots instead of 4, or whether GPU scaling is more efficient, using more power, with two dual cards rather than 4 GPUs across 4 slots.


----------



## Roikyou

Grabbed the Corsair AX1500i also, so I can move up to two if I choose to. I just wish Corsair didn't go all ribbon cables; they're hard to manage in my opinion.


----------



## Cool Mike

Hello Nav/Everyone

Completed a lot of benchmarking with my Sapphire 295X2 OC. Definitely an improvement on the core over the standard-clocked card I previously had.

Lots of benching at 4K, max settings, with Valley. Also ran BF4 at 4K ultra with no issues for 1 hour. The overclocks are based on a 4K display with AA at 4x.
I'm sure at 1080p or 1440p I could have pushed the core slightly higher. I could not push the memory any further than 1650 MHz. A great memory OC, though.

*Nav, can you please update my CLOCK SPEEDS.
*

Both cores - 1100 MHz
Memory - 1650 MHz (6600 MHz effective)
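The "effective" figure follows from GDDR5 moving four bits per pin per clock, so the conversion is just a multiply:

```python
def gddr5_effective_mhz(mem_clock_mhz: int) -> int:
    """GDDR5 is quad-pumped: effective data rate = 4x the memory clock."""
    return mem_clock_mhz * 4

print(gddr5_effective_mhz(1650))  # 6600
```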


----------



## 295x2

I just realized something. I don't know if it is related to the OC edition or not.

If you download the latest MSI Afterburner, tick "extend official overclocking limits" and reboot your computer, you can now control the VRM fan speed like any other fan!!!
So I can push it harder during my tasks to help cooling. Interestingly, this clocked my OC edition (since I flashed it) to 1040; I don't know whether that's because all my other 290Xs are clocked to that speed.

Either way, OC edition + VRM fan control is a win-win for me.


----------



## 295x2

Quote:


> Originally Posted by *295x2*
> 
> I just realized something. I don't know if it is related to the OC edition or not.
> 
> If you download the latest MSI Afterburner, tick "extend official overclocking limits" and reboot your computer, you can now control the VRM fan speed like any other fan!!!
> So I can push it harder during my tasks to help cooling. Interestingly, this clocked my OC edition (since I flashed it) to 1040; I don't know whether that's because all my other 290Xs are clocked to that speed.
> 
> Either way, OC edition + VRM fan control is a win-win for me.


OK, with this setting I no longer need any power target at all, it seems!
The card is now at full speed at 1040 MHz with +0 power target; clocks are full, temps are much lower. I am puzzled.
It is like it removed some bad limitations. Now it acts like all my 3 other 290Xs!!

Edit: only the master card stays at full speed; the second card needs +25%. It seems to be related to temps.
This card is really a puzzle to optimize for computing.

Edit 2: Actually it depends on the problems I try to solve. "Extend official overclocking limits" MADE a huge difference. I don't know if it is related to the OC BIOS or not, but my card is running one problem right now, +0 on both cards, at 1040 MHz (the new default clock for some reason), and temps are under 60°C.









Edit 3: My post was reported to a moderator because I used a swear word, which was automatically changed to "*". I am sorry if the asterisk offended someone. I won't swear again...


----------



## ViRuS2k

Might have to flash that BIOS.
Arghhh, I wish there was a way to change the device ID from Sapphire to MSI so that my card is identified as an MSI card rather than a Sapphire :/
That way, if I have to sell it in the future, it will look normal, lol.

Though I tried the unofficial overclocking mode and the fan speed was greyed out, so I guess it only works with the OC BIOS.


----------



## 295x2

Did you reboot once you ticked the option? I think that's the only thing I ticked, but I was doing some tests. Try ticking everything else too, like unlocked voltage etc., reboot, and see if it helps.
It could well be related to the OC BIOS.
Editing the BIOS ID is easy; however, I suppose there is a checksum on the BIOS. That might not be too hard to update, but I can't risk it.
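On the checksum point: legacy PCI option ROMs carry a simple 8-bit checksum — all bytes of the image sum to zero mod 256 — so after a hex edit the checksum byte can be repatched. Whether the 295X2 vBIOS is validated only by this sum (and whether its checksum byte sits at the end of the image, as this sketch assumes) I can't confirm, so keep the backup BIOS switch position handy:

```python
def fix_option_rom_checksum(image: bytearray) -> bytearray:
    """Patch the final byte so the image sums to 0 mod 256 (legacy PCI rule).
    Assumes the checksum/pad byte lives at the end of the image."""
    image[-1] = 0                      # zero it out first so it drops out of the sum
    image[-1] = (-sum(image)) % 256    # then set it to cancel the remainder
    return image
```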


----------



## ViRuS2k

Risk? There is no risk involved.







You have a BIOS switch,







so just reflash if it did not work...
And yeah, all I did was enable that setting and reboot.

I think it's the OC BIOS. Before flashing it, though, I need to figure out a way to change the device ID. VBE7 - the vBIOS editor for Radeon HD 7000-series cards - needs an update, but the guy who programmed that app seems to be AWOL.
Quite a few people have questions in that thread - you have asked, I have asked - but the guy has vanished. lol


----------



## 295x2

Did you try to edit the BIOS ROM with a hex editor, then?







Replace the MSI ID with the Sapphire one or something.

However, for me, that setting did much more than just the fan control; it made the card much better at holding a clock without any PowerTune needed.
The card is working at 1040 MHz right now, on one problem, and both GPUs are at 56°C!
There was no way to do that, even with the OC BIOS itself.

It changes the PP_PhmSoftPowerPlayTable in the registry, and that might have changed the default voltage or something. I don't know, but it works perfectly well now.
All my problems are solved, thanks to Cool Mike and this extended trick. I won't complain even if I don't understand it all. Oh, and the modded cooler.


----------



## ViRuS2k

One other question: can you run GPU-Z, leave it running, go into a demanding game, and let me know what your 3D GPU voltage is,
so I can compare it with my non-OC card to see if the OC BIOS gives slightly higher voltages?

Mine reports a max VDDC of 1.231 V.
Do you also have ULPS disabled?


----------



## 295x2

I have it disabled, yes. I don't have any games installed, sorry; this machine isn't for gaming, despite the 5 GPUs.
1.2 V is the max I currently see with the power target disabled; same with it enabled.

The current is moving around a lot, though.


----------



## ViRuS2k

Only 1.2 V, hmmm. Going to give that BIOS a shot and see what's different, lol. Could be why you're getting lower temps: a lower-voltage BIOS.


----------



## Elmy

My new toys

Add me as having 2 295X2's to the list plz 

Just make them both Club3D brand... they both came from them somehow...

Need EK to hurry up with waterblocks so I can have them on and ready for Infernal LAN in 3 weeks :-/


----------



## axiumone

Very nice, congrats!

Did you get a chance to install them yet?


----------



## Elmy

Quote:


> Originally Posted by *axiumone*
> 
> Very nice, congrats!
> 
> Did you get a chance to install them yet?


Nope, just picked the card up from FedEx last night @ 6pm... then went and had dinner and drinks... probably tonight or first thing tomorrow.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> My new toys
> 
> Add me as having 2 295X2's to the list plz
> 
> Just make them both Club3D brand... they both came from them someway...
> 
> Need EK to hurry up with waterblocks so I can have them on and ready for Infernal LAN in 3 weeks :-/
> 
> 
> Spoiler: Warning: Spoiler!


Accepted!


----------



## 4K-HERO

Hey, I've got an Asus 295X2 crossfired with an MSI 290, liquid cooled with a Kraken G10 and X40.

Here's a pic and a link to my Firestrike score. The cards have scaled exceptionally well: the graphics score went from 21,000 with only the 295X2 up to 30,100 with the 290. Everything runs like a dream in 4K, except I've limited the MSAA or SSAA to 2x. My case is too small for this setup! Just wondering if anyone else has done this and whether they are getting similar scores? BTW, I've got a Corsair RM 1000. In BF4, all settings maxed in 4K with 2x MSAA, I get 70-80 FPS during gameplay unless something crazy happens, and I'm not going over 950 watts (if anyone's interested in doing the same and looking for a PSU).

http://www.3dmark.com/fs/2186304


----------



## NavDigitalStorm

Quote:


> Originally Posted by *4K-HERO*
> 
> Hey, I've got an Asus 295X2 crossfired with an MSI 290, liquid cooled with a Kraken G10 and X40.
> 
> Here's a pic and a link to my Firestrike score. The cards have scaled exceptionally well: the graphics score went from 21,000 with only the 295X2 up to 30,100 with the 290. Everything runs like a dream in 4K, except I've limited the MSAA or SSAA to 2x. My case is too small for this setup! Just wondering if anyone else has done this and whether they are getting similar scores? BTW, I've got a Corsair RM 1000. In BF4, all settings maxed in 4K with 2x MSAA, I get 70-80 FPS during gameplay unless something crazy happens, and I'm not going over 950 watts (if anyone's interested in doing the same and looking for a PSU).
> 
> http://www.3dmark.com/fs/2186304


Accepted


----------



## lowgun

Quote:


> Originally Posted by *295x2*
> 
> Did you reboot once you ticked the option ? I think that's the only thing i ticked. But i was doing some tests. try to tick everything else, like voltage unlocked etc, and reboot, and see if it helps.
> It could well be related to the OC bios.
> Editing the BIOS ID is easy, however, I suppose there is a checksum on the BIOS. Though, this might not be too hard to update. But i can't risk it


I've applied the OC BIOS and checked the overclock option in AB, but I'm not getting the VRM fan control option. Any idea what I'm doing wrong?


----------



## 295x2

Interesting. Then it must be some other option.
Tomorrow I will upload my settings. I noticed the control by luck and assumed that setting was the cause.

Did you get your card clocked at 1040 too? I mean not by the BIOS, but auto-clocked higher. That's the clock of my 290X, so I am not sure if it is related.

I need to figure out what enables it, because I need that setting very much. I no longer have problems with the modded cooler, OC BIOS and that fan control.

Can you upload a screenshot of your settings by any chance? Check Catalyst OverDrive; I can also change the fan speed there now.

Interestingly, the target temperature of 75°C is gone.


----------



## 4K-HERO

2 quick questions: will this Sapphire OC BIOS work on any 295X2? Are all these rebranded models the same? Because I really need to turn my fan speed up; the 295X2 is running hotter than usual because of the card below it. The NZXT H440 is really small. Some advice would be really appreciated.


----------



## 295x2

I assume the BIOS should work on any model, as they are all the reference design with the same hardware. However, I can't promise it will.
If I were you, I would use GPU-Z to save the BIOS of BOTH GPUs (one is the master, and the second is the slave).
In the BIOS revision, between parentheses, you will see something like xxx-xxxxxxMxx-xxx and xxx-xxxxxxSxx-xxx. Save them both and make sure to give them proper names.

M means master, and S means slave. Back them up and follow the instructions I posted a while back to apply the OC edition BIOS.
However, we need to figure out what exactly adds the fan control; maybe it is not even related to the BIOS.
I am surprised the "extend official overclocking limits" feature didn't enable it.
But in my opinion, this setting should be used; it changes the power table, and the card holds its clock much better without any PowerTune needed.

Attached are my MSI Afterburner settings.
Please note that the fan control is only accessible in one of the card panels, not on both GPU panels; I assume the master one.
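The M/S convention above can also be checked programmatically when naming the dumps; a sketch that relies only on the pattern described in this post (any further structure of the part number is an assumption):

```python
def gpu_role(part_number: str) -> str:
    """Classify a 295X2 BIOS dump as master or slave from its part number,
    e.g. 'xxx-xxxxxxMxx-xxx' vs 'xxx-xxxxxxSxx-xxx' as described above."""
    middle = part_number.split("-")[1] if "-" in part_number else part_number
    if "M" in middle:
        return "master"
    if "S" in middle:
        return "slave"
    return "unknown"

print(gpu_role("xxx-xxxxxxMxx-xxx"))  # master
```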


----------



## 4K-HERO

Are you sure your card is running as it should? You said earlier your temp is 60 degrees, and that's with the OC settings. Can you please run Firestrike and post the link? If all is good, flashing my Asus 295X2 with the Sapphire OC BIOS will save me lots of time and money. Your help would be greatly appreciated.


----------



## 295x2

What do you mean by "your card is running as it should"?
My temps are based on computation, not games, which is much more intensive: 100% GPU always, with no pauses at all. My cooler is also modded (bigger radiator and a tank).
You should try it; we obviously have different configurations and tasks.


----------



## 4K-HERO

I have an R9 290 running at stock clocks, sitting below the 295X2. The 290 is watercooled with a Kraken G10 bracket and X40 on the core, and I put some aftermarket heatsinks on the memory and VRMs. But the VRMs are sending heat up to the 295X2, and it's hitting the 74-degree mark before throttling. If the 295X2 runs cooler with this BIOS, it will definitely save me money; I won't have to watercool it.


----------



## 295x2

You need to manage the VRM fan to get better temps. I only managed to do so after I modded the BIOS, though.
The problem is, no one has managed to reproduce it so far; this is what we are trying to do, and thus why I posted my Afterburner settings.
However, the VRM fan is very noisy, almost as noisy as a 290X when you push it. The funny thing is, the noise maxes out around 50%+; it doesn't get any louder past that threshold.
I usually keep it around 45-47%, as my other cards are at 60-65% and thus the machine isn't silent anyway.

By default, the VRM fan seems to be at 25%, far under its capacity, and thus relatively quiet. The problem with the card underneath is that it blows hot air onto the VRMs, so I decided to put it on a riser and move it outside the case for now.

The OC BIOS, in my case, allows me to disable PowerTune (the power target, in OverDrive) and maintain the clock, and thus the card runs at 60°C. Depending on the task, the worst I get is 63°C so far, when the room is pretty hot.
Considering you have two BIOS positions, you should try flashing the BIOS, set up Afterburner the way I did, and run a benchmark to compare what works best for you.
I think the best way is to watercool it all, though, which is what I will do, so I can push this card like I push my 290Xs and get the best possible speed in silence.


----------



## 4K-HERO

Lowgun, were you able to figure out the fan control situation?


----------



## lowgun

Quote:


> Originally Posted by *4K-HERO*
> 
> Lowgun, were you able to figure out the fan control situation?


Unfortunately, no. I have been toggling settings on and off, even set them to 295x2's settings from the screenshot, and still no luck. The BIOS is definitely applied, but I also tried flashing back to the original BIOS on both the master and slave and then to OC again, and yet again, no luck. There is a new AMD driver coming out tomorrow, but unless we can replicate his results, we may have to wait till a new Afterburner is released. My VRMs are getting too hot (not sure of the temperatures), which is causing the card to throttle down to 300 MHz on both GPUs even though the GPUs aren't even cracking 70°C.


----------



## 4K-HERO

How many GPUs do you have running under the 295X2? I had the same problem because I have an R9 290 crossfired, sitting under the 295X2. You have to watercool the bottom card so the X2's fan stops absorbing hot air. Yesterday I put a Corsair H80i on my processor to remove more heat from the case and get better airflow, and I put a 2500 RPM fan in the front of the case to blow cool air between the two cards. Everything is good now. The X2 is designed to not go over 75 degrees, but remember, there are two compact cards side by side, putting out 75-degree air. When it starts to throttle, put your hand on the metal casing to feel how hot it is running! It took me a week to get temperatures to a place where nothing in the case would throttle or overheat. I had two 290s previously: a Sapphire Tri-X sitting in the top slot with stock cooling and a Dark Rock Pro 3 on the processor, and I never had any problems with heat. Once I put that goliath of a video card in there, everything got hotter. AMD, WE NEED FAN CONTROL!


----------



## 295x2

So, people are starting to see the 300 MHz throttling I was talking about a few weeks ago, so I was not the only one.
If you can't replicate it, then I don't know what enabled the fan control, and that worries me, to be honest, because if I ever format, I am going to lose it.

The other tool I am also using is EVGA's overclocking tool. I doubt it is the reason for the fan control though... I am puzzled.
However, I didn't know about new drivers coming tomorrow; I am very happy to hear that.

If I disable the feature I mentioned, I lose control of the fan, which is why I assumed that was the reason.
Was your card clocked up to 1040 when you enabled that feature?

I have a 290X under the 295X2, but it is four PCI slots down (one of which can't be used, as the 295X2 almost covers it).
I removed the card in between to keep temperatures good and used a riser instead.
I am still convinced the current drivers are not optimal at all.


----------



## axiumone

Well, new 14.6 drivers are out and eyefinity with 295x2 crossfire is still broken. Pretty disappointed at the moment. The second card has to be disabled in order to get rid of the tearing.

Here are two quick vids I took this morning with the new drivers, but 14.4 performance was the same.


----------



## 4K-HERO

The AMD Catalyst 14.6 Beta driver is a notable release for us for a couple reasons:

Mantle is now enabled for notebooks based on AMD Enduro technology
Mixed-resolution AMD Eyefinity configurations are now supported for SLS
Performance improvements for Watch Dogs
Performance improvements for Murdered: Soul Suspect

Watch Dogs:

AMD Radeon R9 290X - 1920x1080 4xMSAA - improves up to 25%
AMD Radeon R9 290X - 2560x1600 4xMSAA - improves up to 28%
AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 92% scaling
Murdered: Soul Suspect:

AMD Radeon R9 290X - 2560x1600 4xMSAA - improves up to 16%
AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 93% scaling
AMD Eyefinity improvements:
Mixed Resolution Support

A new architecture providing brand new capabilities
Display groups can be created with monitors of different resolution (including difference sizes and shapes)
Users have a choice of how surface is created over the display group
Fill - legacy mode, best for identical monitors
Fit - create the Eyefinity surface using best available rectangular area with attached displays.
Expand - create a virtual Eyefinity surface using desktops as viewports onto the surface.
Eyefinity Display Alignment
Enables control over alignment between adjacent monitors
One-Click Setup
Driver detects layout of extended desktops
Can create Eyefinity display group using this layout in one click!
New user controls for video color and display settings

Greater control over Video Color Management:

Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
Allows users to select different color depths per resolution and display
AMD Mantle enhancements

Mantle now supports AMD Mobile products with Enduro technology
Enables support for Multi-GPU configurations with Thief (requires the latest Thief update).
Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) - 21% gain
Thief: AMD Radeon HD 8970M (1920x1080; high settings) - 14% gain
Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) - 274% gain
AMD AM1 JPEG decoding acceleration

JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
Provides fast JPEG decompression
Provides Power Efficiency for JPEG decompression


----------



## lowgun

Quote:


> Originally Posted by *axiumone*
> 
> Well, new 14.6 drivers are out and eyefinity with 295x2 crossfire is still broken. Pretty disappointed at the moment. The second card has to be disabled in order to get rid of the tearing.
> 
> Here are two quick vids I took this morning with the new drivers, but 14.4 performance was the same.


I never had an issue with Eyefinity and 14.4. I'm installing 14.6 now and will let you know if I start to have any.


----------



## axiumone

Quote:


> Originally Posted by *lowgun*
> 
> I never had an issue with Eyefinity and 14.4. I'm installing 14.6 now and will let you know if I start to have any.


Are you running in landscape or portrait?


----------



## lowgun

Quote:


> Originally Posted by *axiumone*
> 
> Are you running in landscape or portrait?


Landscape


----------



## axiumone

Quote:


> Originally Posted by *lowgun*
> 
> Landscape


Gotcha. That's been on my mind as well. It seems that landscape is a more widely used orientation and has better support. I think some of the timings for the monitors are different in different orientations.


----------



## lowgun

Ugh, now with the new 14.6, my computer freaks out sometimes when switching from Extended (standard 3-screen setup) to Duplicate (Eyefinity setup). It makes the screens go all pixelated, then the PC reboots. Not sure if it's the 295X2 or the 14.6 driver. Anyone else with a setup similar to mine?


----------



## fireedo

Well, I can say this card is really OC friendly: with just a 2% power limit increase I can achieve core/mem @ 1099/1500, rock-solid stable.








And most importantly, it keeps cool all the time (ambient temp @ 26-27 C).


----------



## NavDigitalStorm

Quote:


> Originally Posted by *fireedo*
> 
> well I just can say this card is really OC friendly with just increase 2% power limit I can achieve core/mem @ 1099/1500 rocksolid stable
> 
> 
> 
> 
> 
> 
> 
> 
> and the most important things is it keeps cool all the time ( ambient temp @ 26- 27 C )


How was the performance increase with that 80 MHz boost?
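While waiting for numbers, here is a back-of-envelope way to bound the answer, assuming performance scales perfectly linearly with core clock (optimistic; real gains are usually lower due to memory-bandwidth and CPU limits, and the 1018 MHz stock figure is the card's reference boost clock):

```python
# Rough upper bound on the gain from an ~80 MHz core overclock,
# assuming perfectly linear clock scaling (real games scale less).
stock_core = 1018  # MHz, R9 295X2 reference boost clock
oc_core = 1099     # MHz, fireedo's reported overclock

gain_pct = (oc_core / stock_core - 1) * 100
print(f"Best-case gain: {gain_pct:.1f}%")
```

So even in the ideal case that boost is worth roughly 8%, and a benchmark like 3DMark will typically show something at or below that.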


----------



## axiumone

Does anyone on here have the 295x2 in crossfire on a x79 rampage black edition board? Preferably also with eyefinity.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Does anyone on here have the 295x2 in crossfire on a x79 rampage black edition board? Preferably also with eyefinity.


I do... minus the eyefinity.


----------



## axiumone

Nav, do you still have it on your test bench?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> Nav, do you still have it on your test bench?


Nope, what were you wondering about?


----------



## 295x2

Quote:


> Originally Posted by *fireedo*
> 
> well I just can say this card is really OC friendly with just increase 2% power limit I can achieve core/mem @ 1099/1500 rocksolid stable
> 
> 
> 
> 
> 
> 
> 
> 
> and the most important things is it keeps cool all the time ( ambient temp @ 26- 27 C )


As long as you keep the power target in a low range, you won't have problems with temps. I pushed mine to 1120 on some compute runs without problems.
I noticed that sometimes only the slave GPU needs a higher power target. When it's not needed, don't push both GPUs if the first one can hold the clock, because it runs warmer.

As for the 14.6 beta driver, I quickly rolled back.
None of my three 290Xs could hold their clocks without PowerTune, nor could the 295X2.

14.4 is far better, so I will skip that beta and wait for the next one. Fingers crossed.


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Nope, what were you wondering about?


I'm wondering about the bottom molex on the Rampage. In the Corsair 450D there isn't enough room to plug in the bottom molex. I've been having some issues with these cards in Eyefinity, and I'm wondering if it may be from a lack of power. In order to test with the bottom molex plugged in, I'd have to take apart the whole rig and test outside of the case.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *axiumone*
> 
> I'm wondering about the bottom molex on the rampage. In the Corsair 450D there isnt enough room to plug the bottom molex. I've been having some issues with these cards in eyefinity and I'm wondering if it may be from lack of power. In order for me to test it with the bottom molex plugged in, I'd have to take apart the whole rig and test outside of the case.


Ahh yeah, that was the issue when we were building our test rig. We had to go with the Sabertooth X79 instead. Sorry, I can't help you with the Eyefinity, but there might be a power delivery issue.


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Ahh yeah that was the issue when we were building our test rig. We had to go with the Sabertooth X79 instead. Sorry, can't help you there in regards to the eyefinity. There might be a power delivery issue though.


No worries. I love a good tear down haha.


----------



## NavDigitalStorm

If adding the molex does fix the issue, you could always have some fun and replace the 90-degree connector with another one.


----------



## Elmy

Just ordered 2 295X2 EK Nickel/Clear Plexi waterblocks. Should be here in 3 days.

$475.00 out the door with shipping.


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> If adding the molex does fix the issue, you could always have some fun and replace the 90 degree connector with another one


Already ordered. Just in case.









Quote:


> Originally Posted by *Elmy*
> 
> Just ordered 2 295X2 EK Nickel/Clear Plexi waterblocks. Should be here in 3 days.
> 
> 475.00 out the door with shipping.


Can't wait to see what the finished beast will look like!


----------



## fireedo

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> How was the performance increase with that 80 MHz boost?


I haven't tried it in-game yet, but at least in benchmarks (3DMark) it gives a really good performance increase.
Quote:


> Originally Posted by *295x2*
> 
> As long as you keep power target in a low range, you won't have problems with temp. I pushed mine to 1120 on some calculs without problems.
> I noticed that sometimes, only the slave gpu needs power target. When not needed, don't push both gpu if the first one can hold the clock, because it warms more.
> 
> As for the 14.6 beta driver, i quickly rolled back.
> None of my three 290x could hold the clocks without power tune, nor the 295x2.
> 
> 14.4 is far better, so i will skip that beta and wait for the next one. finger crossed.


Thank you for that information.
Quote:


> Originally Posted by *Elmy*
> 
> Just ordered 2 295X2 EK Nickel/Clear Plexi waterblocks. Should be here in 3 days.
> 
> 475.00 out the door with shipping.


Excited here. I need to get a backplate too, since the back of the PCB needs to be cooled as well (the VRMs get hot too).


----------



## Elmy

Quote:


> Originally Posted by *fireedo*
> 
> in game i dont have to try it but at least in the benchmark (3DMark) it give me an increased performance really well
> Thankyou for that information
> excited here, need to get backplate too...since back PCB need to be cooled too (VRMs are hot too)


The link for the backplate is broken on their website....


----------



## fireedo

Quote:


> Originally Posted by *Elmy*
> 
> The link for the backplate is broken on their website....


Not sure, but I can't find a backplate for the 295X2 on their store website.


----------



## Elmy

They said on their website that the factory 295X2 backplate could be used with modifications...


----------



## NavDigitalStorm

So my card hits 74C in BF4 but in no other game, though I do get 120+ fps maxed out. It should be safe to drop the MHz to lower temps and still get 80 fps, right?


----------



## axiumone

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> So my card pushes 74C on BF4, not any other game but I do get 120+ fps maxed. Should be safe to drop the MHz to lower temps and still get 80 fps, right?


Is that at stock clocks? You could also reduce the power limit target if you're overclocking; it should lower the temps a bit.


----------



## papi4baby

Hey guys, I just got my Asus 295X2.

I wonder if anyone can tell whether I have a faulty card. While playing BF4 the card will hit 74 degrees. Room temp is around that too.

And I can actually hear the water pumps running; it almost sounds as if it were raining (like running water). I have an H100 on my CPU and it has never made any noise like that. I can hear the pump running on the CPU cooler, but I have to get right up to it. With the 295X2 I can hear it while sitting in my chair.

Thanks for your help.

Also, is there any way to disable crossfire to be able to play COD: Ghosts? Or any tricks to get Ghosts to run?


----------



## 4K-HERO

I have an Asus 295X2 and I don't hear anything of the sort. My case is closed, but the GPU's radiator is mounted on the exhaust vent. I was getting 74 degrees and throttling on mine also, because it is crossfired with a 290, but that was when the rad was mounted in the front. Mount it as an exhaust and put a better fan on it. Mine never goes over 70 degrees now, and that's crossfired, and it's the top card! I have a Gentle Typhoon 3000 RPM pushing the air out of that sucker!


----------



## ViRuS2k

We need some way to modify the 75C thermal limit on these cards, because 75 is nothing to worry about.
It should be raised to 85C tops.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ViRuS2k*
> 
> we need some way to modify the 75c thermal limit on these cards cause 75 is nowt to worry about
> should be raised to 85c tops.


I think it's the liquid inside that's the worry. That's some hot water running around in there.


----------



## ViRuS2k

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I think it might be the liquid inside to worry about. That's some hot water running around in there.


Haha, yeah, I know, but there shouldn't be any issue with that hot water; the water could get up to 100+ C and still be safe in there.
The card would shut down well before the rubber tubes melted.


----------



## ViRuS2k

Quote:


> Originally Posted by *ViRuS2k*
> 
> haha yeah i know but shouldnt be any issue with that hot water though the water could get up to 100+c and still be safe in there
> the card would shut down well before the rubber tubes melted.


One other thing: I'm thinking about changing/upgrading the fans on my radiator, as with two better-performing fans I'm sure it would knock 5+ C off max temps.
I seem to be nearly hitting the 75C thermal limit all the time; we really need a way to bump that up to 80C or something.

Only the most demanding games reach up there though, like Crysis 3.


----------



## NavDigitalStorm

Even with dual fans, my card still hits 74C in BF4. I think the increased pressure from high temps is the issue. I'm just going to set my card to downclock when playing games like BF4; no need for 140 fps on a 60 Hz display, haha.


----------



## axiumone

Still can't figure out the issue with 295X2 crossfire in Eyefinity portrait. I rebuilt the PC from the ground up today. I bought an EVGA X79 Dark to see if it was my Asus R4BE that was acting up, but that's not the case; both boards exhibit the same behavior.

14.4, 14.6, ULPS disabled/enabled, PowerPlay disabled/enabled, downclocking, overclocking, switching slots, Win 7, Win 8, Win 8.1: nothing seems to fix the stuttering/tearing across all of the screens.

Having just one card causes no issues at all, but enabling crossfire starts the trouble. I've tried each card individually as well, and they are both fine.

The only thing I haven't tried yet is a different PSU. I ordered an AX1500i today to replace my NEX1500 to see if it makes any difference.

Pretty much at my wits' end here.


----------



## DeadlyDNA

Quote:


> Originally Posted by *axiumone*
> 
> Still can't figure out the issue with 295x2 crossfire in eyefinity portrait. I've rebuilt the pc from the ground up today. I bought an evga x79 dark to see if it was my asus r4be that was acting up, but that's not the case, both boards exhibit the same behavior.
> 
> 14.4, 14.6, ulps disabled / enabled, powerplay disabled / enabled. Downclocking, overclocking, switching slots, win 7, win 8, win 8.1, nothing seems to fix the stuttering / tearing across all of the screens.
> 
> Having just one card has no issues at all, but enabling crossfire starts the trouble. I've tried each card individually as well and they are both fine.
> 
> The only thing I haven't tried yet is a different psu. I've ordered an ax1500i today to replace my nex1500 to see if it makes any difference.
> 
> Pretty much at my wits end here.


Is there a way to try 3-way CF? How are the monitors connected, and at what resolution/displays? Also, what games are you testing with? Vsync?


----------



## axiumone

No way to test 3-way, unfortunately. Res is [email protected] - tried 60 and 100. Tried just 3 monitors instead of 5 as well. With vsync and without; it doesn't matter.

Also, the cards are not overheating or throttling.

Monitors are connected with 4 x DP and one DVI-D DL. No adapters.


----------



## levism99

Quote:


> Originally Posted by *axiumone*
> 
> No way to test 3 way unfortunately. Res is [email protected] - Tried 60, 100. Tried just 3 monitors instead of 5 as well. With vsync and without, doesnt matter.
> 
> Also the cards are not overheating or throttling.
> 
> Monitors are connected with 4 x DP and one DVID DL. No adapters.


It's just very bad drivers for crossfire!!!


----------



## DeadlyDNA

I seem to recall having this issue with 290s. I'm trying to remember if mine was driver-related. Try testing with a single 1080p screen in quad CF.


----------



## axiumone

A single display works perfectly. No issues whatsoever.


----------



## DeadlyDNA

Quote:


> Originally Posted by *axiumone*
> 
> Single display works perfect. No issues what so ever.


In Device Manager, can you disable one of your GPUs? Then reboot; maybe it will allow 3-way then. Are you monitoring fps and GPU usage? It may just be quad CF plus Eyefinity, as I think I was dealing with the same issue. I am on 13.12, though; the 14.x drivers aren't working for me yet.


----------



## axiumone

Quote:


> Originally Posted by *DeadlyDNA*
> 
> in device manager can you disable one of your gpus? The reboot maybe it will allow 3 way then. Are you monitoring fps and gpu usage? It may just be quad cf and eyefinity as I think I was dealing with same issue. I am on 13.12 though 14.x driver arent working for me yet


Man! That would have been epic if it worked. I actually had 290s in quadfire before this, and I had to use 13.12 in order to get normal quadfire performance. Unfortunately, if you disable one of the 295X2's GPUs in Device Manager, it disables the whole card.

I've also tried to hack the 13.12 drivers to work with the 295X2, and I was almost successful. I was able to get them to install after thoroughly hacking them, but everything else was broken: clocks didn't show right in 2D/3D, you're not able to set them, and it really screws up CCC.


----------



## ViRuS2k

Guys, you can disable crossfire with this card: you need to use RadeonPro and force crossfire off.
It works with most games; I haven't tested them all, but it works with DX9, DX10, and DX11 games, mostly all, lol.


----------



## papi4baby

Quote:


> Originally Posted by *4K-HERO*
> 
> I have an Asus 295x2. I don"t hear anything of the sort. My case is closed. but the gpu"s radiator is mounted on the exhaust vent. I was getting 74 degrees and throttling on mine also, because it is crossfired with a 290, but that was when the rad was mounted in the front. Mount it as an exhaust and put a better fan on it. Mine never goes over 70 degrees now and thats crossfired, and its the top card! I have a gentle typhoon 3000rpm pushing the air out of that sucker!


Thanks for the reply. And yes, I did mount it in the front. I will move it to the back and add some fans to the case.

Sent from my iPhone using Tapatalk


----------



## DeadlyDNA

Quote:


> Originally Posted by *ViRuS2k*
> 
> Guys, you can disable crossfire with the card you need to use radeon pro and force crossfire off.
> works with all games and i have tested this well not all games but works with dx9 and dx11 and dx10 games. mostly all lol


Yes, but the issue he is having is very specific. He uses crossfire with one card in 5x1 Eyefinity and has no issues. When he uses two cards in quadfire, he gets screen tearing and stuttering. I was hoping he could use 3-way CF and see what his results would be. I am assuming he wants to use all his GPUs, but the problem is preventing that. So if he were able to run 3-way CF, he could possibly get more power and not have the screen tearing.

I am really surprised they aren't allowing 3-way CF with the 295X2. That will be a problem for games that work fine in 3-way CF but aren't any good in 4-way CF; for users running Eyefinity, every GPU counts.


----------



## shadow85

I was wondering what happens if I were to run a 295X2 in CF with a 290X Tri-X OC. The Tri-X OC has different core and memory speeds than the 295X2; do the speeds change, or do they stay the same?


----------



## ImperialOne

Speeds stay the same for each card.... and this trifire does well! http://m.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ImperialOne*
> 
> Speeds stay the same for each card.... and this trifire does well! http://m.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/


Here are my tri-fire benchmarks with 290X and 295X2

http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/


----------



## ViRuS2k

Well guys, I have run into an issue with my card.
In the GPU-Z bus rendering test I am getting black horizontal lines flickering.

They're very faint, but I can totally notice them.
Also, sometimes when I reboot my computer, get back into Windows, and rerun the render test, the black lines vanish, but then I lose performance.
It's like the card is operating at a slower speed even though it shows it is running at the correct clocks and speeds.

And if I reboot again, the performance is back. It's like the PCI Express 3.0 bus is glitching or not getting enough bandwidth; rebooting solves it,
but only temporarily. :?
Also, my system has started to lock up and freeze in Watch Dogs, with a loud screeching sound coming out of my speakers.

I did a memtest and found no issues with my memory.
I also ran a CPU test, and no issues there.

I'm thinking I might not have enough power from my PC Power & Cooling 950W PSU.
But then again, when I had my 2x 7950s, the bus bandwidth issue was the same as what I am experiencing now, though the flickering black horizontal lines were not happening with those cards,
nor the system freezing in games...

See, computers are pains in the ass, lol. I should play on my PS4 or Xbox One more often, haha.


----------



## axiumone

Yeah, I'd suspect power issues as well. You're running a 295 and a 290 off a 950W PSU? Any other PSUs you can test?

I'm asking because my dual 295X2s and overclocked CPU pull 1750 watts from the wall. I'd figure trifire is good for around 1100-1300 watts.
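For anyone budgeting a PSU for these configurations, a minimal sketch of the arithmetic. The per-component wattages below are assumptions (approximate board-power figures), not measurements, and wall draw will exceed the DC total once PSU efficiency and overclocks are factored in:

```python
# Back-of-envelope DC power budget for the setups discussed above.
# All wattages are assumed approximations, not measured values.
R9_295X2_W = 500   # dual-GPU card
R9_290X_W = 290
CPU_OC_W = 150     # overclocked CPU
SYSTEM_W = 100     # board, drives, fans, pump

trifire = R9_295X2_W + R9_290X_W + CPU_OC_W + SYSTEM_W
dual_295x2 = 2 * R9_295X2_W + CPU_OC_W + SYSTEM_W

print(f"295X2 + 290X trifire: ~{trifire} W")
print(f"Dual 295X2:           ~{dual_295x2} W")
```

That lands the trifire estimate near the low end of the 1100-1300 W range quoted above, and shows why a 950 W unit is marginal for it.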


----------



## NavDigitalStorm

You need AT LEAST a 1200W PSU for that trifire if you don't want power issues.


----------



## ViRuS2k

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> You need ATLEAST a 1200w for that tri-fire if you don't want power issues.


I'm not running trifire yet.

Just a single R9 295X2.


----------



## Cool Mike

Our 295X2 sure looks like the buy of the year compared to the Titan Z, which is available at Newegg for $2,999.









The reviews by DizZz have the 295X2 beating the Titan Z in many benches.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Our 295X2 sure does look like the buy of the year compared to the Titan Z. Available at Newegg for $2,999.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The reviews by DizZz have the 295X2 beating the Titan Z in many benches.


Those are my reviews...


----------



## Cool Mike

Sorry Nav, I should have looked closer. Great job.









The 295X2 is definitely the overall winner here. Can't believe NVIDIA came in at that price.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Sorry Nav, I should have looked closer. Great job.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 295X2 is defiantly the overall winner here. Cant believe NVidia came in at that price.


I don't blame them; developers will pay the cash.


----------



## Cool Mike

You're correct. I work for a medical device company; 3K is no big deal for a work-related expense.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Cool Mike*
> 
> Your correct. I work for a medical device company. 3K is no big deal for a work related expense.


Precisely


----------



## Cool Mike

It's also being advertised as a gamer's graphics card, so takers here may be few.


----------



## ImperialOne

Quote:


> Originally Posted by *Cool Mike*
> 
> Also being advertised a gamer's graphics card. Takers here may be low.


Risk is low, since nVidia will (rightly) believe the segment burned by buying a Titan Z for gaming is really small. Still, the hubris of my guys in Green... And Jen-Hsun Huang said the Titan Z was built for 4K gaming, but 5K gaming?!?






I hope someone got demoted for that.


----------



## ImperialOne

Sorry for the double post... couldn't resist: JHH said this was the "Woodstock" for computational professionals... I think he took a little too big a hit from the Green Bong.


----------



## Elmy

This picture reminds me of the Youtube video that follows.....


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Elmy*
> 
> This picture reminds me of the Youtube video that follows.....


Yeah, when I opened the box it was pretty much that haha.


----------



## papi4baby

Any other way to force crossfire off on this card? I want to play Ghosts, and I can't due to it being a lag fest with no crossfire support. RadeonPro does not work for me since I am running a 64-bit OS.

Thanks.


----------



## blue1512

Quote:


> Originally Posted by *papi4baby*
> 
> Any other way to force crossfire off on this card? I want to play Ghost and i can't due to it being a lag fest because of no crossfire support. RadeonPro does not work for me since i am running 64bit OS.
> 
> Thanks.


Did you try going to Device Manager and disabling the slave GPU? I did it on my old 7990; I'm not sure about the 295X2 though.


----------



## mparra11

Hey guys! here is my Sapphire 295x2! It will be getting a full water block soon!


----------



## papi4baby

Quote:


> Originally Posted by *mparra11*
> 
> Hey guys! here is my Sapphire 295x2! It will be getting a full water block soon!
> 
> Pic overload]


Need more pictures to make sure.

J/K

Sent from my iPhone using Tapatalk


----------



## mparra11

http://www.3dmark.com/3dm/2956913

Does this seem right to you guys?


----------



## Elmy

Quote:


> Originally Posted by *mparra11*
> 
> http://www.3dmark.com/3dm/2956913
> 
> This seem right to you guys?


Here is mine with a 295X2 and a 290X. I haven't done a dual-295X2 run yet. Still painting my PSU...

http://www.3dmark.com/fs/2129458

And yes, yours does look like only one GPU is working there. Try running Afterburner while you run the benchmark and see if all 4 GPUs are working.


----------



## mparra11

I only have one 295, lol. I feel stupid for asking, but can you have a 295 and a 290 in crossfire?


----------



## axiumone

Yep, you sure can.


----------



## mparra11

That benchmark I posted was for one 295X2.


----------



## Elmy

Quote:


> Originally Posted by *mparra11*
> 
> That bentchmark i posted was for 1 295x2


Well then, yes... that's a good score.

Here is mine at stock clocks on the CPU and GPU. I could do an overclocked run for you if you want to see it.

http://www.3dmark.com/fs/2116576


----------



## mparra11

Yeah, that would be awesome. I have been thinking about overclocking mine.


----------



## Elmy

Quote:


> Originally Posted by *mparra11*
> 
> Yeah they would be awsome. I have been thinking about overclocking mine.


http://www.3dmark.com/3dm/3182662? 16167

This is with GPU @ 1075/1400 and 4770K @ 4.5

Beta 14.6 drivers


----------



## mparra11

That's a nice jump. What are your temps at?


----------



## Elmy

Quote:


> Originally Posted by *mparra11*
> 
> Thats a nice jump. What are ur temps at


Around 70 with 2 fans in push/pull.


----------



## mparra11

Now I'm tempted to overclock mine, lol. What did you use, Afterburner?


----------



## Elmy

Quote:


> Originally Posted by *mparra11*
> 
> Now im tempted to overclock mine lol. Whatd did u use afterburner?


yes


----------



## 4K-HERO

Here's my Firestrike run if anybody is interested in crossfiring their X2 with a 290. In graphics score it seems to do just as well as a 290X with an X2. If you plan on gaming at 4K with max settings, you'll need to crossfire the beast with a 290 or better to keep games smooth.

http://www.3dmark.com/fs/2227178


----------



## NavDigitalStorm

Quote:
Originally Posted by *mparra11* 

Hey guys! here is my Sapphire 295x2! It will be getting a full water block soon!





























Accepted!


----------



## alamox

The Asus RoG ARES III is gonna be here soon. Source: http://www.pcper.com/news/Graphics-Cards/Computex-2014-ASUS-Announces-ROG-Ares-III-Water-Cooled-Gaming-Graphics-Card


----------



## NavDigitalStorm

Quote:


> Originally Posted by *alamox*
> 
> The Asus RoG ARES III gonna be here soon, source : http://www.pcper.com/news/Graphics-Cards/Computex-2014-ASUS-Announces-ROG-Ares-III-Water-Cooled-Gaming-Graphics-Card


Yeah, that thing is MASSIVE


----------



## papi4baby

Quote:


> Originally Posted by *blue1512*
> 
> Did you try going to Device manager and disable the slave card? I did it on my old 7990, not sure about 295x2 though.


Thanks for the tip. It did work, but it made the game run even worse.


----------



## kalijaga

Hi guys,

I just joined the group after buying the card on the weekend (on a whim), along with the Samsung 590. I was thinking of running CF, but my PSU may not like it. At the moment I am using a Strider Gold Evo 1200W, and it's fine with one card. I am thinking of buying 2 CM V1000 PSUs and hotwiring them to run the CF config, with maybe one PSU running one card and the other running the second card plus the rest of the system.
I would love to have a 1500W+ PSU, but they are in short supply in my area, and they cost much more than the dual V1000s.

I would appreciate any advice from anybody running dual PSUs; should they be the same wattage, or can they be different?

By the way, using my HAF X I get a maximum temp of 65C with an ambient of about 30C (no air conditioning, tropical climate). I am using a CM JetFlo on pull duty on the radiator, which is mounted on the back exhaust.

Cheers.


----------



## GravemanSam

Which waterblock do you guys think is best for the 295X2: Aqua Computer, EK, XSPC, or Koolance? Which one has your favorite look, and which do you think will perform best?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *GravemanSam*
> 
> Which waterblock do you guys think is the best for the 295x2. Aqua Computer vs EK vs XSPC vs Koolance What one has your favorite look and what do you think will have the best performance?


I really like EKs.


----------



## Elmy

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I really like EKs.


I got 2 Nickel EK blocks that will be here Friday morning. I'll post pics as soon as I get them.


----------



## axiumone

http://videocardz.com/50743/asus-announces-rog-ares-iii#disqus_thread

More news and pics about the ares iii card.

It has only THREE outputs: DVI, HDMI, and a DisplayPort. What were Asus engineers thinking?


----------



## King4x4

Indeed... Back to looking at the 295x2 and drooling!

Thinking of buying three AOC 4ks and plugging them into that beauty.... 4k Eyefinity oh yes!


----------



## Jpmboy

Quote:


> Originally Posted by *King4x4*
> 
> Indeed... Back to looking at the 295x2 and drooling!
> 
> Thinking of buying three AOC 4ks and plugging them into that beauty.... 4k Eyefinity oh yes!


would be very nice, but you'll need a couple of 295x2's for 4K eyefinity.


----------



## lowgun

Quote:


> Originally Posted by *axiumone*
> 
> http://videocardz.com/50743/asus-announces-rog-ares-iii#disqus_thread
> 
> More news and pics about the ares iii card.
> 
> It has THREE outputs! Dvi, hdmi and a displayport. What were asus engineers thinking?


Especially since the current 295x2 is already configured in a way that maintains the DVI + 4 DP connections on a single-slot bracket.


----------



## NavDigitalStorm

I have no idea what ASUS was thinking with that output...


----------



## overclockFrance

I watercooled and crossfired two HIS 290Xs and I use Catalyst 14.6.

The overclock in CrossFire is not as good as with one card: if I overclock only one card and unplug the other, I easily reach 1200/1500 MHz with Sapphire TriXX.

But with two cards, I cannot go higher than 1100/1450 MHz whatever the GPU voltage (up to +175 mV). The GPU temperatures don't exceed 50°C, the VRM temperatures 60°C, and the power supply is powerful enough (EVGA 1000 W Platinum). I overclocked with MSI Afterburner as well, but got the same result.

I don't understand why the overclock is far worse with two cards. Any ideas?


----------



## pronacyk

Are you trying to apply the overclock to both cards? One of your cards can do 1200/1500 while the other can only do 1100/1450. Try disabling "Synchronize settings" in MSI AB and setting the proper clocks & voltage to each card independently.


----------



## overclockFrance

Quote:


> Originally Posted by *pronacyk*
> 
> Are you trying to apply the overclock to both cards? One of your cards can do 1200/1500 while the other can only do 1100/1450. Try disabling "Synchronize settings" in MSI AB and setting the proper clocks & voltage to each card independently.


You are right: I always overclocked the GPUs synchronously. I unticked the synchronisation of the two cards and I am now able to overclock far higher with MSI AB or TriXX: 1175/1500 MHz on both cards.

According to my results, could the multi-GPU synchronisation be bugged in MSI AB and TriXX?


----------



## fireedo

Umm... does anyone here already have a hybrid CFX between an R9 295x2 and an R9 290 (not the 290X)? Someone here is offering me his R9 290 (again, without the X), but he can't let me test it first since we live in different cities.

Can it work flawlessly? And what about power: does a 1000 watt unit sound safe or not?

I want to do this because I want a "future proof" gaming rig for about the next 2 years... it sounds silly, doesn't it?

Anyway, thanks for any reply.

PS: I know there are already reviews of hybrid CFX with the R9 290X, but I'd like to hear from real, everyday users...


----------



## lowgun

Guys, I guess I'm going to have to sell this thing off. The only place I can put the radiator is in the front, so the hot air off it comes into the case, and the card keeps dropping down to 300MHz from what I can only assume is the VRMs overheating. I've had it positioned behind the front fan (it makes the cores frosty cold but the VRMs overheat), pulling air from underneath the case (core temps okay, but the VRMs overheat), or pushing hot air out the bottom of the case (VRMs seem to be okay, but the cores hit 75C and throttle).


----------



## Cool Mike

Lowgun

I have my radiator at the rear.
This may be helpful: I previously had the Corsair 750D. I purchased the Corsair Air 540 and my VRM temps dropped by 10C. The PCIe power end of the 295x2 is slightly over an inch from the 140mm intake fan, so I am getting much more airflow to the 295x2's cooling fan.


----------



## King4x4

Quote:


> You can now download MSI AfterBurner 3.0.1 We just updated this graphics card tweak and overclock utility with some important fixes. We added core voltage control for reference design AMD RADEON R9 295X2 series graphics cards with NCP81022 voltage regulators. The hardware database for reference design AMD RADEON HD 7990 and AMD RADEON HD 290X series graphic cards was fixed. We embedded a fix for a GDI resource leak when tray icon monitoring mode is enabled. Also the RivaTuner Statistics Server has been upgraded to version 6.1.2


Woot, you've got OCing potential now... don't blow them cards up!

http://www.guru3d.com/news_story/download_afterburner_3_1.html


----------



## lowgun

Quote:


> Originally Posted by *Cool Mike*
> 
> Lowgun
> 
> I have my radiator at the rear.
> This may be helpful, I previously had the Corsair 750D. I purchased the Corsair AIR540 and my VRM temps dropped by 10C. The PCIe power end of the 295x2 is slightly over 1 inch from the 140mm intake fan, so I am getting much more airflow to the 295x2 cooling fan.


I wish I could, but having only X79 boards, the radiator/fan hits my RAM. I did get a new 450D in hopes it'd have better airflow than my 350D, but same problem. Maybe if the new Afterburner allows control of the VRM fan, I might be able to keep it under control. I'll update you tonight.


----------



## Cool Mike

GOOD NEWS!
MSI Afterburner now supports core voltage control for the 295X2 reference. Version 3.0.1

Sorry, just noticed the news discussed above.

Very good news. I will try tonight.


----------



## axiumone

Doesn't look like there are fan controls in the new version.


----------



## cennis

edit: the AIOs are removable.


----------



## lowgun

Quote:


> Originally Posted by *axiumone*
> 
> Doesnt look like there are fan controls in the new version.


How do you keep your VRMs from getting too hot? You have almost the exact same setup as me (Rampage IV, 295x2 with another card 2 spaces under it, 50D, etc.). If I play a stressful game like Crysis for maybe 10-15 minutes, the 295x2 starts to down-clock to 300MHz even though the cores are under 70C.


----------



## cennis

Quote:


> Originally Posted by *lowgun*
> 
> How do you keep your VRMs from getting too hot? You have almost the exact same setup as me (Rampage IV, 295x2 with another card 2 spaces under it, 50D, etc.). If I play a stressful game like Crysis for maybe 10-15 minutes, the 295x2 starts to down-clock to 300MHz even though the cores are under 70C.


Looking at the design of the 295x2, isn't the VRM heatsink directly under the dead zone of its fan?
Especially with non-focused fans, the dead/weak zone is larger than the fan motor.


----------



## axiumone

Quote:


> Originally Posted by *lowgun*
> 
> How do you keep your VRMs from getting too hot? You have almost the exact same setup as me (Rampage IV, 295x2 with another card 2 spaces under it, 50D, etc.). If I play a stressful game like Crysis for maybe 10-15 minutes, the 295x2 starts to down-clock to 300MHz even though the cores are under 70C.


Not sure how mine aren't throttling.

I have Afterburner set up to disable PowerPlay, which keeps core clock variations to a minimum. I also have two fans on each radiator for the cards.

The VRMs get super hot, though. I can't give you an exact figure, because there's no way to monitor the VRM temps in software, but they are extremely hot to the touch.

Cards are between 65-72C.


----------



## Cool Mike

What is your power limit setting in CCC? 20-25% is all you need; keeping this lower should lower your temps.


----------



## Jpmboy

3.0.1 beta is looking good!! Been "putzin" with the new Afterburner. I certainly need more time with it to find the edges. One thing for sure... it helps with higher clocks. Need to make a Google sheet of results...

early data:
stock = 1.228V, 1090/1450 (+50%), stable in 3DMark 11 (scene 1 pulls the most watts, much more than Fire Strike). Any higher will artifact.

+55mV (1.294V peak), +50%, 1115/1600 stable, and temps <56C GPU1, <53C GPU2 (stock ref BIOS, Koolance block). No artifacts!

I don't want to juice the PCB too quickly... will let it get "accustomed" to ~1.3V before going higher.


----------



## Cool Mike

I spent about 30 minutes with Afterburner 3.0.1.
No artifacts in Fire Strike at 1130 core and 1650 memory,
+45mV on the volts. The 295x2 needed slightly more volts for sure.
I will tweak it in more, along with more stress testing, when I get home later.


----------



## Satchmo0016

I'm about to pick up a 295x2 (Sapphires OC version) but I've seen some conflicting information. Would a single rail 860w 71A on 12v be enough for one of these? By the figures it looks like it should be fine, but just looking to see if anybody else is running on something similar. I didn't have any issues with crossfire 290s on this PSU. It's a Seasonic 860xp2 platinum.

Thanks.


----------



## Cool Mike

AMD recommends at least a quality 1000W unit. That said, looking at your system, you should be fine. The Seasonic you own is high quality and has some reserve power.
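The rail math behind this answer is easy to sketch. A minimal check, where the 500 W card figure and 200 W rest-of-system figure are illustrative assumptions, not measured values:

```python
# Single-rail headroom check for an 860 W Seasonic with 71 A on the 12 V rail.
# The card and system draw numbers below are rough assumptions for illustration.
RAIL_VOLTS = 12.0
RAIL_AMPS = 71.0                      # 12 V rail rating from the PSU label
rail_watts = RAIL_VOLTS * RAIL_AMPS   # 852 W available on the 12 V rail

card_watts = 500.0     # assumed worst-case 295X2 draw at stock clocks
system_watts = 200.0   # assumed CPU, board, drives and fans on 12 V

load_watts = card_watts + system_watts
headroom = rail_watts - load_watts
print(f"12V capacity {rail_watts:.0f} W, est. load {load_watts:.0f} W, "
      f"headroom {headroom:.0f} W")
```

With those assumptions the rail still has over 150 W spare, which matches the "some reserve power" call above.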


----------



## Satchmo0016

Yeah, I've been really happy with this Seasonic, its also super quiet. I am kinda wishing I got something a little beefier so I could hybrid crossfire with one of these Vapor X 290s I have though.. sigh.


----------



## Bartouille

Voltage control on the 295x2... poor thing. That card is already pushed to the absolute limit. Hopefully no one burns their card.


----------



## Jpmboy

Quote:


> Originally Posted by *Cool Mike*
> 
> I spend about 30 min with afterburner 3.0.1.
> No artifacts in fire strike at 1130 core and 1650 memory.
> +45 on the volts. The 295x2 needed slightly more volts for sure.
> I will tweak it in more along with more stress testing when I get home later.


Yeah, got higher clocks in FS; 3DMark 11 puts a bit more stress on the card. On stock cooling, watch the VRM temps!

Really wanna tear it a new one? Run some Catzilla.








Quote:


> Originally Posted by *Bartouille*
> 
> Voltage control on 295x2... poor thing. That thing is already pushed to the absolute limit. Hopefully no one burns his card.


..and how might you know where that limit might be?


----------



## King4x4

He doesn't!


----------



## Cool Mike

Overall I am getting very little gain from adding voltage, due to GPU2 throttling down (hitting 75C) during a 10-15 minute Valley run or when playing BF4 at 4K resolution. Running Fire Strike I see gains and the overclock (with +45mV) is stable, but keep in mind that time at 100% load is minimal there, so temps hit maybe 65C max.


----------



## ViRuS2k

Is 1250W good enough for two 295x2s (an OCZ ZX 1250W)?
Or only one 295x2 and one R9 290X? Going to be putting these cards on a watercooling loop as soon as the blocks are in stock @ OCUK.

Are two 360mm rads good enough to cool two of these plus a 4770K? It should be, I hope, lol.


----------



## cennis

I am thinking of mounting two H90s onto my 295x2 and removing the stock AIO.

I tested these H90s on my 290s and they could keep temps in the low 60s with a 1200/1500, +175mV OC.

I doubt the VRM cooling will hold up, though; it's a pretty bad design, with the fan's motor directly above the heatsink, so no air really gets to the middle of the heatsink.


----------



## Jpmboy

Nothing to write home about (vs the 4960X/R4BE/SLI Kingpins) but not bad at all for this old [email protected], PCIe gen2 rig. Certainly seeing gains with voltage control across the board: synthetics and games. The Koolance water block helps. I also have a 90mm fan w/rubber feet sitting right on top of the VRM opening of the backplate: 1800 rpm but silent in the TJ09 case.


----------



## cennis

Quote:


> Originally Posted by *Jpmboy*
> 
> Nothing to write home about (vs 4960X/R4BE/SLI kingpins
> 
> 
> 
> 
> 
> 
> 
> ) but not bad at all for this old [email protected], PCIE gen2 rig. Certainly seeing gains with voltage control across the board: synthetics and games.. Koolance water block helps. I also have a 90mm fan w/rubber feet sitting right on top of the vrm opening of the backplate - 1800 rpm but silent in the TJ09 case.


Will these cards never run at 1200MHz, even under water, like most 290/290Xs do?


----------



## Skinnered

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> 
> Accepted!


What brand and type are those case fans? Do you have the Corsair Carbide Air 540?


----------



## Jpmboy

Quote:


> Originally Posted by *cennis*
> 
> will these cards never run at 1200mhz even under water like most 290/290x ?


Doubtful that this one will... but that's par for a dual-GPU card vs two single-GPU cards. The only dual-GPU card I know of that will OC that far above stock clocks (in a relative sense) is the 7990 with some softmods. The 6990 did very well also. This card may be able to do 1200 if you keep the memory clock down. I wouldn't call the 295x2 a bench champ... but as a gaming card it performs exceptionally well.

edit: realize that at 4K the 295x2 stays above 60 fps in most any game (but so will CFX 290s/290Xs, until you saturate the vram)


----------



## cennis

1100 for a +45mV offset is not bad.
Quote:


> Originally Posted by *Jpmboy*
> 
> doubtful that this one will... but that's par for a dual gpu card vs the 2-card gpus. The only dual gpu card i know of that will OC that far above stock clocks (in a reletive sense) is the 7990 with some softmods. The 6990 did very well also. This card may be able to do 1200 if you keep the memory clock down. I wouldn't call the 295x2 a bench champ... but as a gaming card it performs exceptionally well.
> 
> edit: realize that at 4K the 295x2 stays above 60 fps for most any game (but so will cfx 290s/290xs until you saturate the vram)


Most 290/290Xs need +100mV or more to hit 1200, so don't give up on this card.

Well, you are under water, after all; you could try upping the volts?


----------



## Cool Mike

Don't think anyone answered your questions.

Running two 295x2s, you need a 1500W PSU. For a 295X2 plus a 290X, the 1250W will be fine.


----------



## Jpmboy

Quote:


> Originally Posted by *cennis*
> 
> 1100 for 45 mw offset is not bad,
> 
> Most 290/x need 100mw or more to hit 1200 so dont give up on this card
> 
> well, you are under water after all you can try upping the volts?


Yeah, dual-GPU PCBs are a different beast. My (now sold) 290x did 1270, but wasn't very good on memory... only 1625. Believe it or not, it actually held 1st place on the OCN 3DMark 11 board for a short period







http://www.3dmark.com/3dm11/7574870
On this 295x2, voltage seems to be helping memory more than core right now.


----------



## cennis

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah, Dual gpu PCBs are a different beast. My (now sold) 290x did 1270, but not very good on memory... only 1625. believe it or not, it actually held 1st place on the OCN Mk11 board for a short period
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7574870
> On this 295x2, voltage seems to be helping memory more than core right now.


Have you tried seeing how much core you can get with 1500 or less on memory and +100mV? Or +200mV?

The memory speed doesn't affect performance nearly as much as the core.

Great score though. Wish I had a 5GHz hex core, haha.


----------



## ViRuS2k

Could I get a little help with my previous post, please?

Are two 360mm rads good enough to cool two 295x2s plus a 4770K i7 Haswell?
And if so, is a 1250W OCZ ZX PSU good enough?


----------



## Cool Mike

Yes, two 360mm rads are enough for two 295x2 cards and the CPU. You will need a quality 1500W PSU to power the beast.


----------



## alamox

PowerColor Devil 13 AMD Radeon R9 290 X2 8192MB GDDR5 PCI-Express Graphics Card @ £1,139.99 inc VAT

http://forums.overclockers.co.uk/showthread.php?t=18605206

4x 8-pins. I wonder how it will overclock when waterblocked.


----------



## Jpmboy

Quote:


> Originally Posted by *ViRuS2k*
> 
> could i get a little help with my previous thread please. ?????
> 
> are 2x 360mm rads good enough to cool 2x 295x2`s + 4770k I7 Haswell ?
> and if so is 1250w OCZ XZ psu good enough..


So I have one 295x2 and a 2700K in a loop with 2x360 rads and 6 fans (an aquacomputer 720XTMarkIII)... extended gaming @ 1080/1400 will run the fans at full speed with a max water temp of ~37C (ambient 23C); the card never goes above 53C with the Koolance waterblock. If you push a pair of 295x2s hard with only 2x360 rads, 12 fans may do the trick. That OCZ has >100A on the 12V rail (>1000 watts); I think you'd be cutting it close if your pumps, fans, etc. are on it also, which is not a good thing to do with your PSU. Using a Kill A Watt meter, one overclocked 295x2 is >600 watts (at the wall) on my rig, subtracting idle watts from total load watts.
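The at-the-wall method described here (load reading minus idle reading) can be turned into a rough DC figure. A minimal sketch, where the 0.90 efficiency is an assumed value for a high-efficiency unit at this load, and the 780/180 readings are made-up numbers that reproduce a 600 W wall delta:

```python
def card_dc_watts(wall_load_w, wall_idle_w, psu_efficiency=0.90):
    """Estimate the DC power a card adds, from two Kill A Watt readings.

    Subtracting the idle reading isolates the card's contribution, and
    multiplying by PSU efficiency converts wall (AC) watts into delivered
    (DC) watts. The 0.90 default is an assumption, not a datasheet value.
    """
    return (wall_load_w - wall_idle_w) * psu_efficiency

# Hypothetical readings giving the 600 W at-the-wall delta reported above:
print(card_dc_watts(wall_load_w=780, wall_idle_w=180))
```

So a 600 W wall delta corresponds to roughly 540 W actually delivered to the card, which is why an at-the-wall reading always overstates the DC load a bit.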


----------



## evoll88

I am wanting to do either Titan Black SLI or a 295x2 + 290X for 4K gaming. Any opinions on which would be more future-proof? Any problems with AMD drivers with the 295x2 + 290X? Also, would this PSU be enough for the 295x2 + 290X? Thanks for the help.


Spoiler: Warning: Spoiler!



http://www.amazon.com/EVGA-SuperNOVA-1300G2-ATX12V-120-G2-1300-XR/dp/B00COIZTZM/ref=sr_1_1?ie=UTF8&qid=1402585666&sr=8-1&keywords=EVGA+SuperNOVA+1300+G2


----------



## Roikyou

Quote:


> Originally Posted by *Jpmboy*
> 
> So i have one 295x2, a 2700K in a loop with 2x360 rads and 6 fans (an aquacomputer 720XTMarkIII) ... extended gaming @ 1080/1400 will run the fans at full speed with a max water temp of ~ 37C (ambient 23C) card never above 53C with the koolance waterblock . If you push a pair of 295x2s hard with only 2x360 rads... 12 fans may do the trick. That OCZ has >100A on the 12V rail (>1000watts), I think you be cutting it close if your pumps. fans .. etc are on it also which is not a good thing to do with your PSU. using a killawatt meter, one overclocked 295x2 is >600 Watts (at the wall) on my rig, subtracting idle watts from total load watts.


What were you running to get your load greater than 600 watts? Curious, as I tried a Kill A Watt while running Dark Souls 2 and maxed at 500 (maybe CrossFire didn't kick in the second GPU and it was running on one?). Currently running a Corsair AX1200i, 11 fans, two MCP35X pumps, the 295x2 of course, an Aquaero 6 Pro, an mSATA drive, a WD Raptor 1TB and a 500GB backup drive.

Personally, my thought on power supplies is that their greatest efficiency is around 50% load, and it starts to degrade (not by much) after 80%. So I would imagine pushing to 100% or greater would affect stability. I was sitting on the AX1500i, but decided I didn't want to spend another 1500 plus a water block for the second 295x2. I'll wait for the day when the next card comes out with gains worth upgrading for; by then cards should be more efficient with power, which means my 1200 is more than enough for now.
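The 50-80% rule of thumb in this post is easy to encode as a sanity check. A sketch (the band endpoints follow the post; actual efficiency curves vary per model, so treat this as illustrative):

```python
def in_efficiency_band(load_w, rated_w, low=0.50, high=0.80):
    """True when the PSU is loaded in the band where efficiency tends to peak.

    The 50-80% band is the rule of thumb from the post above; real curves
    differ per unit, so this is only a rough sanity check.
    """
    return low <= load_w / rated_w <= high

print(in_efficiency_band(600, 1200))    # 50% load: inside the band
print(in_efficiency_band(1150, 1200))   # ~96% load: outside the band
```

By this check, a 295x2 rig drawing ~600-900 W sits comfortably inside an AX1200i's sweet spot, while adding a second card would push it past the 80% mark.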


----------



## cennis

Does anyone know when the Aqua Computer kryographics Vesuvius for Radeon R9 295X2 (acrylic glass edition, nickel-plated version) block will be available?

Has it ever been available, or was it sold out?


----------



## cennis

Quote:


> Originally Posted by *GravemanSam*
> 
> i just sent him a pm on here. thanks


hey. did you get a reply?


----------



## Jpmboy

Quote:


> Originally Posted by *Roikyou*
> 
> What were you running to get your load greater than 600 watts? Curious as I tried a kill a watt while running dark souls 2 and maxed at 500 (maybe the crossfire didn't kick in the second gpu, running one maybe?). Currently running Corsair AX1200i, 11 fans, two mcp35x pumps, 295x2 of course, Aquaero 6 pro, msata, wd raptor 1k and 500 gb backup drive.
> 
> Personally, my thoughts with power supplies, they're greatest efficiency is around 50% and start to degrade (not by much) after 80%. So I would imagine pushing to 100% or greater would affect stability. I was sitting on the AX1500i but decided I didn't want to spend another 1500 plus water block for the second 295x2 but will wait for the day when the next card comes with with gains worth upgrading which I would image by then will be more efficient with power which means my 1200 is more than enough for now.


Peaked during the raymarch test in Catzilla. I also flashed the second BIOS with the Sapphire OC BIOS and use the voltage unlock in the MSI AB beta.

The 295x2 does surprisingly well in Sky Diver...


----------



## Roikyou

Quote:


> Originally Posted by *Jpmboy*
> 
> Peaked during the raymarch test in Catzilla. I also flashed the second BIOS with the Sapphire OC BIOS and use the voltage unlock in the MSI AB beta.
> 
> The 295x2 does surprisingly well in Sky Diver...


That makes sense, I have a feeling my setup is working as it should as everything is default bios and stock clocks.


----------



## Satchmo0016

Quote:


> Originally Posted by *Jpmboy*
> 
> Peaked during the raymarch test in Catzilla. I also flashed the second BIOS with the Sapphire OC BIOS and use the voltage unlock in the MSI AB beta.
> 
> The 295x2 does surprisingly well in Sky Diver...


I didn't realize that you could put the Sapphire OC BIOS on a standard card. Is there any reason at all to get the OC version, then? Do they have better-quality silicon or anything?


----------



## Cool Mike

My sapphire overclock is stable at 1100 core whereas my HIS maxed at 1080. My sample suggests better silicon.


----------



## Jpmboy

^ Erm, did you load the SOC BIOS on the HIS card? If not, you can't conclude anything about the quality of the silicon.

This PowerColor has ASIC values of 73.3% on the master and 79.3% on the slave.

I certainly hope the SOC has two binned GPUs on board, else a BIOS and a new fan curve ain't worth it IMO. The SOC BIOS appears to have better memory timings. But basically, I'm seeing the same as Mike: slightly higher clocks with stability. Hard to tell without loading it onto Cool Mike's first 295x2 to compare.
Quote:


> Originally Posted by *Satchmo0016*
> 
> I didn't realize that you could put the sapphire oc bios on a standard card. Is there any reason at all to get the oc version then? Are they better quality silicon or anything?


----------



## Elmy

I took some pics. Sorry they are cell phone quality.



Club3D 295X2's undergoing surgery 

The top one has all the black screws on the block replaced with Stainless steel screws. Just took this picture to show the difference between the 2.



Here they are installed in my rig 



Here is my setup at Intel's Infernal LAN last weekend. I won the CPU magazine mod contest while I was at the event. My computer will be on the front cover of a future issue of CPU magazine with an article written up about it.


----------



## joeh4384

That looks very nice!


----------



## cennis

Quote:


> Originally Posted by *Elmy*
> 
> I took some pics. Sorry they are cell phone quality.
> 
> 
> 
> Club3D 295X2's undergoing surgery
> 
> The top one has all the black screws on the block replaced with Stainless steel screws. Just took this picture to show the difference between the 2.
> 
> 
> 
> Here they are installed in my rig
> 
> 
> 
> Here is my setup at Intel's Infernal LAN last weekend. I won the CPU magazine mod contest while I was at the event. My computer will be on the front cover of a future issue of CPU magazine with a article written up about it.


Beauty


----------



## Jpmboy

Quote:


> Originally Posted by *Elmy*
> 
> I took some pics. Sorry they are cell phone quality.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Club3D 295X2's undergoing surgery
> 
> The top one has all the black screws on the block replaced with Stainless steel screws. Just took this picture to show the difference between the 2.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here they are installed in my rig
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here is my setup at Intel's Infernal LAN last weekend. I won the CPU magazine mod contest while I was at the event. My computer will be on the front cover of a future issue of CPU magazine with a article written up about it.


Very nice!! Will be looking for your rig on the cover. (next stop... 4K eyefinity!)

Do you have any temp data from those EK blocks?


----------



## Profiled

Quote:


> Originally Posted by *Elmy*
> 
> I took some pics. Sorry they are cell phone quality.
> 
> Club3D 295X2's undergoing surgery
> 
> The top one has all the black screws on the block replaced with Stainless steel screws. Just took this picture to show the difference between the 2.
> 
> Here they are installed in my rig
> 
> Here is my setup at Intel's Infernal LAN last weekend. I won the CPU magazine mod contest while I was at the event. My computer will be on the front cover of a future issue of CPU magazine with a article written up about it.


Lightboost?


----------



## Elmy

Quote:


> Originally Posted by *Jpmboy*
> 
> Very nice!! Will be looking for your rig on the cover. (next stop... 4K eyefinity!)
> 
> Do you have any temp data from those EK blocks?


I have an XSPC AX 480 that originally had a copper core and was nickel plated, so that of course affects temps...
That being said, temps were in the low 50s for one 295X2 and the low 60s when both Club3D 295X2s are running. I am guessing temps would drop 5C or more with a better radiator. These Club3D 295X2s are on their own independent loop. I did not monitor VRM temps. Right now the rig is being shipped up to Toronto for the ExtravaLANza event, so I can't do any more testing for a couple of weeks.

I won't do 4K Eyefinity until 4K 120Hz 1ms monitors are available. That's going to be a while, because DP 1.3 is needed for that to happen.

Also, even dual Club3D 295X2s aren't powerful enough to push 24 million pixels. I have a hard time pushing 10 million as it is. I love my setup, though; it's going to be a while before I upgrade monitors. My next thing is to de-matte them all.


----------



## Synthaxx

Quote:


> Originally Posted by *Elmy*
> 
> Also even dual Club3D 295X2's aren't powerful enough to push 24 million pixels.


So, are you saying a 295x2 quadfire can't push 3 4k monitors? What would the framerates be if someone tried this?


----------



## Elmy

Quote:


> Originally Posted by *Synthaxx*
> 
> So, are you saying a 295x2 quadfire can't push 3 4k monitors? What would the framerates be if someone tried this?


I don't know. But I know it would be a lot less than what I am getting. We are waiting on DS to get 2 more 4K monitors to do proper testing....


----------



## Jpmboy

Even with a single 4K monitor (60Hz, 1ms), one 295x2 manages well, but not as well as other two-card configurations I have (Kingpins and Titans). I think 4K surround will need at least 3x 295x2s. Quad Titans may work also.


----------



## Synthaxx

Quote:


> Originally Posted by *Jpmboy*
> 
> Quad titans may work also.


Well that seems correct, according to this post:
http://www.overclock.net/t/1481789/baashas-4k-surround-quad-gtx-titan-black-sc-benchmarks-thread

This guy gets some damn impressive numbers...
Would this be much different from 295x2 quadfire?


----------



## NavDigitalStorm

Sweet rig Elmy!


----------



## Jpmboy

Quote:


> Originally Posted by *Synthaxx*
> 
> Well that seems correct, according to this post:
> http://www.overclock.net/t/1481789/baashas-4k-surround-quad-gtx-titan-black-sc-benchmarks-thread
> 
> This guy gets some damn impressive numbers...
> Would this be much different from 295x2 quadfire?


Yup, Baasha's got it goin on. At 4K, in the same rig, two SLI (reference, but flashed and unlocked) Titans get nearly 2x the FPS of my 295x2 @ 1090/1450. However, power consumption by the Titans is huge! >500 watts each, while the 295 cruises along at ~450-500W in the same tests.


----------



## DeadlyDNA

4K Eyefinity is not that bad on R9 29x cards, depending on the game, its support, etc. 4k [email protected] is only achievable in older, optimized game engines. I don't see any current-gen GPU getting 120 FPS average on 4K surround/Eyefinity in current game engines. Maybe with major sacrifices to graphical settings you could? It's not too terrible to find 60 fps sweet spots, which could require graphical tweaks without losing much.

I don't have top-of-the-line Intel to test with, but I would be glad to give something specific a go to see where it's at.

Either way, displays are the biggest bottleneck right now for 4K Eyefinity/surround: 60Hz max for now, and in my case 30Hz. I have a feeling display technology will lag way behind PC hardware because, like the games market, 4K TVs don't have much advantage in the TV market, and just like consoles, evolution is on a slow path if not non-existent.


----------



## pompss

Guys, I have a problem.
My second GPU doesn't get more than 300 MHz when playing Crysis 3, and FPS is 32-45.
My PSU is a Seasonic SS-850 KM (850 watt); I also have a plug monitor where I can see how many watts my PC uses when gaming.
The max I see is 600 watts when playing, so it's not the PSU.
Any advice? And can someone do some testing with Crysis 3 and tell me if the second GPU behaves the same as the first, please?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> Guys, I have a problem.
> My second GPU doesn't get more than 300 MHz when playing Crysis 3, and FPS is 32-45.
> My PSU is a Seasonic SS-850 KM (850 watt); I also have a plug monitor where I can see how many watts my PC uses when gaming.
> The max I see is 600 watts when playing, so it's not the PSU.
> Any advice? And can someone do some testing with Crysis 3 and tell me if the second GPU behaves the same as the first, please?


Do you have a single or 2x 295X2? Try to enable CrossFire in CC


----------



## pompss

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Do you have a single or 2x 295X2? Try to enable CrossFire in CC


A single R9 295x2. CrossFire is enabled, but the second GPU is still stuck at 300MHz in Crysis 3 and maxes at 600MHz in Watch Dogs.
I don't know what the problem is, as only 600 watts are being used, so my PC doesn't even use all 850 watts from the PSU.
I think maybe the 8-pin cables can't deliver the 28A needed for the card to work.


----------



## lowgun

Quote:


> Originally Posted by *pompss*
> 
> A single R9 295x2. CrossFire is enabled, but the second GPU is still stuck at 300MHz in Crysis 3 and maxes at 600MHz in Watch Dogs.
> I don't know what the problem is, as only 600 watts are being used, so my PC doesn't even use all 850 watts from the PSU.


It may be caused by your VRMs overheating. That is what happens to mine. The GPUs themselves are cool, but since there is no fan control, the only way for the VRMs to cool down is for the card to slow to a crawl and wait.


----------



## Satchmo0016

You're in fullscreen, right? CrossFire gets disabled if you're windowed or when you alt-tab. Try Alt+Enter to make sure you're in fullscreen.


----------



## pompss

Quote:


> Originally Posted by *Satchmo0016*
> 
> You're in fullscreen, right? crossfire gets disabled or you alt tab. Try alt+enter to make sure you're in fullscreen.


Thanks, fullscreen mode was not enabled.


----------



## pompss

Quote:


> Originally Posted by *pompss*


ok now seems to work


----------



## axiumone

Out of curiosity: does anyone know if the VRMs on the 295x2 actually have temp sensors, and we just need to wait for support from MSI Afterburner or the like?


----------



## rdr09

Quote:


> Originally Posted by *Elmy*
> 
> I took some pics. Sorry they are cell phone quality.
> 
> 
> 
> Club3D 295X2's undergoing surgery
> 
> The top one has all the black screws on the block replaced with Stainless steel screws. Just took this picture to show the difference between the 2.
> 
> 
> 
> Here they are installed in my rig
> 
> 
> 
> Here is my setup at Intel's Infernal LAN last weekend. I won the CPU magazine mod contest while I was at the event. My computer will be on the front cover of a future issue of CPU magazine with a article written up about it.


it's a beauty. 2 ocn members in a row.


----------



## Cool Mike

Any word on when the Powercolor Devil 13 dual core (295x2) will be released?


----------



## kalijaga

Just took some pictures of my Sapphire and MSI 295X2s in my rig.
I still have a lot of issues with the MiniDP-to-DP adapter going to the UD590.

Hope I can join the party...

Cheers.

DSC_01081.JPG 390k .JPG file


----------



## DeadlyDNA

Quote:


> Originally Posted by *kalijaga*
> 
> Just took the pictures of my Sapphire and MSI 295x2 in my rig.
> Still have a lot of issues with regards to the miniDP to DP adapter to the UD590.
> 
> Hope I can join the party...
> 
> Cheers.
> 
> DSC_01081.JPG 390k .JPG file


What issues are you having? I know someone who just had a UD590 and took it back because of issues.


----------



## jpinks

This is the build I have, all but the video card, and this looked like a good place to ask about the R9 295X2: are there any real differences between brands, or are they all reference designs? I.e., have any of the board makers cut corners on memory or anything else I should be aware of?

CPU: Intel Core i7-4930K 3.4GHz 6-Core Processor ($578.99 @ Amazon)
CPU Cooler: Corsair H100i 77.0 CFM Liquid CPU Cooler ($99.98 @ OutletPC)
Motherboard: Asus Rampage IV Black Edition EATX LGA2011 Motherboard ($468.99 @ SuperBiiz)
Memory: G.Skill Ripjaws Z Series 16GB (4 x 4GB) DDR3-1866 Memory ($154.99 @ Newegg)
Storage: Samsung 840 EVO 250GB 2.5" Solid State Drive ($134.99 @ Best Buy)
Storage: Seagate Barracuda 3TB 3.5" 7200RPM Internal Hard Drive ($104.99 @ Newegg)
Video Card: Sapphire Radeon R9 295X2 8GB Video Card ($1507.57 @ Newegg)
Case: Corsair 760T Black ATX Full Tower Case ($159.99 @ Newegg)
Power Supply: EVGA SuperNOVA 1300 G2 1300W 80+ Gold Certified Fully-Modular ATX Power Supply ($167.04 @ Newegg)
Operating System: Microsoft Windows 7 Home Premium SP1 (OEM) (64-bit) ($89.98 @ OutletPC)


----------



## kalijaga

Hi DeadlyDNA,

My problems with the Sammy started when I began using the MSI 295X2. The latest Catalyst driver caused a crash, and then the screen went dark; I had to unplug the power to the screen before it would restart. After installing the Sapphire 295X2, I had to fall back to the 14.4 driver, and only then did CrossFire work.
Now, almost every time, 2-3 minutes after booting up I get a black screen for a few seconds, and then the screen comes back saying "Samsung monitor not found" and "DP link failure".
My feeling is that there is some driver issue between AMD and the DP input on the Sammy.
I have no other output connected to the card.
I am using the MiniDP-to-DP adapter that came with the Sapphire card and the DP-to-DP cable that came with the Sammy.
ULPS and frame pacing are off. No overclocking done.

I hope the next driver from AMD will help...

Cheers.


----------



## electro2u

Received an MSI 295X2 on Friday and have been pretty amazed at the scaling performance, coming from 2x 780 SLI. I've had nightmares getting CrossFire to work with FurMark or Final Fantasy XIV. I've reinstalled Windows 8.1 about seven times trying to narrow down what causes one of the GPUs to run at 300 MHz, even though it's at 100% utilization, after shutting the game down following the first login after a Windows reinstall. CrossFire works great the first time I load a game and then never again. Yes, I know to use fullscreen only with CrossFire; that's a definite disadvantage, but I'll get used to it.

I've had mild success by not installing Intel Rapid Storage or Samsung Magician: the game runs both GPUs at 1018 MHz even after shutting it down. But then I hit Alt+Tab to the desktop and the problem is back for good.

I've received a new UEFI BIOS from MSI that will let me correctly configure Win 8.1 in UEFI mode, but I'm not optimistic. Anyone have any suggestions?

FurMark doesn't seem to use my 295X2 in CF at all. FFXIV does, but then never again unless I stand on one foot. Unigine Valley runs beautifully at 1440p, but the card starts to lightly throttle at 70C after one round, and it kind of freaks me out. I'm sold on CrossFire-on-a-stick, but my card is running pretty sketchy even though I feel like my install is decent. In games, I guess it wouldn't be throttling? I use vsync 24/7, but I always try to hit 120fps. Should I send it in for an exchange?


----------



## PachAz

What's the point of the 295X2 anyway?


----------



## DeadlyDNA

Quote:


> Originally Posted by *kalijaga*
> 
> Hi DeadlyDNA,
> 
> my problems with the Sammy started when I started using the MSI 295x2. The latest Catalyst driver caused a crash, and then the screen went dark. I had to unplug the power to the screen and then it would restart. After installing the Sapphire 295x2, I had to fall back to the the 14.4 driver and only then CF works.
> Now, almost everytime after 2-3 minutes booting up I will get a black screen for a few seconds and then the screen came back saying "samsung monitor not found" and " DP link failure".
> My feeling is that there is some driver issues with AMD and the DP input on the Sammy.
> I have no other output connected to the card.
> I am using the MiniDP to DP adapter that came with Sapphire card and the DP to DP cable that came with the Sammy.
> ULPS and frame pacing off. NO overclocking done.
> 
> I hope the next driver from AMD would be of help...
> 
> Cheers.


Are you using the AMD pixel clock patch by chance? Can you skip the adapter, or do you need it? Also, if the monitor has HDMI/DVI inputs, perhaps you can test those, even if they're only 30Hz, to see if the issue persists. I would imagine a lot of the 295X2 users here have DP in use; maybe they can shed some light on whether they have had any issues.

Quote:


> Originally Posted by *PachAz*
> 
> What's the point of the 295X2 anyway?


Same as any dual-GPU card: quad SLI/CF in two PCIe slots for those who don't have enough slots. Dual-GPU cards also have more connections for things like large Eyefinity setups (5 monitors).
I'm sure I missed other reasons. Realistically, if someone says they just like it, that should be a good enough reason.


----------



## pompss

Quote:


> Originally Posted by *PachAz*
> 
> What's the point of the 295X2 anyway?


I have an XFX 295X2, but after seeing some benchmarks of two 290s in CrossFire, I think the 295X2 is not worth the extra money.
I found two brand-new Sapphire Tri-X 290s for $700 and two one-month-used reference R9 290s for $520.
$700 more for 2-5 fps is not worth it, in my opinion.
They draw about the same power, around 700-800 W.
So you're right, there is no point in getting the 295X2.
I'm planning to return it and get two R9 290s.


----------



## PachAz

I believe the same, though.


----------



## DeadlyDNA

Quote:


> Originally Posted by *PachAz*
> 
> I believe the same though.


Just like you did about AMD FX and multi-GPU, you know, when you harassed Red1776 about his quadfire build and crapped on his thread.

News flash: you were wrong on that:


Source:


Spoiler: Warning: Spoiler!



1xGPU

2xGPU

3xGPU

4xGPU




Just because you believe something isn't worth it doesn't mean it isn't useful to someone else. Dual-GPU cards have their place for very specific reasons. Look at Elmy's system pics: he has 5 monitors in Eyefinity. If you look at a 295X2, you will notice it has an abundance of DisplayPorts. Try doing that on your 290. Oh wait, you can't; you would have to get an MST DP hub. You only have 4 ports on your card.

If I sound annoyed, it's because I am. I see you around quite a bit putting stuff down, and no amount of explaining seems to help you much.


----------



## PachAz

I still don't see any point in getting the 295X2 for most users. Also, isn't a 295X2 too weak for 5 monitors anyway? What I mean is that you get similar or better performance getting two R9 290s, even used ones these days. It's always controversial getting these PCBs with two GPUs on them.

Edit: I saw your sy....never mind.


----------



## Roikyou

My two cents: I jumped on the 295X2 bandwagon when I went with the UD590. Yes, one monitor, as there was mention of Samsung working with AMD on this one for better frame rates and compatibility with single-stream transport vs. multi-stream transport; this is where AMD had the jump on Nvidia for a short time, until Nvidia worked out the driver issue. The 295X2 has better temps and a lower noise level (if you were going with stock fans and cooling) compared to 290Xs in crossfire. As mentioned: two GPUs, one slot, CrossFire when supported by software. Yes, it's new and kind of buggy at times, but so is everything when it first comes out. And yes, you're paying a premium for a single slot and lower noise and thermal levels.


----------



## DeadlyDNA

A 290 or 290X can't do this either.



I am sorry, but that is just pure awesome: one slot with all the power of two Hawaiis. Having crossfire, or even quad crossfire, in 1 or 2 slots opens up a whole new world of options:
case size, motherboard size, and slots available. Someone can run a nice internal sound card, RevoDrives, hell, even an Nvidia card for PhysX. A mini HTPC that serves as a Steam box as well?

You can't just look at the performance and make a judgement that fits all ranges of hardware. You're looking at noise, heat, power, and space. Would I buy a 295X2? Most likely not, unless I factored in all the changes I might have to make for 290 or 290X CF. If I had to buy a new case/mobo or a water cooling setup to accommodate 290s, that could easily make up the price gap and then some. I also would personally kill for 4 DP ports.


----------



## 4K-HERO

Hey, I had problems too, but they came from not doing enough research. After some effort, I realized that Windows 7 and 8.1 automatically set the resolution to 4K 60Hz as soon as the card is connected to the UD590. If you're having issues with the Mini DP to DP cable, you have to make sure it is a version 1.2 cable.

When I got the card a couple of months ago, it was extremely difficult to find that cable. If you buy a cable from any store and it's not labeled as version 1.2, it won't have enough bandwidth to handle a 4K resolution. I ended up finding a local PC repair shop with a supplier that brings in 1.2 Mini DP to DP cables made by Lenovo, and I only paid 25 bucks for one. All the shops on Amazon were selling these cables for $49.99 because they knew they were hard to find.

With a version 1.1 cable you'll get a permanent black screen with an occasional flicker. I originally thought my card was busted; I almost lost it.
Hope this helps someone.
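To put rough numbers behind the cable-version advice (illustrative arithmetic only; real requirements also include blanking overhead): over four lanes with 8b/10b encoding, DP 1.1 (HBR) carries about 8.64 Gbit/s of effective data, while DP 1.2 (HBR2) carries about 17.28 Gbit/s.

```python
# Why a DP 1.1 cable black-screens at 4K 60 Hz: the raw pixel stream
# alone exceeds DP 1.1's effective bandwidth (4 lanes, 8b/10b encoding).
# Blanking intervals push the real requirement even higher.

def pixel_stream_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Raw active-pixel data rate in Gbit/s (ignores blanking)."""
    return width * height * hz * bpp / 1e9

uhd60 = pixel_stream_gbps(3840, 2160, 60)
print(round(uhd60, 2))  # 11.94 -- above DP 1.1's ~8.64, within DP 1.2's ~17.28
```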


----------



## kalijaga

Hi 4K-HERO,

Thanks for the tips; I will surely check the cable version. It is the DP-to-DP cable that comes with the Sammy.

I am waiting for my MiniDP-to-DP cable to arrive from Amazon, and I hope the issues will be resolved.
The price of being an early adopter, I guess.

By the way, has anybody tried the Battlefield Hardline beta?
I am wondering because I am getting only 25-30fps in either D3D or Mantle at 4K max settings.
All 4 cores seemed to be running maxed.

Cheers.


----------



## axiumone

Quote:


> Originally Posted by *kalijaga*
> 
> Hi 4K-HERO,
> 
> thanks for the tips, I surely will check the cable version. It is the DP to DP cable that comes with the Sammy.
> 
> I am waiting for my miniDP to DP cable to arrive from Amazon, and I hope the issues will be resolved.
> The price for being an early adopter I guess.
> 
> By the way, anybody tried Battlefield Hardline beta?
> I am wondering because I am getting only 25-30fps either D3D or Mantle on 4k Max settings.
> All 4 cores seemed to be running max.
> 
> Cheers.


I'm in the beta; I chose the BF4 crossfire profile manually. I get around 90-140 fps with max available settings at 5400x1920.


----------



## crazygamer123

Hi,

Can you guys give me some help with this problem?

http://www.overclock.net/t/1496773/little-help#post_22439073

Thanks


----------



## DeadlyDNA

Quote:


> Originally Posted by *crazygamer123*
> 
> Hi,
> 
> Can you guys give me some helps for this problem?
> 
> http://www.overclock.net/t/1496773/little-help#post_22439073
> 
> Thanks


If you have MSI Afterburner, you can disable ULPS in the advanced menu. Hope that helps.


----------



## Zaxis01

So does this card really have 8GB of dedicated VRAM, or just 4GB?


----------



## kalijaga

Hi crazygamer123,

I did what was suggested in this thread: went into regedit, searched for 'EnableUlps', and changed the setting to 0.
Or use Afterburner. Now all GPUs should be detected and running ALL the time.
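For anyone who'd rather not hunt through regedit by hand, here's a .reg sketch of the same change. The subkey number `0000` is a placeholder: it varies per machine, with one key per GPU under the display class GUID, so search for `EnableUlps` and confirm your actual keys rather than importing this blindly.

```
Windows Registry Editor Version 5.00

; Placeholder subkey "0000" -- your system may use 0001, 0002, etc.,
; with one key per GPU. Search for "EnableUlps" and edit each match.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```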

Axiumone,
I must be CPU limited, I think; your BF Hardline fps is way, way higher than mine.
I think an upgrade may be in order....or wait until Haswell-E is available.....

Decisions...decisions....


----------



## crazygamer123

Quote:


> Originally Posted by *axiumone*
> 
> Im in the beta, chose the BF4 crossfire profile manually. I get around 90-140 fps with max available settings, at 5400x1920.


Quote:


> Originally Posted by *kalijaga*
> 
> Hi crazygamer123,
> 
> I did what was suggested in this thread, went into regedit and search for 'enableULPS', change setting to 0.
> Or use afterburner. Now all GPUs should be detected and running ALL the time.


Yes, it's ULPS; I wonder why they turn it on by default. It cripples the card's performance.

Axiumone, that's insane fps. Nice rig, also.


----------



## crazygamer123

Do you guys know what a safe overclock (core clock and memory clock) is for this card?


----------



## jnataros

Can I join the club?









MATX build: two R9 295X2s with a 4960X on a Rampage IV Gene.

Alex


----------



## Elmy

Quote:


> Originally Posted by *jnataros*
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MATX build, two R9 295x2's with a 4960X on a rampage IV Gene
> 
> Alex


Powerful little beast you got there. Nice work!


----------



## kalijaga

Quote:


> Originally Posted by *jnataros*
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MATX build, two R9 295x2's with a 4960X on a rampage IV Gene
> 
> Alex


Superb machine there. Congrats.

I have been trying to think of a chassis that can fit both 295X2 rads on top while maintaining space for a 240mm rad for the CPU.
This is an excellent idea.


----------



## DeadlyDNA

Haha, that's a hell of a box to take to a LAN party. Nice!


----------



## Jpmboy

Quote:


> Originally Posted by *jnataros*
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MATX build, two R9 295x2's with a 4960X on a rampage IV Gene
> 
> Alex
> 
> 
> Spoiler: Warning: Spoiler!


beautiful, compact quad-fire rig!!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *jnataros*
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MATX build, two R9 295x2's with a 4960X on a rampage IV Gene
> 
> Alex
> 
> 
> Spoiler: Warning: Spoiler!


Accepted!


----------



## pompss

I still have problems with my R9 295X2.
The fan stays off, and I need to give it a push for it to spin.
Also, with an old 24" TN monitor at full HD resolution, I get 30 fps in Watch Dogs.
I disabled ULPS and it still doesn't work properly.
Also, in Tomb Raider I have frame drops from 100 fps to 32 for 20 seconds, and in Crysis 3 from 65 fps to 30 fps; this happens every 30 seconds.
This is my second R9 295X2 that has given me problems.
Well, lesson learned: never again AMD cards.


----------



## axiumone

Sounds a little like a driver issue. Have you tried a clean driver install with a third party tool?


----------



## jnataros

Quote:


> Originally Posted by *pompss*
> 
> I still have problems with my R9 295X2.
> The fan stays off, and I need to give it a push for it to spin.
> Also, with an old 24" TN monitor at full HD resolution, I get 30 fps in Watch Dogs.
> I disabled ULPS and it still doesn't work properly.
> Also, in Tomb Raider I have frame drops from 100 fps to 32 for 20 seconds, and in Crysis 3 from 65 fps to 30 fps; this happens every 30 seconds.
> This is my second R9 295X2 that has given me problems.
> Well, lesson learned: never again AMD cards.


Sounds like you got a defective board. Return it for another.

Now don't go hating AMD for this. I own literally thousands of GPUs in arrays, and I can honestly say that failure rates between AMD and Nvidia GPUs are about equal. It's just luck of the draw with a defective item. No one is immune to it, not AMD and not Nvidia.

Alex


----------



## crazygamer123

Quote:


> Originally Posted by *pompss*
> 
> This is my second r9 295x2 that gives me problems.
> Well lesson learned never ever again AMD cards.


Hi, you are still lucky: my first Sapphire R9 295X2 failed after 15 minutes out of the box, but the replacement from Newegg runs just fine.

Here is mine: the card inside a HAF-X. I want to add one more 295X2, but there is no more room.







((


----------



## Roikyou

I came so close to a second 295X2, but after reading about the scaling with quadfire, it wasn't worth it; I decided to wait. Besides, 20nm will come some day, with new GPUs down the road...

My other reservation about a second card, besides scaling, is that the card I have now has a mild electric buzz under load. It's not too loud, but considering my whole computer is so quiet, it sticks out like a sore thumb.


----------



## pompss

I uninstalled the driver with DDU and reinstalled it.

I received this message after installing the drivers:

"Installation complete (warnings occurred during installation). View log file for details."

Here is the log file content:

-UI
06/21/14 11:30:16
install

AMD Radeon R9 200 Series
Advanced Micro Devices, Inc.

0x67b9
0x1002
0x0b2a
0x1002
0x030000
0x00

AMD Radeon R9 200 Series
Advanced Micro Devices, Inc.

0xaac8
0x1002
0xaac8
0x1002
0x040300
0x00

AMD Radeon R9 200 Series
Advanced Micro Devices, Inc.

0x67b9
0x1002
0x1b2a
0x1002
0x038000
0x00

AMD Catalyst Install Manager
Succeed
8.0.916.0
20

Microsoft Visual C++ 2012 Redistributable (x86)
Succeed
11.0.50727
9

AMD Display Driver
Succeed
14.200.0.0000
90

HDMI Audio Driver
Succeed
9.0.0.9905
1

Microsoft Visual C++ 2012 Redistributable (x64)
Succeed
11.0.50727
9

AMD Accelerated Video Transcoding
Succeed
13.30.100.40522
3

AMD Catalyst Control Center
Succeed
2014.0522.2157.37579
150

ACP Application
Succeed
1.00.0000
1

AMD Gaming Evolved App
Succeed
1.10.000
49

Hardware information
Existing packages
Packages for install
Packages for uninstall
Other detected devices
Error messages
Name
Manufacturer
Chip type
Device ID
Other hardware
Download packages
Success
Fail
Vendor ID
Class Code
Revision ID
Subsystem ID
Subsystem vendor ID
Catalyst™ Install Manager
Installation Report
Final Status:
Version of Item:
Size:
Mbytes


----------



## axiumone

That seems fine. I get that message once in a while too. It looks like everything installed without issues.


----------



## pompss

Coming from a GTX 780 Ti Kingpin, which I had problems with too. EVGA customer service was not so good, but at least you have a phone number to call.
They delivered a brand-new Kingpin after 1 week.
With XFX, after two days I haven't even gotten an email or an answer about the R9 295X2, and there is no phone number to call.
How is it possible to sell $1500 video cards with no phone number to call?
I waited 3 months for a refund on my MSI R9 290X because they didn't have it in stock; this happened 6 months ago.
As I said, AMD works with companies that have really bad customer support, and the defect rates are triple those of Nvidia cards.
I have bought AMD video cards for 20 years now, and I have come to the conclusion that it is better to go with Nvidia.
I'm waiting now for the GTX 780 Ti Kingpin 6GB and will return or sell the R9 295X2.


----------



## Roikyou

I wish I could remember where I found the number for XFX. I might have created a support ticket first, but I called just to confirm I wasn't going to void my warranty by putting on a custom water block; they responded that they expected North American customers to use custom waterblocks. I had crazy faith and just went ahead, put on the block, and set it up in my custom loops. Good thing: no issues, other than the card having a mild buzz under load.


----------



## ImperialOne

Looking into doing a trifire setup with a 295X2 and a 290X on an Asus Z97 Sabertooth. Those two cards will occupy the two PCIe 3.0 x16 slots, right? So, fundamentally running x8/x8? That really shouldn't make a noticeable difference speed-wise, I would think... Am I off base?
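For a rough sense of what x8 vs. x16 means in raw bandwidth (simple arithmetic from the PCIe 3.0 signaling rate, not a benchmark; reviews of the era generally found x8 cost cards like these only a few percent at most):

```python
# Approximate one-direction PCIe 3.0 link bandwidth.
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding.

def pcie3_gb_per_s(lanes: int) -> float:
    """Effective bandwidth in GB/s for a PCIe 3.0 link of `lanes` lanes."""
    per_lane_gbit = 8.0 * 128 / 130   # effective Gbit/s after encoding
    return lanes * per_lane_gbit / 8  # bits -> bytes

print(round(pcie3_gb_per_s(8), 2))   # 7.88  GB/s at x8
print(round(pcie3_gb_per_s(16), 2))  # 15.75 GB/s at x16
```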


----------



## hotrod717

Quote:


> Originally Posted by *pompss*
> 
> Coming from a GTX 780 Ti Kingpin, which I had problems with too. EVGA customer service was not so good, but at least you have a phone number to call.
> They delivered a brand-new Kingpin after 1 week.
> With XFX, after two days I haven't even gotten an email or an answer about the R9 295X2, and there is no phone number to call.
> How is it possible to sell $1500 video cards with no phone number to call?
> I waited 3 months for a refund on my MSI R9 290X because they didn't have it in stock; this happened 6 months ago.
> As I said, AMD works with companies that have really bad customer support, and the defect rates are triple those of Nvidia cards.
> I have bought AMD video cards for 20 years now, and I have come to the conclusion that it is better to go with Nvidia.
> I'm waiting now for the GTX 780 Ti Kingpin 6GB and will return or sell the R9 295X2.


With all those recently failed cards, I'd be looking at something else in your system. Three bad cards in a row seems like very, very long odds.


----------



## pompss

Quote:


> Originally Posted by *hotrod717*
> 
> With all those recent failed cards, I'd be looking at something else in your system. 3 bad cards in a row seems to be very,very high odds.


The second Kingpin was working perfectly, so I really doubt it's my system.


----------



## pompss

Can someone please tell me the average fps you guys get when playing Watch Dogs at 1440p, and maybe at 4K?


----------



## crazygamer123

Hi,

I am planning to get a second one; which brand should I go for? Sapphire, XFX, or Asus...?


----------



## electro2u

Does anybody have their stock 295X2 radiator set up in push/pull with both fans plugged into the card via a DC fan splitter? It seems to be working fine for me...


----------



## pompss

Quote:


> Originally Posted by *crazygamer123*
> 
> hi.
> 
> i am planning to get a second one, which brand should i go for? sapphire, XFX or Asus...?


Sapphire and XFX suck at customer care, in my personal experience. Go with Asus.


----------



## Jamble

I've owned a 295X2 for about 3 weeks now. It's great; I came from crossfired 7970s.

I live in Australia, and room temp is usually above 30C. The card throttles before a single run through Fire Strike finishes.
It's in an NZXT Phantom 820 with plenty of airflow. Has anyone seen gains using a push/pull setup on the radiator?

If so, which fans do you recommend?

If I downclock to 950 MHz, I can run 100% load all day,
or if my aircon is on I never get over 67 degrees. So the setup seems okay; it's just that the ambient temp is letting me down.


----------



## jpinks

Might I please join this august company?









https://www.dropbox.com/s/4uvynwn37adb1lq/2014-06-21%2020.32.11.jpg

Is anyone else using the Asus Rampage IV Black Edition? It keeps forgetting my mouse on restart; I have to unplug it and plug it back in each time for the mouse to work..... Any ideas on that?

Here is the down-and-dirty before I get to work making it look pretty:

https://www.dropbox.com/s/eb6ndt9z1ttgjww/2014-06-21%2023.24.05.jpg


----------



## fireedo

Quote:


> Originally Posted by *Jamble*
> 
> Owned a 295x2 for about 3 weeks now. Its great, came from xfire 7970's.
> 
> I live in australia and room temp is usually above 30Celsius. It throttles before a single run through firestrike.
> Its in nzxt 820 phantom with plenty of airflow, has anyone had any gains using a push/pull setup on the radiator?
> 
> If so which fans do you recommend using?
> 
> If i down clock to 950mhz i can run 100% load all day.
> or if my aircon is on i never get over 67 degrees. so setup seems to be okay, just that ambient temp is letting me down.


Same here; ambient temp can go crazy, around 32-34C at noon without aircon,
but for just benching it never goes above 67-68C.
Gaming is another story: when I play Metro: Last Light it throttles because the temp reaches 74C. But when gaming with the aircon on (25-26C) I can play just fine, with temps around 68-69C max.

I'm thinking of getting custom water cooling from EKWB.

So, can a custom water loop really make a big difference with ambients here at 32-33C (without using the aircon)?


----------



## Jamble

I put an extra fan that I had lying around on the radiator; just that made a decent difference.
I think two good radiator fans in push/pull would sort it out.


----------



## PachAz

Even my stock R9 290 throttled, and I live in Sweden, where it's not that hot. So yes, a waterblock for the 290/295X2 is worth it by far. It's possibly one of the very few GPUs where a waterblock is really justified in terms of performance and noise level.


----------



## kalijaga

Quote:


> Originally Posted by *crazygamer123*
> 
> Hi, you are still lucky. My first Sapphire R9 295x2 failed after 15 minutes out of the box. But the replacement from Newegg was running just fine.
> 
> Here is mine : the card inside a Haf-X. Want to add one more 295x2. But there is no more room
> 
> 
> 
> 
> 
> 
> 
> ((


Hi crazygamer123,

Great stuff, mate!

I never thought anyone else here was using a HAF X as well, as I had to ghetto-mod the case to fit the 2 cards. One card uses the exit port at the back, with a JetFlo 120 attached for push/pull. The crazy mod I had to do was to mount the second radiator toward the front, exhausting forward, attached to another JetFlo 120.
Now the room heater is complete, exhausting hot air from the front, back, and top (CPU CLC)...
Intake is only from the 200mm side window, the bottom (one 140mm pulling), the 200mm front, and the open side behind the motherboard (due to the second PSU running).

Honestly, I am thinking of moving to a HAF Stacker to accommodate the 2 PSUs, but I am still planning the position of the rads ...and the dosh needed ..hahahah

Cheers.


----------



## crazygamer123

Quote:


> Originally Posted by *kalijaga*
> 
> I never thought anyone else here was using a HAF X as well, as I had to ghetto-mod the case to fit the 2 cards. One card uses the exit port at the back, with a JetFlo 120 attached for push/pull. The crazy mod I had to do was to mount the second radiator toward the front, exhausting forward, attached to another JetFlo 120.
> Now the room heater is complete, exhausting hot air from the front, back, and top (CPU CLC)...
> Intake is only from the 200mm side window, the bottom (one 140mm pulling), the 200mm front, and the open side behind the motherboard (due to the second PSU running).
> 
> Honestly, I am thinking of moving to a HAF Stacker to accommodate the 2 PSUs, but I am still planning the position of the rads ...and the dosh needed ..hahahah
> 
> Cheers.


Thanks. You too have good airflow in your system. I might consider changing to a super-tower case; plenty of room to upgrade in the future as well.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *crazygamer123*
> 
> Hi, you are still lucky. My first Sapphire R9 295x2 failed after 15 minutes out of the box. But the replacement from Newegg was running just fine.
> 
> Here is mine : the card inside a Haf-X. Want to add one more 295x2. But there is no more room
> 
> 
> 
> 
> 
> 
> 
> ((


Accepted!


----------



## jpinks

I have a question regarding Watch Dogs..... Um, I would expect this card to annihilate this game, but I am getting crap FPS, and wow, the supposedly high-end graphics are very questionable!! Is it just drivers, or what?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *jpinks*
> 
> I have a question regarding Watchdogs..... UHM I would expect this card to annihilate this game but I am getting CRAP FPS and WOW the supposed high graphics are very questionable!! Is it just drivers or what??


Watch_Dogs is a poorly optimized game in general.


----------



## ImperialOne

295x2 Owners: Has anyone here had problems with viewing @60Hz using DP1.2 on the new Asus PB287 4K screen?


----------



## crazygamer123

Quote:


> Originally Posted by *ImperialOne*
> 
> 295x2 Owners: Has anyone here had problems with viewing @60Hz using DP1.2 on the new Asus PB287 4K screen?


I am using an Asus PQ321Q (60Hz, DP 1.2). Sometimes the monitor displays only half of the screen; the other half is black. The DP-to-Mini-DP adapter from Sapphire might be causing the problem.


----------



## ImperialOne

Quote:


> Originally Posted by *crazygamer123*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ImperialOne*
> 
> 295x2 Owners: Has anyone here had problems with viewing @60Hz using DP1.2 on the new Asus PB287 4K screen?
> 
> 
> 
> I am using asus pq321q (60hz DP 1.2). Sometime the monitor displays only half of the screen, the other half is black. The DP to mini DP adapter from Sapphire might cause the problem.

Don't you have a full DP to (full) DP connection option with the Asus PQ321?


----------



## crazygamer123

Quote:


> Originally Posted by *ImperialOne*
> 
> Don't you have a full DP to (full) DP connection option with the Asus PQ231?


Yes, I do. But the R9 295X2 only has Mini DisplayPort outputs.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *crazygamer123*
> 
> I am using asus pq321q (60hz DP 1.2). Sometime the monitor displays only half of the screen, the other half is black. The DP to mini DP adapter from Sapphire might cause the problem.


That monitor has that issue with all GPUs. I've had that happen on NVIDIA solutions as well.


----------



## DeadlyDNA

Are any of you guys having issues with overclocking your cards where DisplayPort blanks out like it's losing sync a lot?


----------



## axiumone

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Are any of you guys having issues with overclocking your cards where DisplayPort blanks out like it's losing sync a lot?


Nope; among my multitude of issues, that hasn't been one of them across all 5 of my screens. Are you using any adapters?


----------



## DeadlyDNA

Quote:


> Originally Posted by *axiumone*
> 
> Nope, among my multitude of issues, that hasnt been one of them across all of my 5 screens. Using any adapters?


In my case, yeah, I was using an adapter, and it was on a 290X, not a 295X2, but I wanted to ask here because I knew you guys were using DisplayPort.


----------



## papi4baby

So I am still hearing a weird water-leak sound after pushing the card hard. Not OCed, just after playing. It goes away and then returns.

I took the rad out to try to mount a push/pull config, and when I shook it I could hear water sloshing inside! Is this normal? I thought these systems were designed to be pretty much pressurized. The card does reach 74C and throttles a bit, with a room temperature of 76.

The case is a HAF-X with a Seidon 240 on top cooling the CPU. Everything else is standard.

Also, has anyone gotten AMD Game DVR to work with the 295X2?

Thanks.


----------



## axiumone

Quote:


> Originally Posted by *DeadlyDNA*
> 
> in my case yeah i was using an adapter, also on a 290x not a 295, but i wanted to ask here because i knew you guys were using Display ports.


So you're not using an adapter on the 295X2, right? Does it only happen when overclocking? Are you using the pixel clock patch by any chance?
Quote:


> Originally Posted by *papi4baby*
> 
> So i am still having a weird (water leak sounds) after pushing the card hard. Not OC just after playing. It goes away and it returns.
> 
> I took the rad out to try and mount a push pull config and i shook it and i can hear water sloshing inside! Is this normal? I thought these system are design to be pretty much pressurized. Card does reach 74 and it throtles a bit, with room temperature of 76.
> 
> Case is HAF-X with a Seidon 240 on top cooling the CPU. Everything else standard.
> 
> Also anyone get AMD Game DVR to work with the 295X2?
> 
> Thanks.


I can hear water sloshing around occasionally in my cards as well, so I'd say that's normal. If your temps seem good, I wouldn't worry about it.


----------



## DeadlyDNA

Quote:


> Originally Posted by *axiumone*
> 
> So not using an adapter on the 295x2, right? Is it only when overclocking? Are you using the pixel clock patch by any chance?
> I can hear water sloshing around occasionally in my cards as well, so I'd say that's normal. If your temps seem good, I wouldn't worry about it.


I don't have the pixel patch on right now. It was when I was overclocking a 290X: once I hit around 1200MHz the screen kept blanking out and coming back, and it was worse in 3D benches. Also, I was at 1080p 60Hz, if that matters. I've never seen it do that before unless I WAS using the pixel patch.

Also meant to say I swapped over to DVI/HDMI, and it works fine when overclocked on those ports.


----------



## kalijaga

Maybe, just maybe, the miniDP port can't stand the heat (and power) of the board, causing a short across the traces of the miniDP output.
That still doesn't explain why my UD590 blacked out and came back after a few seconds while doing nothing, 4-5 minutes after booting up.

Any ideas guys?


----------



## Zaxis01

What overclock is everyone getting and what manufacturer?


----------



## cennis

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Are any of you guys having issues with overclocking your video cards and displayport blanking out like its losing synch alot?


Yeah, that happened on my Samsung 4K 28" with an R9 290 when the voltage was pushed up.


----------



## PachAz

WoW is coded to use one core on the CPU, so how would you expect not to get bottlenecked with one 290X, let alone two? As I said, most MMOs are like that: any CPU will be the bottleneck as it is, and the difference between one high-end card and two in these games will be zero. There is simply no CPU out there with the insane single-core performance needed to run these MMOs at full settings. Watch Dogs does take advantage of multi-core CPUs, but I think you will be CPU-limited either way, even with a 4930K if I'm not wrong. I'm not sure that game utilizes six cores at all.


----------



## batman900

Quote:


> Originally Posted by *PachAz*
> 
> WoW is coded to use one core on the CPU, so how would you expect not to get bottlenecked with one 290X, let alone two? As I said, most MMOs are like that: any CPU will be the bottleneck as it is, and the difference between one high-end card and two in these games will be zero. There is simply no CPU out there with the insane single-core performance needed to run these MMOs at full settings. Watch Dogs does take advantage of multi-core CPUs, but I think you will be CPU-limited either way, even with a 4930K if I'm not wrong. I'm not sure that game utilizes six cores at all.


Actually it loads up 4 cores but only pushes one to 100%. Also, while running a single 290X with max details on a 120hz monitor, my 290X actually becomes the bottleneck. My CPU cores only load up to 75% max while my gfx card is at 100%. Adding in a second 290X drops their usage to about 70% while making the CPU hit 100% on a single core with about 60-80% on the others. Thus the CPU is now the bottleneck.

This was not the case with my 780 Ti. It ran at about 80% while maxing the CPU. WoW just loves Nvidia though.

Edit: If you aren't trying to push 120fps then the above doesn't matter. At 60hz single card is indeed more than enough.


----------



## PachAz

Well, okay. But the game is still single-thread coded; it's just that the other features are loaded onto the other cores. This is common with many games. So no, not many games are multithreaded the way BF4 is, where the whole game takes advantage of as many cores as the CPU has. World of Tanks has the same setup: the game is single-threaded but the load bounces across all four cores, not all at the same time I believe, something like core switching, where the load moves to whichever core is least used. But that doesn't matter, since at any given moment, running full details, cores 1-4 get maxed out, and that gives lower GPU usage and lower fps.

As far as I know the engine for WoW is coded to use two cores, though it can spread load across all eight available threads. I don't know if WoW can use eight threads/cores at the same time the way BF4 does; I really doubt it since the engine is so old. I don't think WoW is a properly multithreaded application, but maybe someone with programming knowledge can comment on this.


----------



## pompss

I have a lot of problems with the R9 295X2 in Crysis 3 and Watch Dogs. Frame drops from 60fps to 30 every 20 seconds in Crysis 3 at 1440p. In Watch Dogs I should be getting 50-60fps; instead I get 24-30fps at 1440p.
I changed my PSU from 850W to 1250W and I still have the same damn issues. This is my second R9 295X2 and I don't know what is wrong.
I asked XFX for an RMA and they don't even pay the shipping, and they also told me it can take up to 4 weeks. What???? 4 weeks after spending 1500 dollars???
I love my PC, but after spending so much money I'm really pissed off at how software houses don't optimize games like Watch Dogs.
At this point I'm really considering just keeping one used GTX 780 Ti and buying a PS4.
It doesn't make any sense to spend so much money just to get more trouble and issues.


----------



## PachAz

Look at your CPU usage and GPU usage, and maybe we can tell if you are getting bottlenecked. Also run the Valley and Fire Strike benchmarks and watch your GPU usage. If your GPU usage sits well below 100% most of the time in games, you are probably being CPU-limited. Also, not all games scale very well with CF.

I warned many people about going CrossFire or SLI with high-end cards: most games don't perform very well with those setups outside of BF4 and other properly multithreaded, CF/SLI-supported games. Watch Dogs is a joke and Crysis 3 can also be questioned. For smooth gameplay and fewer issues, a single card will always be better.
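To put numbers on that advice, log per-second CPU and GPU load while playing (GPU-Z and Afterburner can both export sensor logs) and compare the averages. A minimal triage sketch, assuming you already have the samples as (cpu%, gpu%) pairs; the 90/70 thresholds are rules of thumb for this illustration, not anything official:

```python
# Rough bottleneck triage from usage samples. A GPU pinned near 100% while
# the CPU has headroom suggests GPU-bound; a starved GPU with a busy CPU
# suggests CPU-bound; neither saturated points at the game/driver itself.
def classify_bottleneck(samples, hi=90.0, lo=70.0):
    """samples: list of (cpu_pct, gpu_pct) taken while gaming."""
    if not samples:
        return "no data"
    avg_cpu = sum(c for c, _ in samples) / len(samples)
    avg_gpu = sum(g for _, g in samples) / len(samples)
    if avg_gpu >= hi and avg_cpu < hi:
        return "gpu-bound"
    if avg_gpu < lo and avg_cpu >= hi:
        return "cpu-bound"
    if avg_gpu < lo and avg_cpu < lo:
        return "engine/driver-limited (neither is saturated)"
    return "mixed"

print(classify_bottleneck([(55, 98), (60, 97), (52, 99)]))  # gpu-bound
```

Averaging over a whole session hides short stutters, so for drop-outs like the ones described later in the thread you'd look at individual samples rather than the mean.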


----------



## lowgun

Quote:


> Originally Posted by *pompss*
> 
> I have a lot of problems with the R9 295X2 in Crysis 3 and Watch Dogs. Frame drops from 60fps to 30 every 20 seconds in Crysis 3 at 1440p. In Watch Dogs I should be getting 50-60fps; instead I get 24-30fps at 1440p.
> I changed my PSU from 850W to 1250W and I still have the same damn issues. This is my second R9 295X2 and I don't know what is wrong.
> I asked XFX for an RMA and they don't even pay the shipping, and they also told me it can take up to 4 weeks. What???? 4 weeks after spending 1500 dollars???
> I love my PC, but after spending so much money I'm really pissed off at how software houses don't optimize games like Watch Dogs.
> At this point I'm really considering just keeping one used GTX 780 Ti and buying a PS4.
> It doesn't make any sense to spend so much money just to get more trouble and issues.


Check your core clocks when it happens. If they are dropping down to 300MHz, that is your VRMs overheating.
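If you'd rather not stare at an overlay, you can count those events from the same kind of sensor log. A hypothetical sketch over (core_clock_MHz, gpu_load%) samples; ~300MHz is the idle/throttle clock these cards fall to, and the exact thresholds here are illustrative:

```python
def throttle_events(samples, clock_floor=350, min_load=50):
    """Count samples where the core clock collapsed toward ~300MHz while
    the GPU was still under load -- the classic throttle signature.
    A clock that low at near-zero load is just normal idling."""
    return sum(1 for clock, load in samples
               if clock <= clock_floor and load >= min_load)

log = [(1018, 99), (300, 95), (1018, 98), (300, 92), (300, 5)]
print(throttle_events(log))  # 2 -- the (300, 5) sample is normal idle
```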


----------



## DeadlyDNA

Quote:


> Originally Posted by *PachAz*
> 
> Look at your CPU usage and GPU usage, and maybe we can tell if you are getting bottlenecked. Also run the Valley and Fire Strike benchmarks and watch your GPU usage. If your GPU usage sits well below 100% most of the time in games, you are probably being CPU-limited. Also, not all games scale very well with CF.
> 
> I warned many people about going CrossFire or SLI with high-end cards: most games don't perform very well with those setups outside of BF4 and other properly multithreaded, CF/SLI-supported games. Watch Dogs is a joke and Crysis 3 can also be questioned. For smooth gameplay and fewer issues, a single card will always be better.


He is saying just buy a game console. He also knows WoW only runs on one core.

On a side note, it sucks to hear your 295X2 is giving you issues.


----------



## PachAz

Overheating? If a stock card is overheating then that is a serious problem. IMO, all manufacturers should let you mount a waterblock and still keep the warranty. In my book, a throttling stock card is not working as intended. If I were you, I would slap a waterblock on the GPU; as I said, the R9 290X cards really benefit from going under water. Don't forget the goggles, tihihi.


----------



## pompss

Quote:


> Originally Posted by *PachAz*
> 
> Look at your CPU usage and GPU usage, and maybe we can tell if you are getting bottlenecked. Also run the Valley and Fire Strike benchmarks and watch your GPU usage. If your GPU usage sits well below 100% most of the time in games, you are probably being CPU-limited. Also, not all games scale very well with CF.
> 
> I warned many people about going CrossFire or SLI with high-end cards: most games don't perform very well with those setups outside of BF4 and other properly multithreaded, CF/SLI-supported games. Watch Dogs is a joke and Crysis 3 can also be questioned. For smooth gameplay and fewer issues, a single card will always be better.


I have an Intel i7-3820 overclocked at 4.6GHz.
I don't think I have a bottleneck problem with this CPU.
My 3DMark Fire Strike score is 16000, and 8800 in Extreme.
The scores are good.

It's in games that I have problems:
I get terrible fps.


----------



## DeadlyDNA

This post is a suggestion
Quote:


> Originally Posted by *lowgun*
> 
> Check your core clocks when it happens. If they are dropping down to 300MHz, that is your VRMs overheating.


This post is an assumption
Quote:


> Originally Posted by *PachAz*
> 
> Overheating? If a stock card is overheating then that is a serious problem. IMO, all manufacturers should let you mount a waterblock and still keep the warranty. In my book, a throttling stock card is not working as intended. If I were you, I would slap a waterblock on the GPU; as I said, the R9 290X cards really benefit from going under water. Don't forget the goggles, tihihi.


Have you ever actually looked at a 295X2?


----------



## lowgun

On a side note, if anyone was looking to get a second 295x2, let me know. I am selling mine off.


----------



## PachAz

@ pomps

Yes, that is why you should monitor your GPU usage and CPU usage in games. If you get good performance in benchmarks but not in games, you have a big limitation, either hardware or software.
Your CPU is basically a 3770K, which is similar to a 3570K in terms of performance, so it can very well bottleneck your GPUs. It is a quad core, and quad cores do bottleneck pretty much any dual-GPU setup in games like Crysis and Watch Dogs that are supposed to use eight cores.

@ DeadlyDNA
I am aware the 295X2 is supposed to be "water cooled", but you can't compare a low-grade generic solution with a proper copper block from EK, a thick 240mm radiator with two fans, and a real pump. As I said, if the card throttles due to high temperatures it isn't working properly, and we all know most stock R9 290/290X cards do throttle on the stock cooler at stock fan speed. Basically you are cooling two GPUs, which would need two 240mm radiators to be on the safe side. The 295X2 is sold with a quarter of the radiator space needed to cool it; a 120mm radiator is by no stretch of the imagination enough, and I'd even say a thin 240mm is not enough for a stock R9 290!


----------



## axiumone

Does anyone know how thick the thermal pads on the VRM are? I may be interested in replacing mine with some fujipoly.


----------



## pompss

Quote:


> Originally Posted by *PachAz*
> 
> @ pomps
> 
> Yes, that is why you should monitor your GPU usage and CPU usage in games. If you get good performance in benchmarks but not in games, you have a big limitation, either hardware or software.
> Your CPU is basically a 3770K, which is similar to a 3570K in terms of performance, so it can very well bottleneck your GPUs. It is a quad core, and quad cores do bottleneck pretty much any dual-GPU setup in games like Crysis and Watch Dogs that are supposed to use eight cores.
> 
> @ DeadlyDNA
> I am aware the 295X2 is supposed to be "water cooled", but you can't compare a low-grade generic solution with a proper copper block from EK, a thick 240mm radiator with two fans, and a real pump. As I said, if the card throttles due to high temperatures it isn't working properly, and we all know most stock R9 290/290X cards do throttle on the stock cooler at stock fan speed. Basically you are cooling two GPUs, which would need two 240mm radiators to be on the safe side. The 295X2 is sold with a quarter of the radiator space needed to cool it; a 120mm radiator is by no stretch of the imagination enough, and I'd even say a thin 240mm is not enough for a stock R9 290!


So in case I have this bottleneck, I should get a six-core CPU? As in, any quad core won't be enough, right?


----------



## PachAz

Nope. That would only help you in a few games. Generally speaking, we are limited by the games themselves more than anything these days. The reason people get the Intel six-cores for gaming is that they want the best in the consumer market and their budget allows for an enthusiast CPU. But, as with everything in this hobby, the higher the premium, the lower the performance per dollar.

Think of it like this: many games that use 1-2 cores and are badly optimized will perform just as well on a 3570K + one R9 290X as on a 4930K + two R9 290Xs. So no, more components won't necessarily give you better fps; it's just that people who spend a lot of money are afraid to admit it. That said, if you play at insane resolutions or on multiple screens, you do benefit from multiple GPUs, though that doesn't mean the CPU gets less stressed. You play at 1440p, so I'd say you do benefit from two GPUs; I just don't know if you are getting CPU-limited or not. Check your CPU usage and GPU usage and maybe we can tell. Sometimes neither the CPU nor the GPU usage is high for some reason, and that is mostly the "funny" nature of the game; there's not much one can do.


----------



## SaLX

Quote:


> Originally Posted by *PachAz*
> 
> @ pomps
> Your cpu is basicly a 3770k which is similar to a 3570k in terms of performance, so it can very well bottleneck your gpus. It is a quad core. You know quad cores do bottleneck pretty much any dual gpu setups like these in games like crysis and watch dogs that suppose to use 8 cores?


bit-tech.net seemed quite happy testing this card with a [email protected] GHz. Perhaps they did that because of its popularity and didn't want to appear elitist; still, with such an eyeball-popping card I'd have liked to see a comparison using the latest i7s.

Are you completely sure you'd need the very latest i7 to get a noticeable performance increase out of the 295X2, or are we just talking very small, marginal increments here?


----------



## pompss

I want to check the CPU and GPU usage later tonight.
Then I want to do some tests with a brand new Windows 7 installation and see what happens.


----------



## PachAz

Quote:


> Originally Posted by *SaLX*
> 
> bit-tech.net seemed quite happy testing this card with a [email protected] GHz. Perhaps they did that because of its popularity and didn't want to appear elitist; still, with such an eyeball-popping card I'd have liked to see a comparison using the latest i7s.
> 
> Are you completely sure you'd need the very latest i7 to get a noticeable performance increase out of the 295X2, or are we just talking very small, marginal increments here?


We are talking very small performance gains of course even at high resolutions.


----------



## Roikyou

Quote:


> Originally Posted by *axiumone*
> 
> Does anyone know how thick the thermal pads on the VRM are? I may be interested in replacing mine with some fujipoly.




Here's the original discussion about the T-pads that I had with Jpmboy:

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/520#post_22229180


----------



## sugarhell

Quote:


> Originally Posted by *PachAz*
> 
> @ pomps
> 
> Yes, that is why you should monitor your GPU usage and CPU usage in games. If you get good performance in benchmarks but not in games, you have a big limitation, either hardware or software.
> Your CPU is basically a 3770K, which is similar to a 3570K in terms of performance, so it can very well bottleneck your GPUs. It is a quad core, and quad cores do bottleneck pretty much any dual-GPU setup in games like Crysis and Watch Dogs that are supposed to use eight cores.
> 
> @ DeadlyDNA
> I am aware the 295X2 is supposed to be "water cooled", but you can't compare a low-grade generic solution with a proper copper block from EK, a thick 240mm radiator with two fans, and a real pump. As I said, if the card throttles due to high temperatures it isn't working properly, and we all know most stock R9 290/290X cards do throttle on the stock cooler at stock fan speed. Basically you are cooling two GPUs, which would need two 240mm radiators to be on the safe side. The 295X2 is sold with a quarter of the radiator space needed to cool it; a 120mm radiator is by no stretch of the imagination enough, and I'd even say a thin 240mm is not enough for a stock R9 290!


The AIO only cools the cores, not the VRMs, memory, etc. A 120mm rad with a fast fan is enough for just the cores.


----------



## PachAz

Well, then it is even worse than I imagined, because a full-cover waterblock cools all the components.


----------



## sugarhell

Quote:


> Originally Posted by *PachAz*
> 
> Well, then it is even worse than I imagined, because a full-cover waterblock cools all the components.


Uni blocks?


----------



## DeadlyDNA

Quote:


> Originally Posted by *sugarhell*
> 
> Uni blocks?


Who knows... who knows if it's even a throttling issue... but a faster CPU will make the GPU faster...


----------



## pompss

All right guys.
My MAX CPU usage in Watch Dogs was around 40-50% and in Crysis 3 around 50-65%.
One GPU's max core speed was 1015MHz; the other's was around 600MHz.
The same exact problem I had with my previous Sapphire and my 850W PSU.
Now, even with a new Seasonic X 1250W PSU, I still have the same exact problem.

Can someone please run a test with Watch Dogs or Crysis 3 and tell me what fps you guys get? And max GPU usage?
Please!!!


----------



## pompss

Tested Tomb Raider: one GPU at 100%, the other one dead!!!!


----------



## Jpmboy

Quote:


> Originally Posted by *PachAz*
> 
> Overheating? If a stock card is overheating then that is a serious problem. *IMO, all manufacturers should let you mount a waterblock* and still keep the warranty. In my book, a throttling stock card is not working as intended. If I were you, I would slap a waterblock on the GPU; as I said, the R9 290X cards really benefit from going under water. Don't forget the goggles, tihihi.


that's why EVGA is the best.

And yes, the 295x2 really does benefit from (real) watercooling !


----------



## PachAz

Quote:


> Originally Posted by *pompss*
> 
> All right guys.
> My MAX CPU usage in Watch Dogs was around 40-50% and in Crysis 3 around 50-65%.
> One GPU's max core speed was 1015MHz; the other's was around 600MHz.
> The same exact problem I had with my previous Sapphire and my 850W PSU.
> Now, even with a new Seasonic X 1250W PSU, I still have the same exact problem.
> 
> Can someone please run a test with Watch Dogs or Crysis 3 and tell me what fps you guys get? And max GPU usage?
> Please!!!


Hmm, then we know you aren't limited by the CPU. But did you actually watch the GPU usage while playing? It seems like your second GPU is throttling a bit. My bet is that neither CPU usage nor GPU usage is high, and in that case I believe it's game optimization that isn't utilizing the GPU or CPU properly for some reason. These cards are very dynamic and can throttle for many reasons: the power limit being too low, and even core and VRM temps if I'm not wrong. Have you tested increasing the power limit to see if that helps? My bet is that it won't make any difference, but you can try.


----------



## pompss

The second GPU's usage drops from 100% to 0% randomly. This causes the fps to drop from 60 to 30 for 10 seconds and then go back to 60.
This happens all the time during games. Very frustrating. Yes, I disabled ULPS, but the second GPU's usage in Crysis 3 drops every time.
Good with Tomb Raider, bad with Thief, and really bad with Watch Dogs.
I remember I never had these issues with my GTX 690:
45-50fps in Crysis 3 at full detail at 1440p on a GTX 690.
Now I get 30-60fps at 1440p with this crap GPU from AMD!! And this is the second one I've tested. Very, very disappointing.
Damn Nvidia for pricing the Titan Z at 2999 dollars.
I really hope I win the fight with PayPal to return this crap video card ASAP.
This time I'm done with AMD VGAs. FOREVER!!!!
After 9 hours of work I expect to go home and relax, not mess around with this crap videocard.








More and more convinced to go back to consoles.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> Check CCC to be sure frame pacing is ON. Also... is GPU-Z showing both GPUs, with CFX enabled?
> Lastly, load MSI Afterburner and disable ULPS (if you haven't already), or do the registry hack.


yes CFX is enabled
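For reference, the "registry hack" Jpmboy mentions is the widely circulated ULPS tweak: ULPS (Ultra Low Power State) parks the second GPU, and setting the EnableUlps DWORD to 0 under each display-adapter subkey keeps it awake. A hedged sketch only; the key path and value name are the commonly posted ones, it's Windows-only, needs an elevated prompt, and you should back up the registry first:

```python
# Sketch: zero EnableUlps under every display-adapter subkey (0000, 0001, ...).
# Assumes the commonly posted AMD driver location; verify on your own system.
import sys

CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_enabled(values):
    """Given a dict of one adapter subkey's values, report whether ULPS
    is still on (EnableUlps present and nonzero)."""
    return values.get("EnableUlps", 0) != 0

def disable_ulps():
    """Walk the adapter subkeys and set EnableUlps to 0 where it exists."""
    import winreg  # stdlib, Windows only
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as cls:
        i = 0
        while True:
            try:
                sub = winreg.EnumKey(cls, i)
            except OSError:
                break  # no more subkeys
            i += 1
            try:
                with winreg.OpenKey(cls, sub, 0, winreg.KEY_ALL_ACCESS) as k:
                    winreg.QueryValueEx(k, "EnableUlps")  # skip keys without it
                    winreg.SetValueEx(k, "EnableUlps", 0, winreg.REG_DWORD, 0)
                    print(f"EnableUlps -> 0 in {sub}")
            except OSError:
                continue

if __name__ == "__main__" and sys.platform == "win32":
    disable_ulps()
```

A reboot (or at least a driver restart) is generally needed before the change takes effect, and driver reinstalls can put the value back.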


----------



## PachAz

Quote:


> Originally Posted by *pompss*
> 
> The second GPU's usage drops from 100% to 0% randomly. This causes the fps to drop from 60 to 30 for 10 seconds and then go back to 60.
> This happens all the time during games. Very frustrating. Yes, I disabled ULPS, but the second GPU's usage in Crysis 3 drops every time.
> Good with Tomb Raider, bad with Thief, and really bad with Watch Dogs.
> I remember I never had these issues with my GTX 690:
> 45-50fps in Crysis 3 at full detail at 1440p on a GTX 690.
> Now I get 30-60fps at 1440p with this crap GPU from AMD!! And this is the second one I've tested. Very, very disappointing.
> Damn Nvidia for pricing the Titan Z at 2999 dollars.
> I really hope I win the fight with PayPal to return this crap video card ASAP.
> This time I'm done with AMD VGAs. FOREVER!!!!
> After 9 hours of work I expect to go home and relax, not mess around with this crap videocard.
> 
> More and more convinced to go back to consoles.


I understand your frustration; a lot of AMD GPUs don't really work as they're supposed to, and I'm in a battle myself over the functionality of one 290 card. So I feel you. I don't know why the GPU usage drops like that, and it must be very annoying, because no card is meant to drop from 100% to 0% usage. I think the card throttles randomly, because these R9 cards throttle (perform worse) for many reasons. It could be driver issues, but I think the 295X2 is a mess, and I asked what the point was in even buying it. Now people know: it is underperforming, as we can see. The functionality of the card is nowhere near acceptable if the second GPU drops usage all the time. I don't recommend anyone buy a 295X2 after reading this.


----------



## pompss

Quote:


> Originally Posted by *PachAz*
> 
> I understand your frustration; a lot of AMD GPUs don't really work as they're supposed to, and I'm in a battle myself over the functionality of one 290 card. So I feel you. I don't know why the GPU usage drops like that, and it must be very annoying, because no card is meant to drop from 100% to 0% usage. I think the card throttles randomly, because these R9 cards throttle (perform worse) for many reasons. It could be driver issues, but I think the 295X2 is a mess, and I asked what the point was in even buying it. Now people know: it is underperforming, as we can see. The functionality of the card is nowhere near acceptable if the second GPU drops usage all the time. I don't recommend anyone buy a 295X2 after reading this.


I completely agree with you.
Buying this card is like throwing money out of the window.
It will only cause you issues.
Just out of curiosity, I will install Windows 7 on a new hard drive and see if that fixes the issues.
After that I will go back to a GTX 780 Ti.
I sold my Samsung 4K because it's not the time yet. I'm back to 1440p, and as soon as PayPal allows me to return this crap of a 295X2 I will go with the Kingpin.
The next adventure will be the LG 34UM95 monitor with the GTX 780 Ti Kingpin or the new GTX 880.
I will never touch an AMD card again in my life.
I owned 2x 290X and always had problems with games and drivers; now I've tested two 295X2s and run into issues again.
AMD CrossFire is just a joke, like all AMD cards!! I've never tested Nvidia SLI so I can't say, but for sure I will never buy two cards or a dual GPU after this.
It doesn't make any sense to buy cards like this right now to run games at 35fps at 4K (if you're lucky).


----------



## DeadlyDNA

Quote:


> Originally Posted by *PachAz*
> 
> I understand your frustration, alot of AMD gous dont really work as the suppose to, im in a battle myself regarding the functionality of one 290 card. So I feel you. I dont know why the gpu usage drop like that, and it must be very annoying because no freaking card is made to drop from 100% to 0% usage. I think the card throttles randomly, because these r9 cards throttle (perform worse) because of many reasons. It could be driver issues, but I think the 295x2 is a mess and I asked what the point was to even buy it. Now people know, it is underperforming as we can see. The functionality of the card is no where near acceptable if the second gpu drop the usage all time. I dont recommend anyone buying a 295x2 after reading this.


You're 50% there; once you dump your cards for the green team you can preach that AMD is horrible like the majority on here. You will be a blue/green force of persuasion! Share your wealth of knowledge with all of us lesser folks of mediocre intelligence.
Quote:


> Originally Posted by *pompss*
> 
> I completely agree with you.
> Buying this card is like throwing money out of the window.
> It will only cause you issues.
> Just out of curiosity, I will install Windows 7 on a new hard drive and see if that fixes the issues.
> After that I will go back to a GTX 780 Ti.
> I sold my Samsung 4K because it's not the time yet. I'm back to 1440p, and as soon as PayPal allows me to return this crap of a 295X2 I will go with the Kingpin.
> The next adventure will be the LG 34UM95 monitor with the GTX 780 Ti Kingpin or the new GTX 880.
> I will never touch an AMD card again in my life.
> I owned 2x 290X and always had problems with games and drivers; now I've tested two 295X2s and run into issues again.
> AMD CrossFire is just a joke, like all AMD cards!! I've never tested Nvidia SLI so I can't say, but for sure I will never buy two cards or a dual GPU after this.
> It doesn't make any sense to buy cards like this right now to run games at 35fps at 4K (if you're lucky).


I feel for you man, it sucks to spend a lot of money on something that isn't working right. Good luck with the refund and the next GPU you buy. Hopefully you at least get some peace of mind and some gaming.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> I understand your frustration; a lot of AMD GPUs don't really work as they're supposed to, and I'm in a battle myself over the functionality of one 290 card. So I feel you. I don't know why the GPU usage drops like that, and it must be very annoying, because no card is meant to drop from 100% to 0% usage. I think the card throttles randomly, because these R9 cards throttle (perform worse) for many reasons. It could be driver issues, but I think the 295X2 is a mess, and I asked what the point was in even buying it. Now people know: it is underperforming, as we can see. The functionality of the card is nowhere near acceptable if the second GPU drops usage all the time. I don't recommend anyone buy a 295X2 after reading this.


Please explain to us mere mortals how these cards throttle besides temps (a non-issue here)?

The second GPU dropping load can be related to CrossFire profile/driver issues.

I've been looking at maybe picking up one of these cards, so I have been following this thread rather closely, and I've seen a few people have issues out of the many here.

You are going to get bad cards occasionally... it happens. It does not mean that every GPU is broken.

And the issue you are having is getting your store to accept the RMA: different card and different circumstances.


----------



## pompss

After all this I'm really considering going back to consoles.
Spending all this money only to have issues, and then seeing a game like Infamous on PS4 with stunning detail and graphics, really makes me think.
The only things keeping me from selling my PC are the new Project CARS and The Witcher 3.
Companies like Ubisoft and EA are the worst, after hiding the ultra quality settings of Watch Dogs.
That is a really bad thing to do, and I will never again buy one of their games.
I hope the other developers don't follow these two idiots.


----------



## Sgt Bilko

Quote:


> Originally Posted by *pompss*
> 
> After all this I'm really considering going back to consoles.
> Spending all this money only to have issues, and then seeing a game like Infamous on PS4 with stunning detail and graphics, really makes me think.
> The only things keeping me from selling my PC are the new Project CARS and The Witcher 3.
> Companies like Ubisoft and EA are the worst, after hiding the ultra quality settings of Watch Dogs.
> That is a really bad thing to do, and I will never again buy one of their games.
> I hope the other developers don't follow these two idiots.


Ubisoft just doesn't care about PC anymore.

Hope you get your cash back and have no issues with your next card


----------



## pompss

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Please explain to us mere mortals how these cards throttle besides temps (a non-issue here)?
> 
> The second GPU dropping load can be related to CrossFire profile/driver issues.
> 
> I've been looking at maybe picking up one of these cards, so I have been following this thread rather closely, and I've seen a few people have issues out of the many here.
> 
> You are going to get bad cards occasionally... it happens. It does not mean that every GPU is broken.
> 
> And the issue you are having is getting your store to accept the RMA: different card and different circumstances.


Even if it's a driver issue, it's not acceptable after spending 1500 dollars. I can assure you it's not a driver issue, but I will test it with a fresh Windows 7 installation soon.
I understand that you get faulty cards, and it's happened to me a lot, but getting the same issue after testing two cards from different brands is not normal for a 1500-dollar card.
Nvidia cards go faulty or arrive DOA too, but AMD has a higher failure rate, very poor driver support, and bad customer assistance. This is why EVGA sells only Nvidia cards.
I bought my first video card 20 years ago, the Voodoo FX, so I have some experience and some knowledge about this.

XFX and Sapphire don't allow cross-shipping and offer no replacement card, so you need to wait a month without a video card after spending 1500 dollars.
That is just ridiculous.
If one day EVGA sells AMD cards, then maybe I'll think about buying one again.


----------



## Arizonian

Sorry to hear about your troubles with your R9 295X2, pompss; not everyone shares them, but I can certainly understand your frustration.

I just ask that we stay on topic about this card and not bring the Nvidia debate into the AMD owners' thread, turning this into another Nvidia-vs-AMD argument, which is where it's heading. Thank you.


----------



## fireedo

Is anyone here already using custom water cooling on their 295X2 who can share temperatures before and after?

I really want to buy the EK waterblock for this card, but I need solid information that I will gain a significant temperature difference (at least 6-7°C) over the default cooling (already using a push/pull config here).


----------



## kalijaga

Hi pomps,

I feel your frustration and anger, as I have felt it many times since I bought my two cards. Just to prove the point: last night, after upgrading to 14.6 RC2, I could no longer even run BF4 or BF Hardline. So I totally understand where you are coming from.

For me, since I decided to go with my 780 SLI last year, I had expected and accepted the problems I would face with a multi-GPU config. I still remember the day after getting BF4, installing it and practically salivating for 'next gen graphics' on my new setup. And then... bammm... flickering and stuttering. On my old rig with a trusted 680, no problems; smooth as silk. Since then, BSODs, crashing, flickering, GPUs not being used, and driver updates that crash Windows have all been part and parcel of a multi-GPU setup.
When I decided on quadfire at 4K, I expected the problems to essentially quadruple. I knew my issues would only be understood by a few people, and faced by even fewer, hence my decision to join this group and seek help from the more knowledgeable and experienced.

Humbly, I would equate owning this card (or two, for that matter) to being at the very edge of GPU technology, where neither AMD nor Nvidia (both of which I still use daily) will care much about incompatibilities, stuttering, throttling, and crashing when we represent only about 0.0001% of the PC population. It is like the price you pay to have a Bugatti Veyron in your garage: it's great when the weather and the road are clear, but you drive it knowing full well that no mechanic in your country can do anything if something goes wrong.

I wish you all the best in your future gaming.

God bless.


----------



## crazygamer123

Quote:


> Originally Posted by *fireedo*
> 
> anyone here who already using custom Water cooling for this 295x2 can share about temperature before and after?
> 
> I really want to buy EK WB for this card but need for sure information that I will gain a lot of temperature different (At least 6-7 C ) better then default cooling ( Already using push/pull config here)


Nice push/pull configuration. What is the temperature difference compared to the stock cooler?


----------



## electro2u

I'd never want to belong to a club that would have me as a member.

...except this one.


----------



## iamimpossible

Here's mine


----------



## iamimpossible

Have people noticed good performance gains with basic overclocking?

I'm thinking of adding a 290X. Do you think I should wait for X99 because the extra PCIe lanes will help, or will it not make a difference?

I also noticed that running Heaven on the 14.4 drivers kept crashing; on the 14.6 beta it was fine.

How can I check for throttling? Is there recommended software?
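The usual way to check for throttling is to log sensors with GPU-Z or MSI Afterburner while running a benchmark, then look for core-clock dips under load. If you export the log to CSV, a few lines of Python can flag the drops. This is only a rough sketch: the column names are guesses based on a GPU-Z-style log, and 1018 MHz is the reference 295X2 boost clock; check both against your own log's header.

```python
def find_throttle_events(rows, base_mhz=1018.0, tolerance=0.90):
    """Return (sample_index, clock) pairs where the core clock fell
    below tolerance * base_mhz while the GPU was under heavy load."""
    events = []
    for i, row in enumerate(rows):
        # Column names are assumptions; adjust them to match your log file.
        clock = float(row["GPU Clock [MHz]"])
        load = float(row["GPU Load [%]"])
        if load > 90 and clock < base_mhz * tolerance:
            events.append((i, clock))
    return events

# Synthetic samples instead of a real log (rows could come from csv.DictReader):
samples = [
    {"GPU Clock [MHz]": "1018", "GPU Load [%]": "99"},
    {"GPU Clock [MHz]": "840",  "GPU Load [%]": "98"},   # throttled sample
    {"GPU Clock [MHz]": "1018", "GPU Load [%]": "97"},
]
print(find_throttle_events(samples))  # -> [(1, 840.0)]
```

If the list comes back full of entries while GPU load stays pinned near 100%, the card is dropping clocks under load (thermal or power throttling) rather than simply running out of work.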


----------



## Chephren

Hi there!

I switched from Nvidia to AMD Radeon, so I am new to Catalyst Control Center.

My R9 295X2 from Asus doesn't show the option in CCC to enable CrossFire.

GPU-Z shows that the card runs in CrossFire mode, but I have no CrossFire options in CCC, neither under Games nor under Performance.


----------



## iamimpossible

Ohh, I assumed it ran in CrossFire by default. I'll have a look too. When I ran some of the MSI software you get with Afterburner, its stress tests showed both GPUs under load.


----------



## iamimpossible

Looking again, the 2nd GPU is not showing much load, roughly 1-5%.


----------



## King4x4

You need to run the game in full screen for CrossFire to work.

AMD CrossFire 101:
CrossFire will not work unless the application is running in full screen.


----------



## Sgt Bilko

Quote:


> Originally Posted by *King4x4*
> 
> You need to do it in full screen for crossfire to work.
> 
> 101 AMD Crossfire
> Crossfire will not work unless the application is running in full screen.


Except when running under the Mantle API; it will work in windowed mode then.


----------



## iamimpossible

Thank you guys!! I was a bit confused when I ran a test yesterday and it showed both GPUs running.









So can anyone share their before and after water cooling results? I really would like a quiet machine.


----------



## PachAz

We can still discuss what full functionality means in this context. There is no way one can call it full functionality when a $1,500+ card delivers mediocre performance and thermal throttling at stock settings. I think most people here agree the 295X2 needs a waterblock to function properly, and that alone is an indication that these chips really are too hot, and that the VRMs are not cooled effectively by the stock cooler.

Also, what do you mean by "I'm 50% there"? Aren't people allowed to be critical of products that are advertised as one thing and, in some cases, perform totally differently? We are discussing hardware here; we don't need people "defending" a particular brand.


----------



## sugarhell

Quote:


> Originally Posted by *PachAz*
> 
> We can still discuss what full functionality means in this context. No possible way can one consider full functionality with mediocre performance and thermal throttling despite stock settings, with a 1500+ dollar card. I think most people here do agree the 295x2 needs a WB to function proper and that alone is an indication that these chips really are too hot, as well as the vrm that are not cooled by the stock cooler effectively.
> 
> Also what do you mean by "im 50% there". Arnt people allowed to be critical on products that are advertised as one thing, and perform totally different in some cases? We are discussing hardware here we dont need people "defending" a particular brand.


I used a 295x2 just a week ago. With max temps of 65°C, I don't think the 295x2 needs a custom waterblock, so please stop spamming that; it's not true. The 7990 Ares had a single 120mm rad with a high OC too, and it was fine. Maybe you're aiming for 40°C core and 40°C VRM temps; that's why you go to custom solutions. But for a stock 295x2, the AIO is more than enough.

If you have problems with performance or 'throttling', I bet it's not the cooler but settings, user error, or the game itself. The VRMs of the 295x2 are okay up to 140°C; you will not get throttling before that point at stock settings.


----------



## electro2u

Quote:


> Originally Posted by *sugarhell*
> 
> I used a 295x2 like a week ago. I dont think that with 65C max temps 295x2 needs a proper WB. Please stop spamming that. Its not true. 7990 ares had a single 120mm rad with a high oc too. And it was fine. Maybe you aim for 40C temps and 40C vrms temps. Thats why you go to custom solutions. But for a stock 295x2 the aio is more than enough.
> 
> If you have problems with performance or 'throttling' i bet its not the cooler but either settings,user error or the game sucks. The vrms of 295x2 is okay for up to 140C. You will not get a throttle until that point at stock settings.


I have been struggling just a little with keeping mine from "throttling" during stress tests when it gets hot here (South Texas summer). In gaming (at 1440p, 120 FPS, Vsynced) it does not typically throttle at all unless my settings drop me below my frame limiter. I can MAKE it throttle on the stock cooler, but in normal operation it is much better behaved than it would be on air; in fact, this card would not be possible on air. Also, mine has a MAX temp of 75°C, IIRC. When I briefly had CCC set up on Windows 8.1, it let me lower that MAX temp in the Overdrive settings, but on Windows 7 I am not seeing the Overdrive option at all. Strange.

In my opinion, as an owner who is not *100%* pleased with the card (it forced me to downgrade to Windows 7 x64 to play the game I'm on at the moment, which really annoyed me), it is a good STOCK cooler. I am not overclocking, and I will be putting an Aqua Computer block on it as soon as I can figure out everything I need. I won't want or need to overclock the card once I get it up and running on water, because I'll be adding another 290X to the setup as well; that's how impressive the scaling, the performance, and the color (no one talks about the temporal dithering that Radeons have and GTXs don't in Windows) are to me. I came from two GTX 780s and I'm blown away.

That said, when I first got it, it was running hot and I had to switch from the budget case I had to a slightly larger midtower, a Corsair 450D. I put good fans on every possible mounting point (the three NF-A14s I use as intakes are really, *really* good) and I have my 295x2 set up to exhaust out the rear in push/pull with two NF-P12s. The top three 120mm slots are all exhaust NF-F12s (not good, need replacement), including the push/pull exhaust for my Corsair H55 at the front end. So I can run Valley at 1440p at 120 FPS Vsynced (I use a single Yamakasi Catleap at 120Hz) and it won't throttle as long as I don't let the ambient temperature in my room get out of control.

So if I set it to MAX out its FPS with no Vsync for an hour, yeah, I'm sure it would throttle a lot. That's part of why I'm going water; until this point there was no incentive for me to.


----------



## sugarhell

I have a 40°C ambient right now. It's a special situation, and even with my custom loop I see temps up to 60°C with just a 7970.


----------



## joeh4384

How is your ambient that high? Do you keep your computer outside in the desert?


----------



## 4K-HERO

I don't understand what everyone's issue is. I'm running a 295x2 crossfired with a 290. Most games run beautifully with no bottlenecking. I have a 4670K overclocked to 4.4 GHz, which is a baby processor compared to what some of the guys in this club have. If you are having problems with heat, make sure you put your radiator on the exhaust vent, or add another fan. I changed the fan on mine and put the rad in a place where it could exhaust properly. If the temps in your home are high, then turn on your AC; clearly, if you're pulling hot air into your case, all your temps will be higher. If you're not getting full usage from the 295x2, then make sure you're in full screen mode when playing a game. Find a solution. I have a complex setup and got things to work perfectly: I have a 4K monitor and another two 1080p monitors for Eyefinity. It didn't work perfectly at the get-go, but a little effort got things going. If you want plug and play, go get a PS4.

Don't base your judgement of this card on Watch Dogs. That game is a poorly optimized piece of crap. I get full usage of my cards with all the big games from 2013 on, minus a few here and there from companies like Ubisoft that don't care about their customers.

This card made 4K gaming at ultra settings possible for me at 60 fps (with the extra 290). The only other way to get that is to tri-fire three cards and buy a six-core processor and motherboard so you can get x8 PCIe speeds, and good luck making that happen with all the heat; I tried, believe me. AMD ushered in a new era for PC enthusiasts with this card. So if anybody wants to talk smack and say they're going the Nvidia route, then go get a Titan Z so you can really feel ripped off.

BTW, here is a list of what I've played with all settings maxed out at 4K, getting 60 fps and above (100% usage of the 295x2 and R9 290):

-Crysis 3
-Battlefield 4 and the Battlefield Hardline beta
-Tomb Raider with TressFX and 2x SSAA!!!
-Hitman Absolution
-Call of Duty Ghosts
-Grid Autosport
and a bunch more I can't think of off the top of my head.

If I can make it work perfectly with a crappy processor and three different monitors, so can everyone else, unless their card is broken or they're lazy. Once again, go get a PS4.


----------



## crazygamer123

I'm still using the stock cooler (same fan, same radiator), gaming at 4K 60Hz. It's summer here, and the idle temp is about 41°C, max 65-69°C. Both cores are running fine.


----------



## DeadlyDNA

One particular OCN member came into this thread who doesn't even own the card and started talking about the "295x2 being useless", "CPU bottlenecks", and "throttling". While an actual 295x2 user was having issues with his card, he was being given advice by this same member, and when you read that advice it's more a case of "here are the facts" (his opinions), and that frustrated user believed everything that was said. Don't get me wrong, I like to see people getting help, especially when they are having a difficult time. But I would be listening to the guys who actually own this card for advice, because they may have already run into and fixed the same issue.

Case in point:


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *PachAz*
> 
> Whats the point with the 295x2 anyways?


Quote:


> Originally Posted by *PachAz*
> 
> I still dont see any point in getting the 295x2 for most of the users. Also isnt a 295x2 too weak for 5 monitors anyways? What I mean is that you get similar or better performance getting two r9 290, or even used ones these days. Its always controversal getting these PCB with two gpus on.
> 
> Edit: I saw your sy....never mind.


Quote:


> Originally Posted by *PachAz*
> 
> WOW is coded for using 1 core on the cpu, how would one expect not to get bottlenecked while using one 290x and let alone two 290x? As I said, most mmos are like that, any cpu will get bottlenecked as it is, and the differance between one high end card and two in these games will be zero. There are simply no cpus out there that has such insane single core performance to run these mmos at full settings. Watch dogs do take advantage of multicore cpus, but I think you will get cpu limited either way, even with a 4930k if im not wrong. Im not sure if that game utilize 6 core at all.


Quote:


> Originally Posted by *PachAz*
> 
> Look at your cpu usage and gpu usage, and maybe we can tell if you are getting bottlenecked. Also test running valley benchmark and firestrike and look at your gpu usage. It it hits 80-99% most of the times, you are beeing cpu limited in the games. Also not all games scale very well with CF.
> 
> I warned many people going crossfire and sli with high end cards, most games dont perform very well with those solutions other than bf4 and other proper multithreaded and cf/sli supported games. Watch dogs is a joke and crysis 3 can also be questioned. For smooth gameplay and less issues, a single card will always be better.


Quote:


> Originally Posted by *PachAz*
> 
> Overheating? If a stock card is overheating then it is a serious problem. Imo, all manufactures should make it legal to mount a waterclock and still get an warranty. In my book, a throttling stock card is not working as intended. If I were you, I would slap on and also slap a waterblock on the gpu. As I said, the r9 290x cards really benefit from going under water. Dont forget the googles tihihi.


Quote:


> Originally Posted by *PachAz*
> 
> @ pomps
> 
> Yes, that is why you should monitor your gpu usage and cpu usage in games. If you get good performance in benchmarks but not in games, you have a big limitation, either hardware or software.
> Your cpu is basicly a 3770k which is similar to a 3570k in terms of performance, so it can very well bottleneck your gpus. It is a quad core. You know quad cores do bottleneck pretty much any dual gpu setups like these in games like crysis and watch dogs that suppose to use 8 cores?
> 
> @ DeadlyDNA
> I am aware the 295x2 suppose to be "water cooled" but you cant compare a low grade generic solution with a proper copper block from EK and a thick 240mm radiator with 2 fans and a real pump. As I said, if the card throttles due to high temperatures the card doesnt work proper and we all know most stock r9 290/290x do throttle if they have stock cooler and stock fan speed. Basicly you are cooling two gpus, and that would need two 240mm radiators to be on the safe side. The 295x2 is sold with 4 times less radiator space than what is needed to cool it. A 120mm radiator is by no means and stretch of the imagination enough to cool it. I will even go so far to say a thin 240mm is not enough to cool a stock r9 290!!11


Quote:


> Originally Posted by *PachAz*
> 
> Nope. That will only help you in a few games. But generally speaking we are getting limited by games more than anything these days. The reason people are getting the intel 6 cores for games is because they want the best in the consumer market and because their budget allow for a enthusiast cpu. But, as with everything in this hobby, the higher premium stuff, the lower performance per dollar.
> 
> Think of it like this. Many games that use 1-2 cores and that are badly optimized will perform as good on a system with a 3570k + a r9 290x and a 4930k + 2x r9 290x. So no, more components wont necessarily give you better fps. Its just that people who spend alot of money are affraid to admit it. But yeah, I have been told that if you play on insane resolutions and multipe screen, you do benefit from multiple gpus, but that doesnt mean the cpu will get less stressed and maxed though. But you play at 1440p so I will say you do benefit from 2 gpus, I just dont know if you are getting cpu limited or not. Check your cpu usage and gpu usage and maybe we can tell. Some times neither the cpu or gpu usage is high for some reason and that is mostly the "funny" nature of the game, not much one can do


Quote:


> Originally Posted by *PachAz*
> 
> We are talking very small performance gains of course even at high resolutions.


Quote:


> Originally Posted by *PachAz*
> 
> Well, then it is even worse than I imagined because a WB cool all components.


Quote:


> Originally Posted by *PachAz*
> 
> I understand your frustration, alot of AMD gous dont really work as the suppose to, im in a battle myself regarding the functionality of one 290 card. So I feel you. I dont know why the gpu usage drop like that, and it must be very annoying because no freaking card is made to drop from 100% to 0% usage. I think the card throttles randomly, because these r9 cards throttle (perform worse) because of many reasons. It could be driver issues, but I think the 295x2 is a mess and I asked what the point was to even buy it. Now people know, it is underperforming as we can see. The functionality of the card is no where near acceptable if the second gpu drop the usage all time. I dont recommend anyone buying a 295x2 after reading this.


Quote:


> Originally Posted by *PachAz*
> 
> We can still discuss what full functionality means in this context. No possible way can one consider full functionality with mediocre performance and thermal throttling despite stock settings, with a 1500+ dollar card. I think most people here do agree the 295x2 needs a WB to function proper and that alone is an indication that these chips really are too hot, as well as the vrm that are not cooled by the stock cooler effectively.
> 
> Also what do you mean by "im 50% there". Arnt people allowed to be critical on products that are advertised as one thing, and perform totally different in some cases? We are discussing hardware here we dont need people "defending" a particular brand.


----------



## PachAz

Is this a personal agenda against me or what? We are allowed to discuss a product even if we don't own it. I am also interested in problems occurring with the 295x2 because I will soon have roughly the same setup myself, though not in the exact same form. I mean, I like and use AMD myself, but that doesn't mean I can't give my opinion on a product that may very well not function as advertised.

_"go get a titan Z so you can really feel ripped off"_ - isn't this exactly what was not allowed in this thread? :S


----------



## pompss

Quote:


> Originally Posted by *4K-HERO*
> 
> I don't understand what everyone's issue is. Im running a 295x2 crossfired with a 290. Most games run beautifully with no bottlenecking. I have a 4670k overclocked to 4.4 ghz, which is a baby processor compared to what some of the guys in this club have. If you are having problems with heat, make sure you put your radiator on the exhaust vent, or add another fan. I changed the fan on mine and put the rad in a place where it could exhaust properly. If the temps in your home are high than turn on your AC. Clearly if you're pulling hot air into your case, all your temps will be higher. If you're not getting full usage from the 295x2 than make sure you're in full screen mode when playing a game. Find a solution. I have a complex setup and got things to work perfectly. I have a 4k monitor and another 2 1080p monitors for eyefinity. It didn't work perfect at the get go, but a little effort got things going. If you want plug and play, go get a ps4.
> 
> Dont base your judgements of this card on watch dogs. That game is a poorly optimized piece of crap. I get full usage of my cards with all the big games from 2013 and on. Minus a few here and there from companies like ubisoft who don't care about their customers.
> 
> This card made 4k gaming with ultra settings possible for me at 60 fps(with the extra 290). The only other way to get that is to tri-fire three cards and buy a 6 core processor and motherboard so you can get 8x pci speeds. Also good luck making that happen with all the heat. I tried believe me. Amd ushered in a new era for PC enthusiasts with this card. So if anybody wants to talk smack and say they're going the Nvidia route, than go get a titan Z so you can really feel ripped off.
> 
> BTW here is a list of what i've played with all settings maxed out in 4k and getting 60fps and above.(100% usage of 295x2 and r9 290)
> 
> -Crysis 3
> -Battlefield 4 and Battlefield Hardline Beta.
> -Tomb Raider with tressfx and 2x SSAA!!!
> -Hitman Absolution
> -Call of Duty Ghosts
> -Grid Autosport
> and a bunch more I cant think of off the top of my head.
> 
> If I can make it work perfectly with a crappy processor and 3 different monitors so should everyone else unless their card is broken or they're lazy. Once again, go get a PS4


So you are getting 60 fps by using three video cards, right?
And to you this is normal??

Speechless.

I was getting 30 fps at 4K with one EVGA Kingpin, and with two I would surely get 60 fps without all the issues I get with the R9 295x2, which can't even stay stable at 60 fps at 1440p.

And yes, I will go with a PS4 if I can play games of the same quality and spend ten times less.


----------



## electro2u

One thing I don't understand about my 295x2:
I cannot control the main fan in the shroud, nor the fans I have connected to the radiator.
Obviously that center fan can get loud; you can hear it during boot/POST, and when you install the card it whirrrrs until you get the drivers installed. After that, it never gets that loud again.







Is everyone locked out of fan control on their 295x2?

Can these be undervolted at all? Afterburner seems to want to reset my voltage settings, or lock me out of them completely, depending on which Catalyst version I have installed.

Am I on mute in here? Is this thing on?


----------



## ImperialOne

Yes, a stock R9 295x2 should not need its own waterblock unless you intend to do serious overclocking, or your case is small or your build inefficient (hot air trapped in the case). Under load playing Arma 3, I don't go past 66 degrees, and I don't throttle. I have a quiet, large case with efficient air movement up and rearward (mobo turned 90 degrees).


----------



## 4K-HERO

It functions exactly as advertised, which is why I said what I said in my post. I hate companies that make claims about a product that end up being lies or exaggerations. This is one of the rare products that performs as promised, and sometimes even better, which is why I got defensive. The card throttles when an inexperienced consumer buys it, puts it in a case with poor airflow, and runs it in a room full of hot air. The reason I mentioned the Titan Z is that it is a perfect example of a product that doesn't work as promised and that people should be complaining about. This market is only going to get better when people recognize the difference between gold and crap. The 295x2 is gold, and I don't think it should be ripped on, because it is the perfect example of keeping a promise to the consumer, and all other companies should follow suit.


----------



## 4K-HERO

60 average. In Battlefield 4 I get 85 fps. With all settings at their highest at 4K, that is very good. Go to HardOCP and look at their benchmarks for a 295x2 crossfired with a 290X; my rig performs the same. Trust me, nobody's upset with their purchase except a few unlucky guys who got a malfunctioning card. If yours hasn't malfunctioned, then you need to make certain adjustments. Don't just come to the 295x2 OWNERS CLUB and tell us that AMD is no good and the card is no good. If we weren't happy, we wouldn't have joined this thread. Next time, ask for help instead of trying to put down something we are all proud to own.


----------



## lowgun

I'd love mine if I could just get the VRM temps under control. Since I have an X79 mobo, I can't put the 295x2's radiator where it would exhaust out the back. I can have it pull cool air from outside at the front, but then hot air dumps onto the cards; or I can blow hot air out the bottom, which then circulates back in via the front fans. I got a bigger case with beefier fans to try to overcome it, but no luck.

It's so disappointing, because it runs gloriously for 30-45 seconds, then drops to 300MHz for 5-10, then back to gloriousness. Hopefully the person who buys mine will have better luck. Also, the reference 290X I have paired with it (didn't dare try a non-reference one, was afraid I'd melt the 295x2) sounds like a hair dryer, lol. All my money went towards the cards themselves; if I had the extra dough, I'd do a full custom water loop and call it a day.


----------



## pompss

Quote:


> Originally Posted by *4K-HERO*
> 
> 60 average. In Battlefield 4 i get 85 fps. With all setting at their highest in 4k, that is very good. Go to hardocp and look at their benchmarks for a 295x2 crossfired with a 290x. My rig performs the same. Trust me nobody's upset with their purchase except a few unlucky guys who got a malfunctioning card. If it hasn't malfunctioned than you need to make certain adjustments. Dont just come to the 295x2 OWNERS CLUB and tell us that amd is no good and the card is no good. If we weren't happy we wouldn't have joined this thread. Next time, Ask for help instead of trying to put down something we are all proud to own.


I tested two cards, not one, and for me the R9 295x2 is crap. It doesn't deliver what they promise and just causes problems.
Again, this is my second R9 295x2.
I'm not trying to put anything down; I'm just sharing my experience.


----------



## 4K-HERO

I originally had an NZXT H440, which was way too compact for all my components, and the airflow wasn't good either. I solved the problem of the 290's hot air going up to the X2 by getting a Kraken X40 with a G10 bracket, 130 bucks. No more hot air, and the X2 never goes over 71 degrees. The temp in my room is usually 25 degrees.


----------



## pompss

Quote:


> Originally Posted by *lowgun*
> 
> I'd love mine if I could just get the VRM temp under control. Since I have an X79 mobo, can't put the 295x2 radiator toe exhaust out the back. I can have it pull cool from outside in the front, but hot dumps on the cards, or blow hot air out the bottom which then circulates back in via the front fans. I got a bigger case with beefier fans to try and overcome it, but no luck.
> 
> Its so disappointing because it runs gloriously for 30-45 seconds, then drops to 300MHz for 5-10 then back to gloriousness. Hopefully the person who buys mine will have better luck. Also, the reference 290X I have paired with it (didn't dare try non-reference, was afraid I'd melt the 295x2) sounds like a hair dryer, lol. All my money went towards the cards themselves, if I had the extra dough I'd do a full custom waterloop ad call it a day.


So I'm not the only one who has this issue where the second GPU drops from 100% to 0% all the time during games.
Good to know; you saved me a lot of time. Now I don't even have to do a fresh Windows installation to know this card is overpriced garbage!!
On a $1,500 card this shouldn't happen, and they also advertise this card for 4K, yet we need to add another R9 290 to run games at decent fps??

Please, that's just ridiculous.


----------



## 4K-HERO

Actually, his card gets hot because he has a 290X directly below it. So stop trying to justify your laziness. There's a solution for your problem; look harder at your setup and your drivers. If your card is overheating, then you have poor airflow in your case. Period. Go whine about it in the 780 Ti owners club.


----------



## sugarhell

You can't whine about your 295x2 in the 780 Ti club... or can you?


----------



## 4K-HERO

LOL, I wish, so I don't have to read any more Nvidia fanboy crap.


----------



## sugarhell

Quote:


> Originally Posted by *4K-HERO*
> 
> LOL i wish, so i dont have to read anymore nvidia fanboy crap


I think this happens on both sides







But it's hilarious


----------



## pompss

Quote:


> Originally Posted by *4K-HERO*
> 
> Actually his card gets hot because he has a 290x directly below it. So stop trying to justify your laziness. There's a solution for your problem, look harder at your setup, and your drivers. If your card is overheating than you have poor airflow in your case. Period. Go whine about it in the 780 ti owners club.


You call this card gold, yet you need another 290 to play at 4K. Please stop being a fanboy, saying this card works as advertised when you need to add another 290 for 4K.

I've been working on this crap card for two weeks now and I still have problems.
My case has 10 fans pushing air, and I also have a custom water cooling system; my system temperature is around 29°C.
My card also runs at stock settings, and if it has heat problems after 5 minutes running on the stock BIOS, do you think that's normal?? Do you call a video card like this gold???
I've had other cards and never had any problems or overheating. My system works perfectly; I can't say the same about this card.
I own this crap video card, and I can post here as many times as I want. If you don't like what I write, just ignore me and I will do the same.


----------



## sugarhell

Why can't I see you on the owners list?


----------



## pompss

Quote:


> Originally Posted by *4K-HERO*
> 
> LOL i wish, so i dont have to read anymore nvidia fanboy crap


If there is a fanboy here, it's you, after saying that this card works as advertised while needing another R9 290 for 4K.

Ridiculous!!!


----------



## pompss

Quote:


> Originally Posted by *sugarhell*
> 
> Why i cant see you on the owners list?


Don't worry, when I get back home I will be on that list. I didn't come here as a fanboy, just to see if I could fix some issues, since I liked AMD cards in the past, and I'm certainly not an Nvidia fan like someone else here who talks about gold cards.


----------



## sugarhell

But I still can't understand: who said that you only need a single 295x2 for 4K? I highly doubt that any dual GPU at the moment can max out a demanding graphics engine at 4K.


----------



## Earth Dog

BOOM!

I'm in... AMD (I guess that counts as OEM, LOL!)

Max bench-stable clocks are 1125 MHz / 1625 MHz...

I am wondering how you are all getting more than 1625, as that is what MSI AB and CCC limit it to (maybe not CCC, I don't use it, but I swear it said 1625 like MSI AB).

Quote:


> Originally Posted by *sugarhell*
> 
> But i still cant understand who said that you only need a single 295x2 for 4k? A highly doubt that any dual gpu atm can max out any demanding graphic engine at 4k.


Try disabling AA on a 30" or smaller 4K monitor... it will be juuuuuuuuust fine.


----------



## pompss

Quote:


> Originally Posted by *sugarhell*
> 
> But i still cant understand who said that you only need a single 295x2 for 4k? A highly doubt that any dual gpu atm can max out any demanding graphic engine at 4k.


Actually, it's AMD who advertises this card as 4K-ready.
It's not ready at all if you need another 290!!!


----------



## sugarhell

Quote:


> Originally Posted by *Earth Dog*
> 
> BOOM!
> 
> I'm in... AMD (I guess that counts as OEM, LOL!)
> 
> Max clocks bench stable are 1125 MHz / 1625 MHz...
> 
> I am wondering how you're all getting more than 1625, as that is what MSI AB and CCC limit it to (maybe not CCC; I don't use it, but I swear it said 1625 like MSI AB).
> Try disabling AA on a 30" or smaller 4K monitor... it will be juuuuuuuuust fine.


Yeah, I am wondering why he can't max out games (60 fps) on a 4K monitor. And I assume he just maxed out all the settings...


----------



## PachAz

I agree that calling this card gold and the "other" card crap is stretching it too far. We all know most products don't function as advertised. What's the problem, really? Do some people really get offended because someone talks down their $1500 toy? Grow up, please.


----------



## Earth Dog

That's my guess. I mean, by definition 'maxing out' to me includes AA. However, due to the pixel density of 4K monitors (not large TVs), you either need less AA or none at all (depending on how picky one may be) in the first place.


----------



## tsm106

What's with all the whining? Is it all pebkac or what?


----------



## Earth Dog

You have been here long enough, so with respect, whining and infighting IS OCN (from the outside looking in).


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> What's with all the whining? Is it all pebkac or what?


Dunno, I'm checking the PCPer review and they are mostly over 60 fps at 4K...

ps nice photo m8


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> ps nice photo m8


It should be the bomb 'cause you made it, IIRC, lol. Btw, the pic is not referencing me, for those unaware. There's a funny story behind it.


----------



## LtMatt

I was directed to this thread. Nice pic TSM, that is sexy. Feel honoured to wear him as your badge of honour.


----------



## iamimpossible

http://www.tomshardware.com/news/ea-mantle-amd,27144.html

woohooo!!

Please add me to the roster.


----------



## DeadlyDNA

Quote:


> Originally Posted by *PachAz*
> 
> Whats the problem really? Do some people really get offended because some one talks down on their 1500 dollar toy? Grow up pls.


I honestly think you just don't read anything on here, do you? Did you look at all your posts here? You have repeatedly talked down the 295X2 since you started posting in this thread. We gave you examples of why this card is viable for people. You conveniently ignore facts and make your own facts up as you type.
I am actually considering this card plus two 290X cards to continue 4K Eyefinity testing and be ready for a DisplayPort 4K Eyefinity setup at 60 Hz or so.
I honestly don't know why I bother responding to you, because you just ignore facts.

Why would anyone want to debate the 295X2 when they already own it? Owners can provide real feedback for those of us on the fence. We don't need someone to come in who read some reviews and thinks he's smarter than the engineers who designed this product. You are not qualified to say the stock hybrid cooler is not good enough. Anyone who buys this card has a very high chance of fully water cooling it anyway. Are you going to buy one of these? You're not even considering it, because you started right off with "why would you buy this".


----------



## 4K-HERO

There is NO GPU that can run 4K with maxed-out settings. The 295X2 can do 4K on its own with some settings turned down. Once again, figure out your problem and stop calling everyone's $1500 toy crap. It was expensive, and when you call it crap you are implying that every 295X2 owner is a fool for buying it. Go get a Titan Z and we can all compare benchmarks.


----------



## tsm106

Quote:


> Originally Posted by *LtMatt*
> 
> I was directed to this thread. Nice pic TSM, that is sexy. Feel honoured to wear him as your badge of honour.


That's awesome, you truly are The AMD GOD, but I'm not sure your interpretation of the pic is the same as the artist's.


----------



## Roikyou

Thirty-plus days and I'm still happy with my 295 on a Koolance water block. Gaming runs at 52C (still playing Dark Souls 2). I do have the mild buzz that plagued some 290s, but it's acceptable. As I've mentioned before, the computer is dead silent, so you hear any little noise. Running with a Samsung U590: no flickering and a very responsive monitor. Guess I got lucky.


----------



## kalijaga

Quote:


> Originally Posted by *Roikyou*
> 
> Thirty-plus days and I'm still happy with my 295 on a Koolance water block. Gaming runs at 52C (still playing Dark Souls 2). I do have the mild buzz that plagued some 290s, but it's acceptable. As I've mentioned before, the computer is dead silent, so you hear any little noise. Running with a Samsung U590: no flickering and a very responsive monitor. Guess I got lucky.


Wow that is one nice rig.

I am thinking hard about doing a full WC block for mine, maybe once funds are available.
Here in my place choices will be limited, and careful planning will be needed.

With regards to the red and green team fans: this being the 295X2 owners thread, it is quite logical that discussions focus on trying to improve things, solve problems and share tips, rather than bashing the hardware. If bashing is to be done, as another member has suggested, maybe do it in another thread.

Cheers.


----------



## pompss

http://www.techpowerup.com/gpuz/f5ady/
http://www.techpowerup.com/gpuz/d3mca/

Anyway as i proof that i own this Gpu i can finally say that for me it's crap.
After two week testing and testing ,changing psu i still get 45 fps with thief at 1440p.
So for someone this card is gold for me is just crap. Also xfx assistance is crap too.
I share this experience here to let people know !!!
Also crossfire is not working if you not are in full screen.
That's just ridiculous and people are happy with that.
GOLD card Right???


----------



## kalijaga

Quote:


> Originally Posted by *pompss*
> 
> http://www.techpowerup.com/gpuz/f5ady/
> http://www.techpowerup.com/gpuz/d3mca/
> 
> Anyway, now that I've proved I own this GPU, I can finally say that for me it's crap.
> After two weeks of testing and testing, changing PSUs, I still get 45 fps in Thief at 1440p.
> So for some this card is gold; for me it's just crap. XFX support is crap too.
> I share this experience here to let people know!
> Also, crossfire doesn't work if you're not in full screen.
> That's just ridiculous, and people are happy with that.
> GOLD card, right???


Hmmm... I cannot run Crysis 3, BF4 and Tomb Raider maxed out at 70 fps since updating to 14.6.
As I said... Bugatti Veyron...









Cheers


----------



## pompss

Quote:


> Originally Posted by *kalijaga*
> 
> Hmmm... I cannot run Crysis 3, BF4 and Tomb Raider maxed out at 70 fps since updating to 14.6.
> As I said... Bugatti Veyron...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers


Driver issues. No wonder, with AMD.


----------



## ImperialOne

Quote:


> Originally Posted by *pompss*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kalijaga*
> 
> Hmmm... I cannot run Crysis 3, BF4 and Tomb Raider maxed out at 70 fps since updating to 14.6.
> As I said... Bugatti Veyron...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> 
> 
> 
> Driver issues. No wonder, with AMD.

You are funny. I have two original Titans in SLI, each on a separate WC loop, and a 295X2 with a 290X (air) in trifire. No driver issues on either; the latter is (obviously) notably faster at 4K. BTW, trifire because AMD's scaling with 3 GPUs is vastly superior to Nvidia's at the high end; otherwise I would have added a third Titan instead.
It's really a chocolate vs vanilla, lobster vs crab argument. Both options are yummy, each with something different (yet similar) to offer.
Whether intentional or not, fanboyism is stupid. Nvidia and AMD need each other. WE as consumers need them both to do well so that they push each other and keep each other honest. Like Good and Evil; when you think of it, they need each other as well. lol


----------



## kalijaga

Quote:


> Originally Posted by *pompss*
> 
> Driver issues. No wonder, with AMD.


I accept these issues knowing full well what I was getting into, and pray for a solution in the future.
My 780s also had their fair share of problems, slightly fewer considering they have been around longer and most games have mature drivers for them.

Patience ..patience...

Cheers


----------



## crazygamer123

I just moved my rig into the air-conditioned room, and the R9 295X2's idle temp dropped from 45 to 35 Celsius. Also, in Metro: Last Light you should turn off the "Advanced PhysX" option to get a better frame rate.


----------



## Sgt Bilko

Quote:


> Originally Posted by *pompss*
> 
> http://www.techpowerup.com/gpuz/f5ady/
> http://www.techpowerup.com/gpuz/d3mca/
> 
> Anyway, now that I've proved I own this GPU, I can finally say that for me it's crap.
> After two weeks of testing and testing, changing PSUs, I still get 45 fps in Thief at 1440p.
> So for some this card is gold; for me it's just crap. XFX support is crap too.
> I share this experience here to let people know!
> Also, crossfire doesn't work if you're not in full screen.
> That's just ridiculous, and people are happy with that.
> GOLD card, right???


Thief crossfire only works with Mantle, and you have never been able to use crossfire outside of full screen in DX mode with any game.

You can run windowed crossfire with Mantle though.

DX crossfire for Thief is just terrible and actually runs better as a single card for me... so I use Mantle.

Research research research.


----------



## kalijaga

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thief crossfire only works with Mantle, and you have never been able to use crossfire outside of full screen in DX mode with any game.
> 
> You can run windowed crossfire with Mantle though.
> 
> DX crossfire for Thief is just terrible and actually runs better as a single card for me... so I use Mantle.
> 
> Research research research.


Good for you, man... I don't have Thief though.

I am thinking hard about moving my rig to the air-conditioned room as well...


----------



## LtMatt

Quote:


> Originally Posted by *tsm106*
> 
> That's awesome, you truly are The AMD GOD, but I'm not sure your interpretation of the pic is the same as the artist's.


It matters not; I take it as a compliment. I have some more pictures you can have too. I might even sign one for you if you ask nicely. Wear it with pride, dude.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thief crossfire only works with Mantle, and you have never been able to use crossfire outside of full screen in DX mode with any game.
> 
> You can run windowed crossfire with Mantle though.
> 
> DX crossfire for Thief is just terrible and actually runs better as a single card for me... so I use Mantle.
> 
> Research research research.


Love the fact that Mantle works in windowed mode. Currently it's the only way to record gameplay footage. I just wish you got the same performance as in full-screen mode. Windowed definitely has some CPU overhead and lower scaling to boot. Nonetheless, it's better than crossfire not working at all.


----------



## iamimpossible

Ok, so I got a new case today (Corsair 900D!!)

Also added another Noctua fan in push-pull configuration on the R9 295X2 rad. Temps don't go over 63C.
All case fans are off.

When I run the fans it crashes; must be a lack of power (Corsair 850 PSU). I'll be ordering a new PSU, the Silverstone Strider SST-ST1500, and switch the fans on. The Corsair AX1500i seems way too expensive.


----------



## iamimpossible

Doh, looks like I'm getting the AX1500i instead!

It's way more efficient, and at least I don't ever have to think about upgrading the PSU again.


----------



## Elmy

Quote:


> Originally Posted by *iamimpossible*
> 
> Ok, so I got a new case today (Corsair 900D!!)
> 
> Also added another Noctua fan in push-pull configuration on the R9 295X2 rad. Temps don't go over 63C.
> All case fans are off.
> 
> When I run the fans it crashes; must be a lack of power (Corsair 850 PSU). I'll be ordering a new PSU, the Silverstone Strider SST-ST1500, and switch the fans on. The Corsair AX1500i seems way too expensive.


I blew up an AX1500i at AMD ExtravaLANza last Friday night... LoL ..... Luckily AMD had one up in one of their labs and gave it to me. I let a couple of drops of water fall into the PSU when I was setting up for the event. Didn't think it was going to be a problem, and as soon as I turned it on I heard a sizzle sound and **** my pants.

If anyone is curious about the event, you can find some pictures here. My setup is the white computer with the Ruby on the front, towards the end of the pictures.

https://www.facebook.com/media/set/?set=a.680809418652859.1073741836.162718280461978&type=1


----------



## DeadlyDNA

Quote:


> Originally Posted by *Elmy*
> 
> I blew up an AX1500i at AMD ExtravaLANza last Friday night... LoL ..... Luckily AMD had one up in one of their labs and gave it to me. I let a couple of drops of water fall into the PSU when I was setting up for the event. Didn't think it was going to be a problem, and as soon as I turned it on I heard a sizzle sound and **** my pants.
> 
> If anyone is curious about the event, you can find some pictures here. My setup is the white computer with the Ruby on the front, towards the end of the pictures.
> 
> https://www.facebook.com/media/set/?set=a.680809418652859.1073741836.162718280461978&type=1


Elmy, PSUs aren't waterproof yet! (j/k) Speaking of which, when will water resistance become standard?


----------



## Elmy

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Elmy, PSUs aren't waterproof yet! (j/k) Speaking of which, when will water resistance become standard?


I was actually thinking of spraying a layer of silicone over the circuit board inside the PSU, but I don't know how much that would affect its thermal performance.


----------



## PachAz

Not really smart letting water get into a PSU and then turning it on without letting it dry. User error, and well... maybe a pattern?


----------



## Elmy

Quote:


> Originally Posted by *PachAz*
> 
> Not really smart letting water get into a PSU and then turning it on without letting it dry. User error, and well... maybe a pattern?


I've been water cooling for six years now and this is the first thing I've ever broken... Guess I am pretty smart after all.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Elmy*
> 
> I've been water cooling for six years now and this is the first thing I've ever broken... Guess I am pretty smart after all.


I had a spill happen on mine about a month after I installed all the GPU/CPU water blocks. It was actually a leak, and I didn't notice it until my PC shut down while I was using it. It turned out it had dripped down the edge of the cards and the mobo heatsinks into my PSU (Lepa GS1600). I guess I was lucky, because once I took it all apart, cleaned it and dried it out, nothing was broken. I think my mistake was a combination of not tightening a seal enough and too much heat in the system, maybe causing pressure.

As the saying goes, shirt happens.

Good to hear you had a backup to use, at least.


----------



## Jpmboy

lol... a guy gets a bum card (or two? pilot error? or a problem with other gear?) and has a cathartic episode online. Send it back and get something else. Geeze.
And without a doubt, a real waterblock helps this card shine with the new MSI AB. No issues at all at 4K; AA is meaningless on a 28" 4K monitor. Turn up the textures etc. and all games look astonishing!


----------



## Zaxis01

I will be getting my card Monday. I picked up a Koolance water block. Does anyone here have this block? If so, how does it perform in a mid- to high-end water cooling loop?


----------



## Zaxis01

Quote:


> Originally Posted by *Jpmboy*
> 
> lol... a guy gets a bum card (or two? pilot error? or a problem with other gear?) and has a cathartic episode online. Send it back and get something else. Geeze.
> And without a doubt, a real waterblock helps this card shine with the new MSI AB. No issues at all at 4K; AA is meaningless on a 28" 4K monitor. Turn up the textures etc. and all games look astonishing!


Nicely put.

There are always going to be issues with programs and compatibility. Usually, if it's a widespread issue there will be a thread about it. And if it's an isolated problem, then you should check your hardware or drivers and research other problems similar to the one you're experiencing.


----------



## Jpmboy

Quote:


> Originally Posted by *Zaxis01*
> 
> I will be getting my card Monday. I picked up a Koolance water block. Does anyone here have this block? If so, how does it perform in a mid- to high-end water cooling loop?


Yes, it works just fine. You'll never see temps > 50C if you stay with CCC, ~60C if you max out the OC with Afterburner.


----------



## fireedo

Quote:


> Originally Posted by *Jpmboy*
> 
> lol... a guy gets a bum card (or two? pilot error? or a problem with other gear?) and has a cathartic episode online. Send it back and get something else. Geeze.
> And without a doubt, a real waterblock helps this card shine with the new MSI AB. No issues at all at 4K; AA is meaningless on a 28" 4K monitor. Turn up the textures etc. and all games look astonishing!


Well, with a push/pull config my temp never goes over 67C (without air conditioning), but it seems like the VRMs are hot, because it throttles while playing Metro: LL... I really need a real waterblock.







:doh:


----------



## cennis

Can you guys monitor VRM temps in GPU-Z like on the 290/X cards?


----------



## axiumone

Quote:


> Originally Posted by *cennis*
> 
> Can you guys monitor VRM temps in GPU-Z like on the 290/X cards?


Nope, and there's no way to control the fan speed on this card at the moment either... but you can control the voltage! lol


----------



## rdr09

Quote:


> Originally Posted by *axiumone*
> 
> Nope, and there's no way to control the fan speed on this card at the moment either... but you can control the voltage! lol


I think you quoted the wrong post.


----------



## axiumone

Why? There's no way to monitor the VRM temps on the 295X2.


----------



## rdr09

Quote:


> Originally Posted by *axiumone*
> 
> Why? There's no way to monitor the VRM temps on the 295X2.


Sorry, I missed the word "and".


----------



## ImperialOne

Have a case fan blow over the 295X2's VRMs?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *iamimpossible*
> 
> Here's mine
> 
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *electro2u*
> 
> I'd never want to belong to a club that would have me as a member.
> 
> ...except this one.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Accepted!


----------



## NavDigitalStorm

I'm currently experimenting with a 295X2 and a W9100 in the same rig.


----------



## iamimpossible

I really do want a good monitor. I have the BenQ 144 Hz 27" (XL2720Z); I haven't played any games on it yet, but I'm sure the input lag is among the best.

I've considered the LG 49" 4K TV as a second monitor, but its input lag is far too high. Now I'm looking for a good 27" or larger with a glossy screen.

The LG 34UM65 with a glass screen would be great, and I'd even forgive the long response time.

I really want to enjoy the card, but it's pointless if the display isn't doing it justice.


----------



## ImperialOne

Go 4K


----------



## Jpmboy

Quote:


> Originally Posted by *ImperialOne*
> 
> Go 4K


^^ this. Check out the Samsung 4K monitor.


----------



## iamimpossible

It needs to be glossy; colours are much more vibrant, shadow detail is far, far better and blacks are miles better.

I'm more than happy to put my desk in a spot where I don't get reflections.

All anti-reflective coatings have some sort of dulling/filtering effect on the picture.


----------



## Zaxis01

I have the Samsung 4K monitor and the picture is awesome. Text is so crisp, and textures pop so nicely.


----------



## ImperialOne

Zaxis, do you have any problems with the 295X2 running 4K at 60 Hz over DP 1.2 with your Sammy?


----------



## kalijaga

Guys finally got to take the pics of my rig.
Can I join the club now?


----------



## Zaxis01

Quote:


> Originally Posted by *ImperialOne*
> Zaxis, do you have any problems with the 295X2 running 4K at 60 Hz over DP 1.2 with your Sammy?


I can't say yet, but I'm getting my card on Monday. I will run some 4K tests and post my results here.


----------



## Jamble

I finally got my throttling issues sorted. I used the fans off my H100 in push-pull. It doesn't go over 67 degrees at full load with a 40-degree ambient.

Does anyone know any good programs for setting up custom fan curves for chassis fans?


----------



## fireedo

Hahaha, same here, finally no throttling or anything. Finished Metro: LL with an average FPS of about 80 or more, really smooth at 1440p very high.

Maximum temp detected with GPU-Z was 71C and with HWMonitor 73C; still safe, at least.


----------



## iamimpossible

Me too; I changed my case to a Corsair 900D, got better airflow, and now I get consistent benchmarks.

At first I thought it was my power supply. I also rolled my drivers back from the 14.6 beta to 14.4, which helped stability.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *kalijaga*
> 
> Guys finally got to take the pics of my rig.
> Can I join the club now?


Quote:


> Originally Posted by *Jamble*
> 
> I finally got my throttling issues sorted. I used the fans off my H100 in push-pull. It doesn't go over 67 degrees at full load with a 40-degree ambient.
> 
> Does anyone know any good programs for setting up custom fan curves for chassis fans?


Accepted!

What brand are the cards?


----------



## The Stilt

Could someone please provide me with the original "master" and "slave" BIOS dumps for the card?
Version 015.045.xxx is preferred.


----------



## levism99

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I'm currently experimenting with a 295X2 and a W9100 in the same rig.


Do they work well together?


----------



## Jpmboy

Quote:


> Originally Posted by *ImperialOne*
> 
> Zaxis, do you have any problems with the 295X2 running 4K at 60 Hz over DP 1.2 with your Sammy?


It runs perfectly, not one issue. (And the graphics are simply amazing in any game, whether you're using AMD or NV.)


----------



## kalijaga

Nav,
Mine are MSI and Sapphire...
Cheers


----------



## ImperialOne

Is anybody with a 295X2 having issues with the Asus PB287Q (4K 28" monitor)? There is a topic on the boards about this, but it's been so heavily corrupted by fanboyism that it's hard to discern anything. I want to get the Asus, especially since I need VESA mount capability...


----------



## iamimpossible

Quote:


> Originally Posted by *kalijaga*
> 
> Nav,
> Mine are msi and sapphire...
> Cheers


What PSU are you using? Because in one of the reviews in this thread the Corsair 1200i was not enough.


----------



## fireedo

Quote:


> Originally Posted by *iamimpossible*
> 
> What PSU are you using? Because in one of the reviews in this thread the Corsair 1200i was not enough.


I really want to know: if 1200 W is enough for two R9 295X2s, then if I do hybrid trifire with a 295X2 and a 290, is 1000 W enough?


----------



## kalijaga

Nav,

the PSU you saw in the pic I posted is the CM V1200. Behind the chassis (which has been kept open), I ran a Silverstone Strider Evo 1200 W powering just the lower card.
I cannot find a good 1500 W unit near me... thus two PSUs for me.


----------



## kalijaga

Quote:


> Originally Posted by *fireedo*
> 
> I really want to know: if 1200 W is enough for two R9 295X2s, then if I do hybrid trifire with a 295X2 and a 290, is 1000 W enough?


If you're going trifire, go big, man. At least 1200 W. I went overboard...

Good luck.


----------



## electro2u

Quote:


> Originally Posted by *fireedo*
> 
> I really want to know, if 1200watt is enough for two R9 295x2 then if i do tri fire hybrid between a 295x2 with a 290 then 1000watt is enough? Or not?


1200 W for two 295X2s? I don't think so.
1000 W for trifire with one 295X2 and one 290X? Barely enough; a full trifire system pulls at least 1050 W at full load. A good 1000 W PSU can handle it, but it's not ideal. I'd go 1200 for trifire and 1500 for quad. I ordered a Seasonic 1250 Gold for my upcoming trifire water build.
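As a rough sanity check on the wattage figures in this discussion, here is a small sketch in Python. Note the 20% headroom margin is a common rule of thumb, my assumption rather than something stated in the thread:

```python
# Hypothetical PSU headroom check; the 20% margin is a common rule of
# thumb, not a figure quoted by anyone in this thread.
def recommended_psu_watts(measured_load_w, margin=0.20):
    """Smallest PSU rating that leaves the given margin over measured full load."""
    return measured_load_w * (1 + margin)

# ~1050 W measured for 295X2 + 290X trifire at full load:
print(recommended_psu_watts(1050))  # 1260.0 -> a 1000 W unit is over budget, 1200 W is tight
```

Which lines up with the advice above: 1200 W for trifire leaves almost no margin, and 1500 W is the comfortable choice.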


----------



## kalijaga

Quote:


> Originally Posted by *electro2u*
> 
> 1200 W for two 295X2s? I don't think so.
> 1000 W for trifire with one 295X2 and one 290X? Barely enough; a full trifire system pulls at least 1050 W at full load. A good 1000 W PSU can handle it, but it's not ideal. I'd go 1200 for trifire and 1500 for quad. I ordered a Seasonic 1250 Gold for my upcoming trifire water build.


Agreed.
To be happy and safe, I use two 1200 W units.

Cheers.


----------



## iamimpossible

Looks like I'm going with the Corsair 1500. I was hoping to save some money and get the 1200, but I'd rather not have to change again and spend more later.
This should cover me for seven years, lol. Power consumption will come down as the processors move to 20 nm; at least I don't have to think about power again.

Waiting for X99, and I'll upgrade the 3770K then.

Maybe new cards will arrive alongside X99. FreeSync or G-Sync 4K at 60 Hz I would be quite happy with.


----------



## Jamble

I pull over 1000 watts at the wall with a 295X2 and a 4770K, so there's no way 1000 watts will be enough with an extra 290X in there.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jamble*
> 
> I pull over 1000 watts at the wall with a 295X2 and a 4770K, so there's no way 1000 watts will be enough with an extra 290X in there.


Try running it straight from the socket instead of off a power board; you might get a different result.


----------



## 4K-HERO

I was able to run a 295X2 with a 290, an Asus Hero, an overclocked 4670K, two more small radiators and seven fans on a Corsair RM1000. The max wattage I drew was 1017 W. When you're going with two 295X2s, you should worry more about amperage. According to AMD, each power cable needs to handle 28 A. Multiply that by four cables and you get 112 A just for the two cards, plus some more for everything else in your system. If your PSU is not rated for enough amperage, your cables could overheat and melt; a fire is also possible. So make sure your PSU is rated for more than 112 A. I upgraded to the EVGA 1300 W because it could deliver 108 A on its single 12 V rail. With the Corsair RM1000 I was on the line for both wattage and amperage, and I didn't want to risk it.
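The amperage arithmetic above can be sketched as a quick back-of-the-envelope check. The 28 A-per-cable figure is the AMD guidance quoted in the post; everything else follows from multiplication:

```python
# Back-of-the-envelope 12 V amperage budget for two R9 295X2s,
# using the figures quoted in the post above.
AMPS_PER_CABLE = 28     # AMD's stated requirement per 8-pin PCIe cable
CABLES_PER_CARD = 2     # each 295X2 uses two 8-pin connectors
NUM_CARDS = 2

gpu_amps = AMPS_PER_CABLE * CABLES_PER_CARD * NUM_CARDS
print(gpu_amps)         # 112 -> amps on the 12 V rail for the cards alone

# Equivalent 12 V wattage for the cards, before CPU, fans and drives:
print(gpu_amps * 12)    # 1344 -> watts
```

This is why a single-rail PSU rated well above 112 A (and the CPU's share on top) is the safe pick for quadfire.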


----------



## DeadlyDNA

.


----------



## pompss

Is it possible that nobody here is playing Watch Dogs and can tell me the average fps they're getting?


----------



## ImperialOne

Watch Dogs is very difficult, since Nvidia conspired with its publisher to hamstring AMD cards. While I have Titans and a 295X2, I'd play WD on the Green cards. I'm a fan of whoever gives me the best experience, so while Mantle works to improve frames on AMD cards, Nvidia's idea of screwing over its competitor's gameplay experience crosses a line. Silly, but it likely shows weakness on Team Green's part. Then again, the Titan Z is a more manifest sign.
Pompss, as an Nvidia loyalist, you can check any Watch Dogs or 295X2 review and find the FPS values for the 295X2 in WD.


----------



## pompss

Quote:


> Originally Posted by *ImperialOne*
> 
> Watch Dogs is very difficult, since Nvidia conspired with its publisher to hamstring AMD cards. While I have Titans and a 295X2, I'd play WD on the Green cards. I'm a fan of whoever gives me the best experience, so while Mantle works to improve frames on AMD cards, Nvidia's idea of screwing over its competitor's gameplay experience crosses a line. Silly, but it likely shows weakness on Team Green's part. Then again, the Titan Z is a more manifest sign.
> Pompss, as an Nvidia loyalist, you can check any Watch Dogs or 295X2 review and find the FPS values for the 295X2 in WD.


I'm asking because I'm getting 10-22 fps at 1440p.
I don't want to see reviews from magazines; I want to know what everybody else in this forum is getting.


----------



## pompss

I see on Digital Storm that the average at 4K is 56 fps.
How is it possible that I'm getting 15-22 fps at 1440p?

http://www.digitalstormonline.com/unlocked/watch-dogs-benchmarks-amd-radeon-r9-1080p-and-4k-idnum281/


----------



## kalijaga

Quote:


> Originally Posted by *pompss*
> 
> I'm asking because I'm getting 10-22 fps at 1440p.
> I don't want to see reviews from magazines; I want to know what everybody else in this forum is getting.


Hi Pompss,

just to share on the Green vs Red issue: using the 14.4 driver, I get 20-30 fps with stuttering at 4K, about 30-40 at 1440p. Using the 14.6 driver, I get 30-50 fps with slightly less stuttering at 4K, about 50-80 at 1440p. This is using TheWorse mod and a temp limit of 66C.

Good luck.


----------



## pompss

Quote:


> Originally Posted by *kalijaga*
> 
> Hi Pompss,
> 
> just to share on the Green vs Red issue: using the 14.4 driver, I get 20-30 fps with stuttering at 4K, about 30-40 at 1440p. Using the 14.6 driver, I get 30-50 fps with slightly less stuttering at 4K, about 50-80 at 1440p. This is using TheWorse mod and a temp limit of 66C.
> 
> Good luck.


Thank you so much for sharing.
I guess your settings are at max quality (without the high-quality mods).

So there must be some problem with my card, as with the 14.4 driver I get 15-20 fps at 1440p, and with the 14.6 driver the same thing, 15-20 fps.

What CPU do you have?


----------



## kalijaga

Quote:


> Originally Posted by *pompss*
> 
> Thank you so much for sharing.
> I guess your settings are at max quality (without the high-quality mods).
> 
> So there must be some problem with my card, as with the 14.4 driver I get 15-20 fps at 1440p, and with the 14.6 driver the same thing, 15-20 fps.
> 
> What CPU do you have?


I have a meagre 4770K mildly overclocked to 4.3, all done automatically by the Sabertooth Z87.

I think your numbers should be OK considering this game is still new and unoptimized.
Maybe divide my numbers by half for a single 295X2; I think other users with more experience and more powerful rigs will get much higher numbers.

Maybe try other, older games, e.g. BF3, Crysis, etc.

Good luck, man.


----------



## crazygamer123

Does anyone have the card running with a water block? What is your max temp while playing graphics-heavy games?


----------



## Jpmboy

Quote:


> Originally Posted by *crazygamer123*
> 
> Does anyone have the card running with a water block? What is your max temp while playing graphics-heavy games?


The max T I've ever seen is 58C (water was at 33C). I used Gelid Extreme and Fuji Extreme pads on the waterblock (except for that weird 0.7 mm pad; there I used the Koolance-supplied pad). This is with +25 mV / 1100 / 1500.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> The max T I've ever seen is 58C (water was at 33C). I used Gelid Extreme and Fuji Extreme pads on the waterblock (except for that weird 0.7 mm pad; there I used the Koolance-supplied pad). This is with +25 mV / 1100 / 1500.


I guess you're using the water block from Koolance?


----------



## pompss

Is anyone using the waterblock from EK or Aqua Computer?
I'd like to know the temps and how it performs. Are the VRMs cooled as well?
If not, I'm thinking of getting a block to cool the VRMs and VRAM and using the universal waterblock from EK for the GPUs.
I can feel that the VRMs are getting pretty hot.


----------



## Jeronbernal

Hey guys, I'm currently coming from an SLI 780 Ti setup, trying to figure out what I should put in the mITX setup I'm currently migrating to...

My options are between the Team Green Z and a 295X2. I'd prefer to go the $1700 route, but if the problems are large, then you get the gist... I want to stick with dual GPUs for my mITX build, but from what I've heard there are some driver issues. Is this just with the 290X or also with the 295X2? I haven't used Radeon since my prebuilt HP Envy 1534, which had a Radeon HD 7570 that I never had issues with.

Any help would be greatly appreciated.


----------



## HoneyBadger84

Quote:


> Originally Posted by *pompss*
> 
> Thank you so much for sharing.
> I guess your settings are set to max quality (without the high-quality mods).
> 
> So there is some problem with my card, as with the 14.4 driver I get 15-20 fps at 1440p, and with the 14.6 driver it's the same thing, 15-20 fps.
> 
> What CPU do you have?


Did you do the driver switch properly, cleaning in between the uninstall & install of the new drivers? Cuz I'm getting more FPS than that on a single card, let alone Crossfire, with everything maxed in the game, and that's with or without TheWorse mod. You shouldn't be getting FPS anywhere near that low if you're using an R9 295X2 or anything close to it in terms of power.

The main issues I had with Watch_Dogs were resolved between the patch & the 14.6 RC drivers; the game runs nigh-on flawlessly for me now.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jeronbernal*
> 
> hey guys, i'm currently coming from a sli 780ti setup, trying to figure out what i should put in my mITX setup i'm currently migrating to...
> 
> my options are between team green z, and a 295x2, id prefer to go the $1700 route, but if the problems are large, then you get the gist.... i want to stick with dual gpu's for my mITX build... but from what ive heard, there are some driver issues/ is this just with the 290x or also with the 295x2? i havent used radeon since ive had my prebuilt hp envy 1534, which had a hd radeon 7570, which i never had issues with.
> 
> any help would be greatly appreciated.


What driver issues have you "heard" about, and were they by chance from a Green Team fanboi? I've had no serious issues at all since switching from my ol' 7970 to the several different R9 290Xs that have been carouseling through my rig (lol).

Matter of fact, with titles like Watch_Dogs, I dare say AMD has fewer issues now than NVidia does, which is funny since it was an NVidia-backed title, so to speak. I wouldn't go for a Titan Z for the sheer fact that its performance-to-price ratio is absolute garbage. You'd be better off going with a GTX 690 if you wanna stay on that side of the fence; at least they're decently priced for the performance they give.

The R9 295X2 performs well & stays very cool with its included cooling; given what it is, I'd say the ~$1200+ difference between it & the Titan Z pretty much makes it a slam dunk...

FYI I'm not a fanperson for either team. I used NVidia for the majority of my computer-building years, and just switched to AMD about 2 years ago (6990s & 7970s were my first experiences with their cards). I'm pretty happy on the red side of the fence, but I don't do any undue bashing of Green Team hardware. The Titan Z is really only meant for the Pro market with its ridiculous double-precision capabilities; as far as a gaming card goes, it's just not worth the money at all.


----------



## soulwrath

Quote:


> Originally Posted by *HoneyBadger84*
> 
> What driver issues have "heard" about, and were they by chance from a Green Team fanboi? I've had no serious issues at all since switching from my ol' 7970 to several different R9 290Xs that have been carouseling through my rig (lol).
> 
> Matter o fact, with titles like Watch_Dogs, I dare say AMD has less issues now that NVidia does, funny enough since it was an NVidia-drafted title, so to speak. I wouldn't go a Titan Z for the sheer fact that it's performance to price ratio is absolute garbage. You'd be better off going a GTX 690 if you wanna stay on that side of the fence, at least they're decently priced for the performance they give.
> 
> The R9 295x2 performs well & stays very cool with it's included cooling, given what it is, I'd say the ~$1200+ difference between it & the Titan Z pretty much makes it a slam dunk...
> 
> FYI I'm not a fanperson for either team, I used NVidia for the majority of my computer-building times thus far, just switched to AMD about 2 years ago now (6990s & 7970s were my first experiences with their cards) & I'm pretty happy on the red side of the fence, but I don't do any undo bashing of Green Team hardware. The Titan Z is really only meant for the Pro market with it's ridiculous Double Precision capabilities, as far as a gaming card goes, it's just not worth the money at all.


If anything, keep a single 780 Ti; do not downgrade to a GK104 chip.


----------



## pompss

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Did you do the driver switch properly, cleaning in between uninstall & install of new drivers? Cuz I'm getting more FPS than that on single card, let alone Crossfire, with everything maxed in the game, and that's with or without theWorst mod. You shouldn't be getting FPS anywhere near that low if you're using an R9 295x2 or anything close to it in terms of power.
> 
> The main issues with Watch_Dogs that I had were resolved between the patch & the 14.6 RC drivers, the game runs neigh-on flawlessly for me now.


I reinstalled WD, and now with the patch and the 14.6 RC driver I'm getting 50-70 fps when I walk or run on foot in the city.
If I take a car and drive through the city, my fps drops to 20-35. I can see that both GPUs are not at 100% at all; they're at 50-65% most of the time, rarely going up to 100%, and I also see both GPUs drop to 25% randomly. Also, the fan is dead: it's not turning on and the card is hot.
Could you please check if you are experiencing the same thing? I use MSI Afterburner to check GPU usage.
If not, I have to RMA the card.
thanks


----------



## HoneyBadger84

Quote:


> Originally Posted by *pompss*
> 
> I reinstall WD and now with the patch and the 14.6 rc driver i getting 50- 70 fps when i walk or run on feet in the city.
> IF i take the car and run into the city my fps drops to 20-35 fps i can see that both gpu is not at 100% at all but its at 50%- 65% most of the time rarely does goes up to 100% also i see both gpu drop to 25% randomly.Also the fan is dead its not turning on and the card is hot.
> Could you please check if you are experiencing the same thing. I use msi afterburner to check the gpu usage.
> If not i have to rma the card .
> thanks


The fan is dead on the card? Like completely? That's not good  I don't have a 295X2 yet myself so I can't test that, but I'm pretty sure no one else with them is having that issue. Might wanna take the card out & double check the fan didn't come unplugged from the card or something.

Glad we got your FPS issues semi-resolved; hope that fan comes back to life cuz that sucks


----------



## pompss

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The fan is dead on the card? Like completely? That's not good  I don't have a 295x2 yet myself so I can't test that, but I'm pretty sure no one else with them is having that issue. Might wanna take out the card & double check the fan didn't come unplugged from the card or something.
> 
> Glad we got your FPS issues semi resolved, hope that fan comes back to life cuz that sucks


With the 14.4 driver the fan is working; with the 14.6 RC the fan is always off, even when overclocked to 1100/1400.

So if I go back to 14.4 I get 20-30 fps and the fan works; if I keep the 14.6 RC the fan isn't working and I get higher fps, but both GPUs run at only 50-60%.

My conclusion: driver issues and a half-working card.

For sure I want to RMA the card, but I want XFX to cross-ship me another R9 295X2, because after $1500 I can't be without a card and I don't want a refurb one.
Also, 3 weeks of waiting is too much time and it's unacceptable!! I'm going to complain to XFX tomorrow; hope they agree to a faster replacement, cross-shipping, or a refund!!!
I'm not asking much after spending $1500


----------



## crazygamer123

Quote:


> Originally Posted by *pompss*
> 
> For sure i wanna RMA the card But i want xfx cross shipping me anothe r9 295x2 because after 1500 dollars i cannot stay without card and i don't want a refurb one.
> Also 3 week waiting its too much Time and it's unacceptable !! Wanna complaint to xfx tomorrow hope they agree for faster replacement , cross ship or a refund !!!
> I don't asking much after spending 1500 dollars


Good luck bro, the waiting time is really a pain. My RMA wait was only one week, but it felt like it was taking forever.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *pompss*
> 
> Is it possible that no one here is playing Watch Dogs and can tell me the average fps they're getting???


About 57 fps at 4K.


----------



## Zaxis01

I have a Gigabyte 295X2 with an ASIC quality of 82.0%.

What's typical for these particular cards?

Also, what are everyone's overclocking results?


----------



## HoneyBadger84

Quote:


> Originally Posted by *pompss*
> 
> With 14.4 driver the fun is working , with 1.46 Rc the fan is always off even with overclocking 1100-1400
> 
> So if i get back to 14.4 i get 20-30 fps and fan is working if i keep the 14.6 RC the fan isnt' working getting higher fps but both gpu run at 50- 60%.


Are you using Afterburner to try & manually control the fan on the card? I don't know if that works or not; I've read reviews that say you can't actually control the fan on it very well. I don't think drivers should normally affect this at all.


----------



## ImperialOne

I know there is a PB287Q thread, but do people here with 295X2 cards have issues with the new Asus 28" 4K monitor running 4K over DP 1.2 @ 60Hz?

As an aside, I am going to run TriFire; is there a best practice on whether the dual-GPU 295X2 or the single 290X should be the primary card?


----------



## 4K-HERO

I had to use my 290 as the primary card because I couldn't find a DisplayPort-to-mini-DP 1.2 cable. That setup caused problems with some games. When I finally got the cable and started using the 295X2 as primary, everything worked great.


----------



## ImperialOne

So using either card worked fine, but you think it's preferable to use the bigger card (295X2) as primary?


----------



## HoneyBadger84

Quote:


> Originally Posted by *ImperialOne*
> 
> I know there is a Pb297q thread, but do people here with 295x2 cards have issues with the new Asus 28" 4K monitor and running 4k on DP 1.2 @60Hz?
> 
> As an aside, I am going to run Trifire, is there a best practice to point the dual 295x2 or the single 290x as the primary card?


Everything I've heard & read about using TriFire with a dual-GPU card suggests that running the single-GPU card as your primary is the smarter avenue. I think it has something to do with drivers etc. responding better... and of course there's the added bonus that if you are playing or doing something you only want one GPU for, you can simply disable Crossfire & it puts the R9 295X2 to sleep. You can't do that if the R9 295X2 is the primary card, as it's not possible to disable the on-board Crossfire on that card, from what I've heard... I don't have one so I can't say for sure, but I've seen a few people saying it's not currently possible to disable Crossfire on the card properly.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ImperialOne*
> 
> I know there is a Pb297q thread, but do people here with 295x2 cards have issues with the new Asus 28" 4K monitor and running 4k on DP 1.2 @60Hz?
> 
> As an aside, I am going to run Trifire, is there a best practice to point the dual 295x2 or the single 290x as the primary card?
> 
> 
> 
> *Everything I've heard & read about using TriFire with a dual-GPU card suggests that running the single GPU card as your primary card is the smarter avenue*, I think it has something to do with drivers etc responding better... and of course, the added bonus of you if you are playing or doing something you only want to use one GPU for, you can simply disable Crossfire & it puts the R9 295x2 to sleep... you can't do that if the R9 295x2 is the primary card, as it's not possible to disable the on-board Crossfire on that card, from what I've heard... I don't have one so I can't say for sure, but I've seen a few people saying it's not currently possible to disable crossfire on the card properly.
Click to expand...

That is the exact opposite of the reason you BUY a 295X2. Why is having 4 or more DP ports important? Running panels off the same card's ports saves you from issues of panels being out of sync. In the future, or even now, 4K Eyefinity is only possible with a 295X2. Why is that? It's the ports, Mars, it's gotta be the ports!

If you run a 295X2 as a slave card, you really just wasted 1500 bucks, or overspent by 500 bucks, depending on your POV.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> That is the exact opposite of the reason you BUY a 295X2 for. Why is having 4 or more DP ports important? Running panels off the same ports saves one from issues of panels being out of sync. In the future or even now, 4K eyefinity is only possible with a 295X2. Why is that? It's the ports Mars, it's gotta be the ports!
> 
> If you run a 295X2 as a slave card, you really just wasted 1500 bucks or over spent that 500 bucks depending on your pov.


Well, if you're running Eyefinity & not planning on turning it off, yeah, having the 295X2 on top makes more sense. I was referring to if he's planning on going back & forth between Eyefinity for gaming & gaming on a single monitor with Crossfire disabled... I think I got this post & one somewhere else intertwined, so I didn't fully explain my theorycrafting on this one. The ports on the 295X2, like the 6990 & 7990 before it, ARE the way to go if you're going for an Eyefinity setup, as they make it stupid-easy to do so.


----------



## ImperialOne

So... for a single 4K monitor... 295x2 master or slave?


----------



## crazygamer123

... don't know about TriFire.


----------



## Zaxis01

Is anyone else here experiencing throttling issues with this card?

I have a Koolance water block installed; temps idle in the mid 30s and load in the high 40s to low 50s.

I installed a fresh copy of Windows 8.1 and tried Catalyst drivers 14.4 and 14.6, and I'm still experiencing the same issue.

Right now I have it connected to a Samsung 4K monitor using HDMI, and it's only capable of outputting 1920x1080 @ 60Hz.

Am I bandwidth limited?

I placed an order for a mini-DP adapter to connect my monitor over DisplayPort.

I'm not sure if that has anything to do with it or not, but currently that is my setup.

Any help or ideas would be appreciated.
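For what it's worth, here's some back-of-the-envelope link-bandwidth math. The effective data rates for HDMI 1.4 and DP 1.2 and the ~20% blanking overhead are my rough assumptions, but they show why a 4K panel on HDMI 1.4 tends to fall back to 1080p60 or 4K30:

```python
# Rough check of why HDMI 1.4 can't carry 4K at 60 Hz while
# DisplayPort 1.2 can. Link rates are approximate effective
# data rates, not raw line rates.

def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
    """Approximate video bandwidth in Gbit/s, with ~20% blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

HDMI_1_4 = 8.16   # Gbit/s effective (assumed figure)
DP_1_2   = 17.28  # Gbit/s effective, HBR2 (assumed figure)

uhd60 = required_gbps(3840, 2160, 60)   # ~14.3 Gbit/s
uhd30 = required_gbps(3840, 2160, 30)   # ~7.2 Gbit/s

print(f"4K60 needs ~{uhd60:.1f} Gbit/s; fits HDMI 1.4: {uhd60 < HDMI_1_4}")
print(f"4K60 fits DP 1.2: {uhd60 < DP_1_2}")
print(f"4K30 fits HDMI 1.4: {uhd30 < HDMI_1_4}")
```

So the mini-DP adapter is the right call for 4K60; the HDMI port on this card simply doesn't have the headroom.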


----------



## HoneyBadger84

Quote:


> Originally Posted by *Zaxis01*
> 
> Is anyone else here experiencing throttling issues with this card?
> 
> I have a Koolance Water Block installed and temps idle in the mid 30's and loads in the high 40s to low 50s.
> 
> I installed a fresh copy of windows 8.1 and tried catalyst driver 14.4 and 14.6 and still experiencing the same issue.
> 
> right now i have it connected to a samsung 4k monitor using hdmi and it's only capable of outputting 1920x1080p 60hz.
> 
> Am i bandwidth limited?
> 
> I placed an order for a mini dp adapter to connect my monitor using display port.
> 
> I'm not sure if that has anything to do with it or not, but currently that is my setup.
> 
> Any help or ideas would be appreciated.


Your temps aren't high enough to warrant throttling unless your VRMs are getting hotter. As for the adapter, I'm appalled it didn't come with the adapter you needed. That shouldn't be the case.


----------



## pompss

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Your temps aren't high enough to warrant throttling unless your VRMs are getting hotter. As for the adapter, I'm appalled it didn't come with the adapter you needed. That shouldn't bw the case.


I think you're right.
I notice that my VRMs are getting very hot and my core load drops from 100% to 25%.
Also, my fan isn't working, not even in manual mode.


----------



## Jeronbernal

The only 295X2 available at the Fry's near me is the Diamond-brand 295X2; are they reputable? I've never owned a Diamond component.

I'm looking to watercool the 295X2 with an open loop if I do get it.

I'm thinking of switching from my SLI 780 Ti ATX build to a mini-ITX 295X2 build, and I'm just curious if there's anything you guys think a newbie to AMD should know before pulling the trigger?

I'm not in dire need of an ITX build, but it would really help out my room's space a lot. I'm also starting to get the PC-build itch again =p Would you guys say you really enjoy the 295X2? And would you make the conversion? I understand my cooling options will be limited by the ITX case that I use, which seems to be the 250D for me. I would be cooling both an overclocked 4790K (with a Maximus VII Impact full-board block when it's released) and the 295X2, using a 240x30mm radiator and a 120x30mm radiator.

Seeing that the 295X2 is cooled with a single 120mm rad now, adding a 4790K to the mix and a 240 rad, everything should be fine?

Also, I'm using dual 144Hz Asus VG248s; will I still be able to get 144Hz through one of the DisplayPort-to-DVI adapters?

Thanks guys for your help


----------



## cennis

How are your Watch Dogs experiences?

With my 295X2 I get like 30-50 fps in a car, and 70-140 fps on foot when not near cars,

at 1440p, ultra, AA disabled.


----------



## pompss

Quote:


> Originally Posted by *Jeronbernal*
> 
> The only 295x2 available at frys near me is the diamond brand 295x2, are they reputable? Never owned a diamond component.
> 
> I'm looking to watercool the 295x2 if I do get it with a open loop..
> 
> Im thinking of switching from my sli 780ti atx build to a mini itx 295x2 build, and I'm just curious if there's anything you guys think a newbie to Amd should know before pulling the trigger?
> 
> I'm not in dire need of a itx build, but it would really help out my rooms space alot. I'm also starting to get the pc build itch again =p Would you guys say you really enjoy the 295x2? And would you make the conversion? I understand my cooling options will be limited to the itx case that I use, which seems to be the 250d for me. I would be cooling both a overclocked 4790k and the 295x2 and a maximus vii impact full board block when it's released with a 240*30mm radiator and a 120x30mm radiator.
> 
> Seeming that the 295x2 is cooled with a single 120mm rad now, adding a 4790k to the mix and a 240 rad, everything should be fine?
> 
> Also, I'm using dual 144hz Asus vg248's, will i still be able to get 144hz through one of the display port to Dvi adapters?
> 
> Thanks guys for your help


I didn't enjoy it.
I experienced a lot of driver issues, and I got a half-working card.
Also keep in mind that if you need to RMA it, it's around 2 to 4 weeks, without any cross-shipping options.
I would suggest going with two 290s, around $700 brand new on eBay (you save $700 and a lot of headaches). You get almost the same performance, and if one card is DOA you always have the other one.
Or if you have the money, go with Nvidia SLI from EVGA if you want great customer service and fast support.


----------



## NavDigitalStorm

Hey guys I just reinstalled the 295X2 in my main rig with a 290X for Tri-Fire but ran into some issues.

The rig recognizes all three GPUs and I can see everything in MSI Afterburner. However, when I game or benchmark, only GPU1 is being used. Any ideas?


----------



## Earth Dog

Did you reinstall drivers for it? I did it with a 290...
Quote:


> Im thinking of switching from my sli 780ti atx build to a mini itx 295x2 build, and I'm just curious if there's anything you guys think a newbie to Amd should know before pulling the trigger?


Why would you do this? Why get a slower card? I don't follow this logic.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Earth Dog*
> 
> Did you reinstall drivers for it? I did it with a 290...
> Why would you do this? Why get a slower card? I don't follow this logic.


He wants a smaller rig and a 295X2 is faster.


----------



## Earth Dog

Apologies.. not sure what I was thinking there... they are around the same except in TWIMTBP titles (cough bioshock, LOL!). Mere margin of error in most titles from my testing and other reviews. Anyway, I understand the downsizing part, but the performance is all too similar to make that a talking point (to me).


----------



## Skinnered

Getting serious throttling here too. Now that it's warmer due to the summer temps, the GPUs easily hit 74 degrees and then the throttling starts.
I've got two R9 295X2s with just the standard hybrid cooling and rads, and it's not up to the task. I have a Corsair Carbide Air 540 case and set up push-pull (one extra fan) on each rad. Every 20 seconds performance plummets in-game. It's clear I have to do something to sort this out.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Skinnered*
> 
> Getting serious throttling here too. Now it's warmer due the summer temps easily hiting 74 degrees for the GPU's and then the throttling started.
> I got two R295x2's with just the standard hybride cooling and rad and it's not up to it's task. I have a corsair carbide air 540 case and set a push pull (one extra fan ) on the rad. Every 20 sec perf. plummets ingame. Its clear I have to do something to get this cleared out.


Do you have the radiators setup as intake or exhaust (sucking air in from outside or blowing it out from the inside)? I would set them up as intake if you don't have them that way now, so that they're getting as cool of air as possible.


----------



## pompss

Quote:


> Originally Posted by *Skinnered*
> 
> Getting serious throttling here too. Now it's warmer due the summer temps easily hiting 74 degrees for the GPU's and then the throttling started.
> I got two R295x2's with just the standard hybride cooling and rad and it's not up to it's task. I have a corsair carbide air 540 case and set a push pull (one extra fan ) on the rad. Every 20 sec perf. plummets ingame. Its clear I have to do something to get this cleared out.


74C is a little high, but not that high.
You should put a fan on the VRMs and see if it gets better.
One question!! Is your R9 295X2's fan on??? Mine is off even at 70C. Completely dead!!!


----------



## pompss

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> About 57 fps at 4K.


did you get less fps when driving???

I get 35-50 fps with the fan on at 1440p,
and 25 fps with the fan off at 1440p.

For sure I have a problem with the fan (I need to push it to get it spinning), which causes VRM overheating and throttling.
Can someone tell me if I'm right?


----------



## Zaxis01

How do you monitor vrm temps?


----------



## kalijaga

Quote:


> Originally Posted by *pompss*
> 
> 74c its a little high but not so high.
> You should put a fan on the vrms and see if gets better
> One question !! Your r9 295x2 fan its on ??? mine its off even at 70c. Completely dead !!!


I always thought the fans continue to run even when ULPS is enabled and the card is powered down.
I may be wrong.

May I suggest that you connect the actual rad fans to another external power source, just to check them. I think the VRM fan is more complicated to check, as the fan header may be underneath the cover.


----------



## pompss

Quote:


> Originally Posted by *kalijaga*
> 
> I always thought the fans will continue to run even when the ULPS is enabled and card is powered down.
> I maybe wrong.
> 
> May I suggest that you connect the actuall rad fans to another external power source just to check the rad fans. I think the VRM fans is more complicated to check as the fan headers may be underneath the cover.


The rad fan is working perfectly. It's the VRM fan that's off, and I need to push it to get it to work.


----------



## kalijaga

Quote:


> Originally Posted by *pompss*
> 
> The rad fan is working perfect. Its the VRM fan that its off and i need to push it in order to work.


Then... I am not able to suggest much.
I am afraid the next step may include opening up the shroud, checking the connections, etc. It may void the warranty, so an RMA may well be the next option.


----------



## axiumone

Quote:


> Originally Posted by *Zaxis01*
> 
> How do you monitor vrm temps?


As far as I know you don't, unless you use an IR thermometer.


----------



## pompss

Guys, I need a favor.

This is my GPU usage in Watch Dogs.


As you can see it's pretty unstable, so I would like to know if it's a game issue or a card issue.

If someone could post the same kind of screenshot, whether it looks different or not, because XFX doesn't believe the card is defective, so I need more proof.
The RMA will take at least 3 weeks to come back, so before I send it I need to be sure my card is defective.
I know for sure the VRM fan is defective, but since I want to buy a waterblock that doesn't matter so much.
But I need to be sure there are no other defects.


----------



## cennis

Quote:


> Originally Posted by *pompss*
> 
> Guyz i need a favor.
> 
> This is my gpu usage with watch dogs .
> 
> 
> 
> AS you can see its pretty unstable so i would like to know if its a game issue or card issue.
> 
> IF someone can post the same screenshot even its different or not because XFx doesn't believe the card is defective so i need more proof.
> Rma will take at least 3 week to come back and before i do i need to be sure that my card is defect.
> i know for sure the vrm fan is defective but since i want to buy a waterblock doesnt matter so much.
> But i need to be sure there are no other defect .


my utilization is the same... in a car, 35-55 fps at 1440p;
on foot it still ranges from 55 to 120+ depending on how many cars are near. Huge utilization drops; I think it's the drivers, man.


----------



## HoneyBadger84

It's the game. A lot better than it was on release, but it still needs game-side and driver-side optimisation.


----------



## Skinnered

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Do you have the radiators setup as intake or exhaust (sucking air in from outside or blowing it out from the inside)? I would set them up as intake if you don't have them that way now, so that they're getting as cool of air as possible.


Yes, I have them set to take air into the case; I also thought that should cool the rad as well as possible. I'm gonna try to set a bigger push fan in front of it.

pompss, my VRM fan is working, even in 2D/desktop operation. It's not a very high temp indeed, but I believe the temp target is 74-75 degrees, much lower than a regular R9 290's.
Is it this low because of the hybrid cooler's pump, which can't take any higher temps?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Skinnered*
> 
> Yes I have them set to take the air into the case, I also thought that should cool the rad. as best as possible. I'm goona try to set a bigger pushfan in front of it.
> 
> pompss, my VRM fan is working, even in 2d/desktop operation. It's not a very high temp indeed, but I believe the temp target is 74-75 degrees, much lower then a regular R290.
> Is it this low because of the hybride coolerpump, wich can't take any higher temps?>


75C is the throttle temp on these cards from the reviews I've read:

http://techreport.com/review/26279/amd-radeon-r9-295-x2-graphics-card-reviewed/12

http://www.pcgameware.co.uk/reviews/graphics-cards/msi-radeon-r9-295-x2-graphics-card-review/

So 75C is actually unacceptable for this card and will cause throttling, like I already said ;-)
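For anyone curious how a temperature target like that behaves, here's a rough sketch of target-based throttling. The 75C target matches the reviews above, but the clock step, floor, and update loop are made-up illustration values, not AMD's actual PowerTune algorithm:

```python
# Hedged illustration of temperature-target throttling: while over the
# target, the core clock steps down; while under it, it recovers toward
# the stock boost clock. Step size and floor are invented for the demo.

TEMP_TARGET = 75      # C, throttle point reported for the R9 295X2
STEP_MHZ = 13         # hypothetical clock step per update
BASE_CLOCK = 1018     # MHz, stock boost clock of the 295X2

def next_clock(current_mhz, gpu_temp_c):
    """Drop the core clock while over target, recover while under it."""
    if gpu_temp_c >= TEMP_TARGET:
        return max(300, current_mhz - STEP_MHZ)   # floor at a 2D-style clock
    return min(BASE_CLOCK, current_mhz + STEP_MHZ)

clock = BASE_CLOCK
for temp in [70, 74, 76, 78, 77, 73, 72]:
    clock = next_clock(clock, temp)
print(clock)
```

This is why a card hovering right around 75C "plummets" every few seconds: the clock sawtooths as the temperature crosses the target and comes back.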


----------



## pompss

Quote:


> Originally Posted by *Skinnered*
> 
> Yes I have them set to take the air into the case, I also thought that should cool the rad. as best as possible. I'm goona try to set a bigger pushfan in front of it.
> 
> pompss, my VRM fan is working, even in 2d/desktop operation. It's not a very high temp indeed, but I believe the temp target is 74-75 degrees, much lower then a regular R290.
> Is it this low because of the hybride coolerpump, wich can't take any higher temps?>


I think you have an airflow problem, and most likely your case is too small for two R9 295X2s.
The cheapest fix is to add more fans pushing air into the case, plus a fan close to the VRMs, and see if it gets better.
If not, you need to change cases and go with a bigger one, like the Corsair 750D, just to give you an idea!!
Anyway, with two R9 295X2s you should go with full waterblocks!!!

With my case and 4 fans pushing air at the card, I max out close to 66C with one R9 295X2.


----------



## crazygamer123

Hi,

This one is on Newegg now. Is this the new version of the R9 295X2 from PowerColor? Btw, this thing requires 4 x 8-pin power connectors.


----------



## HoneyBadger84

Quote:


> Originally Posted by *crazygamer123*
> 
> hi
> 
> This one is on Newegg now. Is this the new version of R9 295x2 from Powercolor? Btw, this thing require 4 x 8 pin power supply.


Oh believe me, I'm acutely aware of the DEVIL 13's existence. I want one really bad. It's two 290Xs on a single PCB and actually outperforms the R9 295X2 in a lot of benchmarks, but it's also air cooled. Whole 'nother monster there. And its minimum recommended PSU is 1000W. Lol


----------



## crazygamer123

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Oh believe me I'm acutely aware of the DEVIL 13's existence. I want one really bad. Its 2 290Xs on a single PCB and actually outperforms the R9 295x2 in a lot of benchmarks, but its also air cooled. Whole-nother monster there. And it's minimum recommended PSU is 1000W. Lol


Such a good deal if you get one right now: a free Samsung EVO 120GB ($99) and a Razer Ouroboros ($140). 1499 - 99 - 140 = $1260. I don't know if I should get the Devil 13 or wait for the Ares III from Asus (only 500 of them will be made).
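Quick sanity check on that bundle math (prices in USD, as quoted above):

```python
# Effective card price after subtracting the retail value of the bundle.
card = 1499          # Devil 13 list price
ssd_value = 99       # Samsung EVO 120GB
mouse_value = 140    # Razer Ouroboros

effective = card - ssd_value - mouse_value
print(effective)  # 1260
```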


----------



## HoneyBadger84

Quote:


> Originally Posted by *crazygamer123*
> 
> Such a good deal , if you get one right now. Free Samsung Evo 120GB ( 99$) and Razer Ouroboros (140$). 1499- 99-140 = 1260$. I dont know if i should get the Devil 13 or wait for the Ares III from Asus ( only 500 of them will be made )


Only 250 Devil 13 290X IIs were made, so they're even more rare. If PowerColor put one out with a stock liquid block, I think that'd be the thing that finally made me go liquid on my GPUs. As it is, it's still my end goal for this generation of GPUs. I'll run it with another 290X (probably one of the Tri-Xs I have now) in TriFire when I need the extra performance, and with my case's airflow and its monster cooler it should be a very happy, cool card.

Hope I get one before they sell out x_x


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *crazygamer123*
> 
> Such a good deal , if you get one right now. Free Samsung Evo 120GB ( 99$) and Razer Ouroboros (140$). 1499- 99-140 = 1260$. I dont know if i should get the Devil 13 or wait for the Ares III from Asus ( only 500 of them will be made )
> 
> 
> 
> 
> 
> Only 250 Devil 13 290X IIs were made, so they're even more rare. If PowerColor put one out with a stock liquid block I think that'd be the thing that finally made me go liquid on my GPUs. As it is, it's still my end goal for this generation of GPUs. I'll run it with another 290X (probably one of the TRI-Xs i have now) in TriFire when needing the extra performance, and wiht my case flo we and its monster cooler it should be a very happy, cool card.
> 
> Hope I get one before they sell out x_x
Click to expand...

The Devil is a fail. The port layout defeats the point. There's no point in buying a Devil when we have 295X2s. Even the Ares III is a fail if you treat the port setup as mandatory.

Ref is gold in this case.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> The Devil is a fail. The port layout defeats the point. There's no point in buying a devil when we have 295X2s. Even the Ares 3 is a fail going by port setup as a mandatory.
> 
> Ref is gold in this case.


While I agree the port layout should've been that of an R9 295X2, it performs better in games, runs cooler (though the closed loop sucks, according to several users I've seen complaining about it), and it's got a prestige/collector-piece factor the 295X2 doesn't.

I do wish it just had 4-6 mini-DisplayPorts though.


----------



## Cool Mike

I owned the 295X2 and now the Devil 13, and I have enjoyed the Devil much more. It's faster and looks very nice in my windowed case. Yes, it is louder, because you are air-cooling two 290X cores; this is the only disadvantage for me. Another big selling point for me was that it is a custom design.


----------



## axiumone

Quote:


> Originally Posted by *HoneyBadger84*
> 
> While I agree the port layout should've been that of an R9 295X2, it performs better in games, runs cooler (besides, the closed loop sucks according to several users I've seen complaining about it), and it's got a prestige/collector-piece factor the 295X2 doesn't.
> 
> I do wish it just had 4-6 mini-DisplayPorts though.


I wouldn't say the closed loop sucks; it cools 2 of my 295X2s in Crossfire quite adequately. The fault, in my opinion, is not being able to control the VRM fan so far, because even if the cores are nice and cool, if the VRM overheats the card will throttle.


----------



## soulwrath

Quote:


> Originally Posted by *axiumone*
> 
> I wouldn't say the closed loop sucks; it cools 2 of my 295X2s in Crossfire quite adequately. The fault, in my opinion, is not being able to control the VRM fan so far, because even if the cores are nice and cool, if the VRM overheats the card will throttle.


can you fix this issue with a full waterblock?


----------



## axiumone

Quote:


> Originally Posted by *soulwrath*
> 
> can you fix this issue with a full waterblock?


Of course...


----------



## Zaxis01

So I solved my throttling issue.

I removed the water block, checked for abnormalities, and discovered the thermal strips over the VRMs weren't making direct contact.

Now that I've applied a thicker layer, I no longer suffer from throttling at all at a 1120 core and 1500 memory overclock with a +25 mV increase.

I feel like such an idiot right now. LOL!


----------



## cennis

Anyone else feel like the backplate is boiling hot?
From Guru3D's thermal camera images it seems like the backplate is below 60 °C:

http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,14.html

I feel like mine is more like 70 or 80.


----------



## axiumone

I've posted a new video.






At this point, I'm pretty fed up with AMD, their hardware and their drivers.

Single display works perfectly smooth. However, once you enable eyefinity the monitors exhibit some horrible syncing issues in games.

Imagine the screen tearing without vsync, but simultaneously across all 5 screens and it looks like a part of the frame takes noticeably longer to refresh, making everything pretty much unplayable.

I thought for a moment there could be a connection issue, since 4 monitors are connected with DP and 1 is dual-link DVI, but leaving only 3 DP-connected monitors in Eyefinity doesn't solve the problem.

Single 295x2 in eyefinity 5x1 works fine and one display with 295x2 crossfire works fine.

Well, new 14.6 RC2 drivers are out and eyefinity with 295x2 crossfire is still broken. Pretty disappointed at the moment. The second card has to be disabled in order to get rid of the tearing.

Here is what I have tried.

- Single monitor works with no issues.
- 3 monitors in Eyefinity portrait with DisplayPort connections only, issue present.
- Changing refresh rates on the desktop and in game between 60, 100, and 120 Hz, issue present.
- Single 295X2 in Eyefinity 5x1 portrait, no issues.
- 14.6 beta and 14.4 WHQL drivers, issue present on both.
- Windows 7 x64 and Windows 8.1 Pro x64, issue present on both.
- Motherboards: EVGA X79 Dark and Asus X79 Rampage IV Black, issue present on both.
- Power supplies: Corsair AX1500i and EVGA SuperNOVA NEX1500, issue present on both.
- Disabling ULPS with MSI Afterburner, issue present.
- Disabling PowerPlay with MSI Afterburner, issue present.
- Underclocking the GPU, issue present (under full load the cards sit around 69-70 °C, without clocks fluctuating).
- Flashed both cards to the Sapphire BIOS and to the PowerColor BIOS, issue present on both.
- Reconnected the original radiator fan, issue present.
- All sorts of power limit changes, issue present.
- Clocks are stable and there is no apparent throttling at work here.

Now, to be fair, I have found a solution MYSELF, but it only works for Titanfall and BF4, and even then it's glitchy and the games crash randomly. The solution was to hack a seven-month-old 13.12 driver and add support for the 295X2 manually. I've emailed a rep at AMD to let them know where to look... that was a month ago.

Here are the two other quick vids.










At this point, I've submitted a few bug reports and also tweeted the videos to AMD. Their response was - "You may see a fix before the end of the year"

I understand that this setup may have a very limited appeal, but someone out there is bound to run into the same issues, so I'd be very glad to know about this before making this big of a purchase.

-Edit- I forgot to mention, they've pretty much confirmed that they duplicated the behavior in their lab, and I know at least one other user with a similar setup who's having these issues. So user error is not to blame here.


----------



## Deadboy90

Quote:


> Originally Posted by *axiumone*
> 
> I've posted a new video.
> 
> 
> 
> 
> 
> 
> At this point, I'm pretty fed up with AMD, their hardware and their drivers.
> 
> Single display works perfectly smooth. However, once you enable eyefinity the monitors exhibit some horrible syncing issues in games.
> 
> Imagine the screen tearing without vsync, but simultaneously across all 5 screens and it looks like a part of the frame takes noticeably longer to refresh, making everything pretty much unplayable.
> 
> I thought for a moment there could be a connection issue, since 4 monitors are connected with DP and 1 is dual-link DVI, but leaving only 3 DP-connected monitors in Eyefinity doesn't solve the problem.
> 
> Single 295x2 in eyefinity 5x1 works fine and one display with 295x2 crossfire works fine.
> 
> Well, new 14.6 RC2 drivers are out and eyefinity with 295x2 crossfire is still broken. Pretty disappointed at the moment. The second card has to be disabled in order to get rid of the tearing.
> 
> Here is what I have tried.
> 
> - Single monitor works with no issues.
> - 3 monitors in Eyefinity portrait with DisplayPort connections only, issue present.
> - Changing refresh rates on the desktop and in game between 60, 100, and 120 Hz, issue present.
> - Single 295X2 in Eyefinity 5x1 portrait, no issues.
> - 14.6 beta and 14.4 WHQL drivers, issue present on both.
> - Windows 7 x64 and Windows 8.1 Pro x64, issue present on both.
> - Motherboards: EVGA X79 Dark and Asus X79 Rampage IV Black, issue present on both.
> - Power supplies: Corsair AX1500i and EVGA SuperNOVA NEX1500, issue present on both.
> - Disabling ULPS with MSI Afterburner, issue present.
> - Disabling PowerPlay with MSI Afterburner, issue present.
> - Underclocking the GPU, issue present (under full load the cards sit around 69-70 °C, without clocks fluctuating).
> - Flashed both cards to the Sapphire BIOS and to the PowerColor BIOS, issue present on both.
> - Reconnected the original radiator fan, issue present.
> - All sorts of power limit changes, issue present.
> - Clocks are stable and there is no apparent throttling at work here.
> 
> Now, to be fair, I have found a solution MYSELF, but it only works for Titanfall and BF4, and even then it's glitchy and the games crash randomly. The solution was to hack a seven-month-old 13.12 driver and add support for the 295X2 manually. I've emailed a rep at AMD to let them know where to look... that was a month ago.
> 
> Here are the two other quick vids.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At this point, I've submitted a few bug reports and also tweeted the videos to AMD. Their response was - "You may see a fix before the end of the year"
> 
> I understand that this setup may have a very limited appeal, but someone out there is bound to run into the same issues, so I'd be very glad to know about this before making this big of a purchase.
> 
> -Edit- I forgot to mention, they've pretty much confirmed that they duplicated the behavior in their lab, and I know at least one other user with a similar setup who's having these issues. So user error is not to blame here.


That vid looks REALLY laggy lol.


----------



## axiumone

Quote:


> Originally Posted by *Deadboy90*
> 
> That vid looks REALLY laggy lol.


That's the whole point. Check out the other two as well. Also, keep in mind that the vid is shot at only 30 fps, which actually makes it appear much smoother on video. In person, it's headache-inducing. It's not a pleasant experience.


----------



## Deadboy90

Quote:


> Originally Posted by *axiumone*
> 
> That's the whole point. Check out the other two as well. Also, keep in mind that the vid is shot at only 30 fps, which actually makes it appear much smoother on video. In person, it's headache-inducing. It's not a pleasant experience.


It almost seems like the card can't handle 5 1080p monitors.


----------



## axiumone

Quote:


> Originally Posted by *Deadboy90*
> 
> It almost seems like the card can't handle 5 1080p monitors.


The thing is, with the hacked 13.12 drivers, BF4 on all ultra with no AA gets 100-170 fps. The raw power is there; it's held back by the lack of good drivers.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> The thing is, with the hacked 13.12 drivers, BF4 on all ultra with no AA gets 100-170 fps. The raw power is there; it's held back by the lack of good drivers.


I have R9 290Xs in QuadFire and have this issue as well. I don't think it's specific to this video card; as you say, it's driver-related and tied to Crossfire. I can use 2 GPUs in Crossfire before I start to see the issue in your video, which lines up with your single 295X2 working fine but not two of them. When I use TriFire the issue is there but far milder than with QuadFire. I also noticed it more on PCIe 2.0 than 3.0.

Can you verify your cards are on PCIe 3.0? If they are, can you try setting PCIe 2.0 in the BIOS and test? Use GPU-Z and run the full-screen test to check link speeds (I assume you know how to do that because it's easy).

Factors I know affect this issue in my case:

1. Crossfire (TriFire/QuadFire mainly)
2. Resolution/refresh rates
3. PCIe link speed? (needs more testing)
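For anyone wanting to script the link-speed check (GPU-Z's full-screen render test is the usual Windows route), the negotiated PCIe link can also be read from `lspci -vv` output on Linux. A minimal parsing sketch; `parse_link_status` and the sample line are illustrative helpers, not any tool's real API:

```python
import re

def parse_link_status(lspci_text):
    """Pull the negotiated PCIe speed and width out of `lspci -vv` text.

    LnkSta reports the *current* link (cards can train down at idle);
    LnkCap reports the maximum the device supports.
    """
    m = re.search(r"LnkSta:\s*Speed\s+([\d.]+)GT/s.*?Width\s+x(\d+)", lspci_text)
    if not m:
        return None
    speed, width = float(m.group(1)), int(m.group(2))
    # Per-lane signaling rate to generation: 2.5 GT/s = 1.x, 5 GT/s = 2.0, 8 GT/s = 3.0
    gen = {2.5: 1, 5.0: 2, 8.0: 3}.get(speed)
    return {"speed_gts": speed, "width": width, "gen": gen}

# Sample LnkSta line for a GPU negotiated at PCIe 3.0 x16:
sample = "LnkSta: Speed 8GT/s, Width x16, TrErr- Train- SlotClk+ DLActive-"
print(parse_link_status(sample))  # {'speed_gts': 8.0, 'width': 16, 'gen': 3}
```

The idle train-down is why it's worth checking the link while the GPU is under load, not just at the desktop.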


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> I have R9 290Xs in QuadFire and have this issue as well. I don't think it's specific to this video card; as you say, it's driver-related and tied to Crossfire. I can use 2 GPUs in Crossfire before I start to see the issue in your video, which lines up with your single 295X2 working fine but not two of them. When I use TriFire the issue is there but far milder than with QuadFire. I also noticed it more on PCIe 2.0 than 3.0.
> 
> Can you verify your cards are on PCIe 3.0? If they are, can you try setting PCIe 2.0 in the BIOS and test? Use GPU-Z and run the full-screen test to check link speeds (I assume you know how to do that because it's easy).
> 
> Factors I know affect this issue in my case:
> 
> 1. Crossfire (TriFire/QuadFire mainly)
> 2. Resolution/refresh rates
> 3. PCIe link speed? (needs more testing)


When I was talking to AMD, this is actually what they were alluding to as well: that it may be a PCIe bandwidth issue. I can verify that both my cards are running at PCIe 3.0 x16. I've found that running one card at PCIe 2.0 x16 somewhat alleviates the issue.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> When I was talking to AMD, this is actually what they were alluding to as well: that it may be a PCIe bandwidth issue. I can verify that both my cards are running at PCIe 3.0 x16. I've found that running one card at PCIe 2.0 x16 somewhat alleviates the issue.


Possibly. I also forgot about a few other things.

Are you using the AMD pixel clock patch, by chance?


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> Possibly. I also forgot about a few other things.
> 
> Are you using the AMD pixel clock patch, by chance?


Haha, that's usually my first question for multi monitor users as well, but alas, in my case I'm not using it.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> Haha, that's usually my first question for multi monitor users as well, but alas, in my case I'm not using it.


Yeah, I was going to suggest something else that might help. Not sure if it works without the pixel clock patch. I use CRU to change my standard/custom resolutions from "Auto LCD standard" to "Auto LCD Reduced".



It seemed like the reduced pixel clock also helped some, in my case.
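For context on why a "reduced" timing can help: the pixel clock is just the total pixels per frame (active area plus blanking) multiplied by the refresh rate, and reduced blanking shrinks the blanking intervals. A back-of-the-envelope sketch; the first totals are the common CEA-861 1080p timing, the second is an illustrative reduced-blanking timing, not necessarily the exact mode CRU generates:

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 1920x1080 @ 60 Hz with the CEA-861 standard timing (2200 x 1125 total):
print(pixel_clock_mhz(2200, 1125, 60))  # 148.5
# The same mode with an illustrative reduced-blanking timing (2080 x 1111 total):
print(pixel_clock_mhz(2080, 1111, 60))  # 138.6528
```

At 120 Hz the figures double, which is where shaving blanking off the timing starts to matter for the display link.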


----------



## HoneyBadger84

Quote:


> Originally Posted by *Cool Mike*
> 
> I owned the 295x2 and now the Devil 13. I have enjoyed the Devil much better. Faster, looks very nice in my windowed case. Yes, it is louder because you are air cooling two 290x cores. This is the only disadvantage for me. Another big selling point to me was it is a custom design.


Can't wait to get one myself ^_^ 10 more days till I can order.


----------



## Jeronbernal

Please add me to the club


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> Yeah, I was going to suggest something else that might help. Not sure if it works without the pixel clock patch. I use CRU to change my standard/custom resolutions from "Auto LCD standard" to "Auto LCD Reduced".
> 
> 
> 
> It seemed like the reduced pixel clock also helped some, in my case.


Tried it just in case, no difference.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> Tried it just in case, no difference.


One last thing, if you have time: I am able to recreate this issue running almost anything, but when I run the AVP DX11 benchmark I cannot reproduce it. Could it be a CF profile issue with the driver?


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> One last thing, if you have time: I am able to recreate this issue running almost anything, but when I run the AVP DX11 benchmark I cannot reproduce it. Could it be a CF profile issue with the driver?


That's interesting! I'm at work now, so I can't check. Can you try applying the AVP DX11 crossfire profile to another DX11 title to check?

Something I've noticed as well: if my frame rate in any game is above 120 fps, which is also my refresh rate, the issue doesn't exist. Are you running 120 Hz monitors as well? Is the AVP bench running above your refresh rate?
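That observation fits a simple frame-pacing model: once the frame rate drops below the refresh rate, some refresh intervals have to reuse a frame, and any AFR pacing error between the two GPUs becomes visible across the displays. A toy sketch of the condition; `scanouts_per_frame` is a made-up illustrative helper, not a real API:

```python
def scanouts_per_frame(fps, refresh_hz):
    """Average number of refresh intervals each rendered frame must cover.

    <= 1.0 means every scanout gets a fresh frame; above 1.0, frames get
    repeated, and any AFR pacing error between the two GPUs shows up as
    tearing/judder across the Eyefinity group.
    """
    return refresh_hz / fps

for fps in (170, 120, 90, 60):
    ratio = scanouts_per_frame(fps, 120)
    print(fps, round(ratio, 2), "fresh every scanout" if ratio <= 1.0 else "frames repeat")
```

Which would explain why the AVP bench (always above refresh) looks clean while games that dip below it don't.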


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> That's interesting! I'm at work now, so I can't check. Can you try applying the AVP DX11 crossfire profile to another DX11 title to check?
> 
> Something I've noticed as well: if my frame rate in any game is above 120 fps, which is also my refresh rate, the issue doesn't exist. Are you running 120 Hz monitors as well? Is the AVP bench running above your refresh rate?


Yes, currently I am running [email protected] I do believe it is running above my refresh; I will check it again. Also, I seem to recall Tomb Raider having the issue while I was way above the 120 fps mark. There have to be more people with this issue. I've watched your video and I get the exact same issue, except on only 2 of the 3 monitors. I have also tried [email protected] and it does the same thing. If I run [email protected] 60hz the issue is not there.


----------



## DividebyZERO

Alright, I take that back: Tomb Raider stays above 120 fps and I do see it at both 120 Hz and 60 Hz. The thing is, at 60 Hz I have to really look hard for it. I could game at 60 Hz and be fine, but at 120 Hz it's just unbearable. Perhaps in the AVP DX11 benchmark it's barely there, but if it is, I am not seeing it. I will try screwing around with CF profiles. I tried Valley/Heaven in DX11, DX9, and OpenGL, and they all have it as well.

I really hate having to game @ 60 Hz though, and I don't want to wait 6 years for AMD to acknowledge it either.


----------



## axiumone

You're running in portrait right? Have you tried landscape?


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> You're running in portrait right? Have you tried landscape?


Currently in landscape, and for me this issue only affects the other two screens. My primary set in CCC doesn't do it, and when I moved my primary to a side screen it was fine; the previous primary screen then had the frame-skipping behavior. I can try portrait also if needed; would this make any difference?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Jeronbernal*
> 
> Please add me to the club


Accepted!


----------



## Jeronbernal

Yusssssss


----------



## fireedo

Hmm, I finally went to a TriFire configuration with an MSI R9 295X2 and a HIS R9 290 IceQ X2. Performance is great, but the HEAT, wow, amazing; my hand can't touch my MSI 295X2 backplate now since it gets really hot...
It's so HOT I can't measure the temp, but it feels like touching boiling water







... I just can't touch it for a mere 1 second, you get what I mean here

So is it safe? Or is it a bad sign that means I have to go full custom water cooling on my 295X2?

Gaming performance is fine though


----------



## pompss

Someone is using the XSPC Razor R9 295X2 waterblock??Its good like the ek??
like the one from acquacomputer and ek but they dont have 3mm led holes.


----------



## HoneyBadger84

Quote:


> Originally Posted by *fireedo*
> 
> Hmm, I finally went to a TriFire configuration with an MSI R9 295X2 and a HIS R9 290 IceQ X2. Performance is great, but the HEAT, wow, amazing; my hand can't touch my MSI 295X2 backplate now since it gets really hot...
> It's so HOT I can't measure the temp, but it feels like touching boiling water
> 
> 
> 
> 
> 
> 
> 
> ... I just can't touch it for a mere 1 second, you get what I mean here
> 
> So is it safe? Or is it a bad sign that means I have to go full custom water cooling on my 295X2?
> 
> Gaming performance is fine though


Use MSI Afterburner to take a look at individual GPU speeds & temperatures, just to make sure you're not running too hot. If game performance is fine chances are there's nothing to worry about. The R9 295x2 does run pretty warm, and I imagine the backplate does as much "heat soaking" as it does helping dissipate it, so I'd expect it to be hot.


----------



## fireedo

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Use MSI Afterburner to take a look at individual GPU speeds & temperatures, just to make sure you're not running too hot. If game performance is fine chances are there's nothing to worry about. The R9 295x2 does run pretty warm, and I imagine the backplate does as much "heat soaking" as it does helping dissipate it, so I'd expect it to be hot.


Thanks for your reply.
Just now I tried using MSI Afterburner with a custom fan speed; the GPUs are at a safe temperature (around 60-65 °C) but the backplate is really hot, maybe reaching 100 °C (I don't have a tool for measuring backplate temp). For sure it gets really hot, so scary.
Maybe the VRMs are too hot, I don't know.

So as long as game performance is fine, is this condition OK?


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> You're running in portrait right? Have you tried landscape?


Any update for you on this? Does your "primary" monitor work fine but not the others? Just curious as i said earlier i could change primary and it works fine when the old primary now has the craziness.


----------



## HoneyBadger84

Quote:


> Originally Posted by *fireedo*
> 
> Thanks for your reply.
> Just now I tried using MSI Afterburner with a custom fan speed; the GPUs are at a safe temperature (around 60-65 °C) but the backplate is really hot, maybe reaching 100 °C (I don't have a tool for measuring backplate temp). For sure it gets really hot, so scary.
> Maybe the VRMs are too hot, I don't know.
> 
> So as long as game performance is fine, is this condition OK?


Indeed. Does GPU-Z not display VRM temps for the 295X2? Because that's probably what's running warmer. It's not at 100 °C, surely, but it might be up around 75-80 °C or so... the VRMs on the 295X2 do run quite hot because of the shoddy heatsink they have on them (they're not cooled by the closed loop, because that would've made sense ^_^ lol)


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> Any update for you on this? Does your "primary" monitor work fine but not the others? Just curious as i said earlier i could change primary and it works fine when the old primary now has the craziness.


Quote:


> Originally Posted by *DividebyZERO*
> 
> Currently in landscape, and for me this issue only affects the other two screens. My primary set in CCC doesn't do it, and when I moved my primary to a side screen it was fine; the previous primary screen then had the frame-skipping behavior. I can try portrait also if needed; would this make any difference?


The reason I asked is that I tried landscape about a month ago and I thought it worked fine, but I only spent 5 minutes testing it. In my portrait setup, all of my monitors exhibit that behavior; it doesn't matter if it's the primary or the side monitors.


----------



## Pheozero

So, I have a quick question. Has anyone in here gone tri-fire with a R9 295X2 and a R9 290(X) while watercooling with EK blocks? If so, can anyone tell me if any of the ports happen to line up?


----------



## fireedo

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Indeed. Does GPU-Z not display VRM temps for the 295X2? Because that's probably what's running warmer. It's not at 100 °C, surely, but it might be up around 75-80 °C or so... the VRMs on the 295X2 do run quite hot because of the shoddy heatsink they have on them (they're not cooled by the closed loop, because that would've made sense ^_^ lol)


So far there is no temp-monitoring software that can show the VRM temperature...
Quote:


> Originally Posted by *Pheozero*
> 
> So, I have a quick question. Has anyone in here gone tri-fire with a R9 295X2 and a R9 290(X) while watercooling with EK blocks? If so, can anyone tell me if any of the ports happen to line up?


Quoted from the EKWB site about the 295 block:
Quote:


> PLEASE NOTE:
> - Fittings are not included! Due to immense variety of fittings/barbs available on market and no prescribed standards, we guarantee compatibility only for connectors bought from our web site.
> *- Water block ports do not align with any other EK-FC water block.*
> - The original factory provided backplate cannot be re-used with this water block without modification!


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> The reason I asked is that I tried landscape about a month ago and I thought it worked fine, but I only spent 5 minutes testing it. In my portrait setup, all of my monitors exhibit that behavior; it doesn't matter if it's the primary or the side monitors.


I will test it; maybe the issue is not the same. However, when I watched the video it looked identical. I will post back after I test.


----------



## albertokr

Hello, I have a new computer with this card and I'm having BSOD problems with it. I tried everything and nothing works. I searched the web and found this thread, but I didn't see my problem covered here. Please help.









this is my computer.

Corsair RM1000 1000W 80 Plus Gold Modular
Corsair Cooling Hydro Series H110
G.Skill Trident X DDR3 2400 PC3-19200 16GB 2x8GB CL10
Intel Core I7-4790K 4.0Ghz Box
MSI Z97 Gaming 5
Samsung U28D590D 28" LED Ultra HD
Sapphire R9 295X2 8GB GDDR5
1 ssd samsung 250gb + 1 WD Caviar Black 1TB SATA3

The problem is, when I start playing, after about 5 minutes the screen starts flickering and finally, after 10-15 seconds, it goes black. I can still hear the audio, but after a few seconds the audio freezes too and I need to do a manual reset.

I have the 4K Samsung monitor and I tried 4K, 2K, and 1080p; same result at all resolutions with the Samsung cable, and it also happens over HDMI and DVI.
After that I tried an old Full HD monitor from my last PC and it still happens...

I tried the 14.4 and 14.6 drivers, and the mobo drivers are up to date; I checked with CPU-Z and the 2 cores work fine, at normal temps.

Tried with W8.1 and W7 with The Forest, FFXIV: ARR and Minecraft; BSOD always on them.

Does anyone know what more I can do? Thanks a lot ;(


----------



## HoneyBadger84

Quote:


> Originally Posted by *albertokr*
> 
> The problem is, when I start playing, after about 5 minutes the screen starts flickering and finally, after 10-15 seconds, it goes black. I can still hear the audio, but after a few seconds the audio freezes too and I need to do a manual reset.
> 
> I have the 4K Samsung monitor and I tried 4K, 2K, and 1080p; same result at all resolutions with the Samsung cable, and it also happens over HDMI and DVI.
> After that I tried an old Full HD monitor from my last PC and it still happens...
> 
> I tried the 14.4 and 14.6 drivers, and the mobo drivers are up to date; I checked with CPU-Z and the 2 cores work fine, at normal temps.
> 
> Tried with W8.1 and W7 with The Forest, FFXIV: ARR and Minecraft; BSOD always on them.
> 
> Does anyone know what more I can do? Thanks a lot ;(


Sounds like overheating or a bad card. Have you checked GPU-Z to see if the GPU or VRM temps are getting too high? I don't think they should be working that hard in the games you're talking about. Where do you have the 295X2's radiator mounted, is it set up as an exhaust, and what are your other temperatures like (CPU, motherboard, and any others you can get; use CPUID HWMonitor)?


----------



## albertokr

The radiator is on the back part




I've downloaded CPUID HWMonitor; the first screenshot is on the Windows desktop and the second one is while playing The Forest fullscreen at 4K, about 1 minute before the BSOD.



This is CPU-Z while playing FFXIV: ARR at 4K.

Do you see anything wrong? I can't see anything; I'm no expert on these things >< Thanks again.

EDIT:

After some tries I used this configuration in CCC, just to try:





and it worked... no BSOD.

After that I tried this:



The BSOD appeared just after trying to run the game.

Could it be a PSU problem? Is there any program to check it? Thanks.


----------



## HoneyBadger84

That's the other side of the issue: it could be the PSU. Your PSU is a good brand, though; is it several years old or relatively new?


----------



## albertokr

It's new, from 1 week ago.

I tried moving the GPU to another PCIe slot; still BSOD.


----------



## crazygamer123

Quote:


> Originally Posted by *albertokr*
> 
> It's new, from 1 week ago.
> 
> I tried moving the GPU to another PCIe slot; still BSOD.


Hi,

Sorry to hear you are having such problems. It's very similar to my card's: freezing and flickering while playing games in UHD 4K. You should return it and get a replacement.


----------



## albertokr

Yes, I think I will; there's nothing more I can check :S thanks


----------



## tsm106

Quote:


> Originally Posted by *albertokr*
> 
> 
> 
> its maybe a PSU problem? is there any program to check it? thanks


The RM-series PSUs are questionable. On Newegg the reviews are littered with dead units too.

http://www.overclock.net/t/1455892/why-you-might-not-want-to-buy-a-corsair-rm-psu/0_40


----------



## albertokr

Yes, I read about the PSU and it's not the best for this, but it should be enough.

Minecraft doesn't support Crossfire; in full screen it's only using 1 core at about 600 MHz while the other stays at 150, so it's not drawing power like The Forest at 1018/1018 MHz, and it still BSODs.


----------



## 4K-HERO

The RM1000 is an excellent power supply; it can handle 83 A of current. I had it running a 295X2 with a 290 and the rest of my computer, and it did so without a hiccup. Unless you have a faulty one, you should be more than fine with that PSU.


----------



## albertokr

Thanks, then it's probably not the PSU.

I've reinstalled W8, but this time I didn't install the driver CD from the mobo, only the ones from the internet, and now I've played for about 20 mins with no BSOD.

Is it a driver problem, maybe?

I'll show you the list of drivers not installed this time; maybe you know if some of them can cause problems. Ty again

Intel chipset drivers

Realtek HD audio drivers

Sound Blaster Cinema 2

Intel ME drivers

Intel Smart Connect Technology

Super charger

Fast Boot

Command Center

Smart Utilities

Live update 6


----------



## albertokr

Nah, BSOD after trying FFXIV. Tomorrow I'll send it to the shop again.


----------



## 4K-HERO

What motherboard are you using?


----------



## albertokr

MSI z97 gaming 5


----------



## 4K-HERO

I noticed your card is a Sapphire. I had a similar problem with my Sapphire 290 Tri-X. You probably have a bad card. Try it in another PC; if the problem persists, take it back. I'd avoid Sapphire because their customer support is in Taipei; if you're unlucky enough to get a card that malfunctions now or in the future, it's a mission to get them to help you.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> The reason I asked is that I tried landscape about a month ago and I thought it worked fine, but I only spent 5 minutes testing it. In my portrait setup, all of my monitors exhibit that behavior; it doesn't matter if it's the primary or the side monitors.


Portrait gave me identical issues. However, my "primary" in CCC works fine, while, as you mentioned, yours doesn't. I have 2x DVI and 1x HDMI in use; it doesn't matter which one I choose as primary, it works fine. I am also on the 13.12 drivers. I could give the newer drivers a shot, but your results make me think it would be a waste. Does your issue reduce at all when you use 60 Hz on just three screens?


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> Portrait gave me identical issues. However, my "primary" in CCC works fine, while, as you mentioned, yours doesn't. I have 2x DVI and 1x HDMI in use; it doesn't matter which one I choose as primary, it works fine. I am also on the 13.12 drivers. I could give the newer drivers a shot, but your results make me think it would be a waste. Does your issue reduce at all when you use 60 Hz on just three screens?


A little. If I set the refresh to 60 Hz and turn on vsync, it reduces it greatly. However, if the game falls below 60 fps, you get that stuttering/tearing. The issue appears when the fps falls below your refresh rate. That's why you don't get the issue in the AVP DX11 bench: if you monitor the frame rate in AVP, it's always above your refresh rate.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> A little. If I set the refresh to 60 Hz and turn on vsync, it reduces it greatly. However, if the game falls below 60 fps, you get that stuttering/tearing. The issue appears when the fps falls below your refresh rate. That's why you don't get the issue in the AVP DX11 bench: if you monitor the frame rate in AVP, it's always above your refresh rate.


The only exception to this has been Tomb Raider; it stays above 120 fps and I still get the issue. I will experiment more with games and see, by lowering settings, whether staying above 120 fps helps at all. Tomb Raider doesn't benefit so far.

Tried Borderlands 2 and Skyrim, way over 120 fps (250 fps in Borderlands 2), and the issue is there, just less noticeable. I have no clue; this is just bizarre.


----------



## DividebyZERO

Quote:


> Originally Posted by *DividebyZERO*
> 
> The only exception to this has been Tomb Raider; it stays above 120 fps and I still get the issue. I will experiment more with games and see, by lowering settings, whether staying above 120 fps helps at all. Tomb Raider doesn't benefit so far.
> 
> Tried Borderlands 2 and Skyrim, way over 120 fps (250 fps in Borderlands 2), and the issue is there, just less noticeable. I have no clue; this is just bizarre.


I gave the 14.6 RC2 drivers a whirl and they are a HUGE improvement; hopefully the same goes for anyone having an issue like the one in *axiumone*'s video. Thanks guys.


----------



## albertokr

Quote:


> Originally Posted by *4K-HERO*
> 
> I noticed your card is a Sapphire. I had a similar problem with my Sapphire 290 Tri-X. You probably have a bad card. Try it in another PC; if the problem persists, take it back. Don't buy Sapphire, because their customer support is in Taipei. If you're unlucky enough to get a card that malfunctions now or in the future, it's a mission to get them to help you.


Thanks for the info. Btw, I'm from Spain, and here we only have the Sapphire one right now.









Well, I sent the card to the shop an hour ago; they will send me a new one.

Thanks again.


----------



## 4K-HERO

Did you end up trying it in another PC?


----------



## albertokr

Can't; I don't have another PC nearby with enough power to check it.


----------



## crazygamer123

How about trying another card in the slot? If that one works, then the R9 is a dead card.


----------



## albertokr

Yes, I should have tried with my old 4870X2, but it's OK; I sent in the full PC, since it was built at the shop, and they will check the problem. Ty.


----------



## Satchmo0016

Hey, I also have a Sapphire 295X2. I can boot into Windows, but the system just shuts off or freezes after a few seconds, with no BSOD or anything. Also, the radiator fan doesn't seem to be coming on at all; does it only come on when the GPU scales up to 3D speeds, or should it be on all the time? Also, the VRM fan is really, really loud (like it's forever at 100%). Is that normal?


----------



## crazygamer123

Quote:


> Originally Posted by *Satchmo0016*
> 
> Hey, I also have a Sapphire 295X2. I can boot into Windows, but the system just shuts off or freezes after a few seconds, with no BSOD or anything. Also, the radiator fan doesn't seem to be coming on at all; does it only come on when the GPU scales up to 3D speeds, or should it be on all the time? Also, the VRM fan is really, really loud (like it's forever at 100%). Is that normal?


No, that's a problem with your Sapphire card; the radiator fan should be on all the time. Besides that, my VRM fan went very loud once, and it turned out I had a bad card; the motherboard somehow didn't detect it.

Can't believe how many people are having problems with the Sapphire 295X2.


----------



## Satchmo0016

Quote:


> Originally Posted by *crazygamer123*
> 
> No, that's a problem with your Sapphire card; the radiator fan should be on all the time. Besides that, my VRM fan went very loud once, and it turned out I had a bad card; the motherboard somehow didn't detect it.
> 
> Can't believe how many people are having problems with the Sapphire 295X2.


Yeah.. I used to swear by Sapphire and ATI cards. It took me 4 Vapor-X cards to get one that didn't have something wrong with it. I'm done with AMD, and possibly Newegg too. I think they sent me a card that was meant to go get refurbished..


----------



## crazygamer123

Quote:


> Originally Posted by *Satchmo0016*
> 
> Yeah.. I used to swear by Sapphire and ATI cards. It took me 4 Vapor-X cards to get one that didn't have something wrong with it. I'm done with AMD, and possibly Newegg too. I think they sent me a card that was meant to go get refurbished..


Good luck RMAing the card. Go for Amazon next time.


----------



## Satchmo0016

Quote:


> Originally Posted by *crazygamer123*
> 
> Good luck RMAing the card. Go for Amazon next time.


Yeah.. it sucks that they charge tax in my state now. Also, I found it strange that my Sapphire card came in a box from China inside the Newegg box. It was opened but looked new..


----------



## axiumone

Quote:


> Originally Posted by *Satchmo0016*
> 
> Yeah.. it sucks that they charge tax in my state now. Also, I found it strange that my Sapphire card came in a box from China inside the Newegg box. It was opened but looked new..


My Sapphire was shipped the same way from Newegg: a box from China with shipping labels to Newegg, and inside was the retail box.


----------



## Satchmo0016

Quote:


> Originally Posted by *axiumone*
> 
> My Sapphire was shipped the same way from Newegg: a box from China with shipping labels to Newegg, and inside was the retail box.


And everything was fine with it?


----------



## crazygamer123

Quote:


> Originally Posted by *Satchmo0016*
> 
> And everything was fine with it?


Should have gone for Asus or MSI in the first place. Lesson learned.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Satchmo0016*
> 
> Yeah.. I used to swear by Sapphire and ATI cards. It took me 4 Vapor-X cards to get one that didn't have something wrong with it. I'm done with AMD, and possibly Newegg too. I think they sent me a card that was meant to go get refurbished..


I wouldn't exactly blame AMD for a large batch of Sapphire's aftermarket cards going bad, especially if we're talking about Vapor-X cards; those are pretty customized, with a lot of work done after receiving the chip from AMD. Sapphire in general does add things to pretty much all of their higher-end cards, and that can result in a higher incidence of failures, similar to what's seen in the Gigabyte WindForce cards: a reference PCB with a ton of hardware added on by Gigabyte, which of course leads to a higher-than-normal incidence of cards with "issues" that require RMAing.

I don't know how much "work" Sapphire did, if any, on their R9 295X2s, but the majority of the bad R9 295X2 stories I've seen so far have involved Sapphire cards.


----------



## axiumone

Quote:


> Originally Posted by *Satchmo0016*
> 
> And everything was fine with it?


Yeah, the card itself is fine. It's the drivers I have issues with.









Quote:


> Originally Posted by *crazygamer123*
> 
> Should have gone for Asus or MSI in the first place. Lesson learned.


I absolutely agree. I would typically never buy a Sapphire card because of their super-strict warranty policy; MSI and Gigabyte are my top choices. However, Sapphire manufactures the cards for almost all of the other brands, so the Sapphire card was the first one available for sale; the MSI didn't become available until weeks after the official launch.


----------



## electro2u

The drivers really are a complete mess. It was a huge mistake getting a 295X2 instead of two 290s. Now that I'm stuck with it, I'll learn to work with it as best I can, but it's a terribly temperamental card. The game I most wanted to use it for, FFXIV, doesn't use both GPUs on Win 8.1, making this a $1500 290X. It works fine in Windows 7. Square Enix doesn't care. AMD doesn't care. MSI doesn't care.


----------



## crazygamer123

They should release a new driver. I got screen freezing in Thief and couldn't do anything except hold down the power button and restart my computer. It's also a game from Square Enix, and one that supports Mantle and Crossfire.


----------



## sugarhell

Quote:


> Originally Posted by *electro2u*
> 
> The drivers really are a complete mess. It was a huge mistake getting a 295X2 instead of two 290s. Now that I'm stuck with it, I'll learn to work with it as best I can, but it's a terribly temperamental card. The game I most wanted to use it for, FFXIV, doesn't use both GPUs on Win 8.1, making this a $1500 290X. It works fine in Windows 7. Square Enix doesn't care. AMD doesn't care. MSI doesn't care.


If the game doesn't support multiple GPUs, then there's nothing NVIDIA or AMD can do about it.

IIRC, I tried FFXIV and Crossfire was working. Not perfect scaling, because of the CPU bottleneck, but it was working. Check whether you're running in borderless windowed mode, because Crossfire only works in fullscreen.


----------



## Zaxis01

What's everyone's overclock on this card?

I was able to achieve 1100MHz core and 1625MHz VRAM at +0 vcore, but no matter how much voltage I add, I can't go much higher than that.

Is there a limit set with this card?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Zaxis01*
> 
> What's everyone's overclock on this card?
> 
> I was able to achieve 1100MHz core and 1625MHz VRAM at +0 vcore, but no matter how much voltage I add, I can't go much higher than that.
> 
> Is there a limit set with this card?


You're probably getting close to the thermal limit, or to the power limit on these cards; it sucks that they only have two 8-pin connectors for two GPUs. OCing without a full-coverage aftermarket liquid block is just not a good idea because of the VRM temps.


----------



## Zaxis01

I have a Koolance waterblock installed.

My temps max out around 50-55°C.

I think it might be a voltage limit.


----------



## HoneyBadger84

If you're not hitting high thermals on the GPU core or VRMs, it's probably the power limit of the card, or a driver- or hardware-imposed voltage wall of some sort.


----------



## electro2u

Quote:


> Originally Posted by *sugarhell*
> 
> If the game doesn't support multiple GPUs, then there's nothing NVIDIA or AMD can do about it.
> 
> IIRC, I tried FFXIV and Crossfire was working. Not perfect scaling, because of the CPU bottleneck, but it was working. Check whether you're running in borderless windowed mode, because Crossfire only works in fullscreen.


Like I said, both GPUs on my 295X2 work fine with this game in Windows 7, and I've of course put the game in fullscreen. I have the same issue with Unigine Valley in DX9. AMD doesn't officially support DX9, and Square Enix doesn't officially support Windows 8.1 for FFXIV.

Win7 has started to really suck, with the taskbar sitting on top of almost everything, including fullscreen YouTube. The 295X2 is "working as intended," but it's a major pain in the butt. I've tested it on two motherboards (MSI and Asus) and three different BIOSes.

I'll keep trying. I'm sure I missed something somewhere, because in DX11 fullscreen titles it kicks tail.

Also, who mentioned NVIDIA?


----------



## axiumone

Quote:


> Originally Posted by *electro2u*
> 
> Like I said, both GPUs on my 295X2 work fine with this game in Windows 7, and I've of course put the game in fullscreen. I have the same issue with Unigine Valley in DX9. AMD doesn't officially support DX9, and Square Enix doesn't officially support Windows 8.1 for FFXIV.
> 
> Win7 has started to really suck, with the taskbar sitting on top of almost everything, including fullscreen YouTube. The 295X2 is "working as intended," but it's a major pain in the butt. I've tested it on two motherboards (MSI and Asus) and three different BIOSes.
> 
> I'll keep trying. I'm sure I missed something somewhere, because in DX11 fullscreen titles it kicks tail.
> 
> Also, who mentioned NVIDIA?


Pretty strange. I've never had issues with Crossfire and FFXIV; it always shows all four of my GPUs being used in Win 8.1. That's with both my 295X2 setup and my previous quadfire 290 setup. The game is massively CPU-limited, but that's another story. I think it's something on your end.


----------



## electro2u

Quote:


> Originally Posted by *axiumone*
> 
> Pretty strange. I've never had issues with Crossfire and FFXIV; it always shows all four of my GPUs being used in Win 8.1. That's with both my 295X2 setup and my previous quadfire 290 setup. The game is massively CPU-limited, but that's another story. I think it's something on your end.


Like what, though? I don't really think there's anything wrong with the card, and other games work fine with it in Win 8.1.
First off, both my GPUs are at 100% utilization, but one is stuck at 300MHz with this game. Also, it works fine ONCE, and then once you exit it stops working. I'm not sure how I'd be causing that, but changing just the *name* of the FFXIV folder will fix it ONCE; then I have to change the name again.
Lots of people have weird issues with FFXIV that they can't fix. It's kind of funny that I solve NVIDIA problems with this game for people elsewhere, but no one can solve my problems with the 295X2.
I'm glad your setup works for you, but I only have one 295X2, and it uses only one GPU in FFXIV in Windows 8.1 but both in Windows 7. Perhaps I'm installing Windows 8.1 incorrectly, or using the wrong version of DX9?

I also find it interesting that Unigine Valley exhibits the exact same behavior in both windows 7 and 8.1 when using the DX9 engine.

Perhaps our 295X2s aren't EXACTLY identical? Mine is an MSI, and I've tried three different BIOS sets on it; they all act the same way with this software.


----------



## albertokr

Yes, same as you with my Sapphire: on W8.1 both GPUs were at 100%, and after a restart only one was at 1018MHz, the other at 300MHz.


----------



## 4K-HERO

Has anyone figured out how to control the VRM fan yet? Does the VRM fan already spin at 100%, or can it go faster? I still can't find anything on this subject.


----------



## fireedo

Quote:


> Originally Posted by *4K-HERO*
> 
> Has anyone figured out how to control the VRM fan yet? Does the VRM fan already spin at 100%, or can it go faster? I still can't find anything on this subject.


Have you tried MSI Afterburner 3.0.1? Yes, it can go way faster... but more noise for sure.


----------



## electro2u

Quote:


> Originally Posted by *albertokr*
> 
> Yes, same as you with my Sapphire: on W8.1 both GPUs were at 100%, and after a restart only one was at 1018MHz, the other at 300MHz.


OMG, I had begun to wonder if it was only me. Someone else! Thank you so much, though I'm really sorry you have this problem too. It's something specific to FFXIV and perhaps some driver we're both using... what could it be?

What motherboard are you on, sir? Let's try to troubleshoot this. I have a partial workaround, but I HATE it. If you have the time or patience, try to confirm for me that changing the folder name of your game directory from "Final Fantasy XIV - A Realm Reborn" to ANYTHING else will again allow your card to run the game correctly. If it works, see whether it's a permanent fix or works only once again. I think if this single issue didn't exist, I wouldn't bother with a waterblock for my 295X2.
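For anyone who'd rather script that rename trick than do it by hand every session, here's a minimal sketch in Python (the install path and the `-cf` suffix are just placeholders I made up; point it at wherever your client actually lives, and note you'd still need to repoint the launcher at the renamed folder):

```python
from pathlib import Path

def toggle_game_folder(install_dir: str) -> Path:
    """Toggle a '-cf' suffix on the game folder so the driver sees a
    'new' executable path on the next launch. Run once before playing,
    then again before the next session to flip the name back."""
    src = Path(install_dir)
    if src.name.endswith("-cf"):
        dst = src.with_name(src.name[: -len("-cf")])
    else:
        dst = src.with_name(src.name + "-cf")
    src.rename(dst)  # fails cleanly if the game or launcher still has files open
    return dst

# Example (hypothetical install location):
# toggle_game_folder(r"C:\Games\Final Fantasy XIV - A Realm Reborn")
```

It's only a partial workaround, same as renaming by hand: the fix lasts one launch, so the script just saves a trip into Explorer.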

When I realized I was stuck with a 295X2 that didn't work with the only game I really *need* to play right now, I decided I would get a 290X for TriFire. This is stupid too, but my wife plays and she wants company; I don't even like video games that much anymore. I haven't yet gotten to try them together, but they both work. I wasn't going to be mean to these beautiful cards and run them in TriFire on air + water, so I am now the proud owner of Kryographics blocks for both the 295X2 and the 290X, and I'm slowly getting ready to put my loop together. I've also purchased a new mobo with a PLX chip on it so I can do 16x/16x, but my MSI XPower AC mobo... HATES my MSI 295X2, lol.

They don't make very many good games anymore with graphics HQ enough to satisfy me. FFXIV is gorgeous and runs like a champ for a multi-platform port, imo. I can run it on a single 1440p monitor at 120FPS, vsynced, with just a few changes to the maximum graphics settings on this card... if CF is working properly. Otherwise I get pretty much exactly half the FPS. I'm positive that if I had two 290Xs I wouldn't have this problem. I thought the PLX chip on the motherboard might help, but not so far. I have one more experiment to try: the XPower AC Z97 is supposed to use the #2 PCIe slot for single-card configurations, but it wouldn't give me any signal when I installed the card there. I got up and running with the card in the first slot, and that's the one I've been using since. I'm going to try going back to the second slot; perhaps the BIOS just needed a kick in the ass.

It's pretty much like this: half the time the board boots, it's running in slow mode, and in the BIOS the mouse moves, freezes, moves, freezes. I don't know if it's the board itself, or the board acting funny because of the 295X2.

The stock BIOS that comes with these cards is peculiar, too. It's not UEFI, but it's sort of a hybrid UEFI, so CSM (the Compatibility Support Module on your Z87 or Z97 mobo) needs to be on, and then you still get the ASUS or MSI logo instead of the Microsoft flag in 8.1. I asked MSI about a UEFI BIOS for the card, and they coughed it up quickly too; I can submit it if anyone wants it. It works without CSM, but it didn't help with FFXIV, obviously.


----------



## electro2u

Quote:


> Originally Posted by *4K-HERO*
> 
> Has anyone figured out how to control the vrm fan yet? Does the vrm fan spin at 100% already or can it go faster? I still cant find anything on this subject.


Not me. I'm using AB 3.0.1 too, and I've tried every setting in it. Maybe some of these people have different BIOS features. Can we start submitting BIOSes in txt form, please? We have BIOS switches on these cards, and as far as I know there is zero risk in switching out 295X2 BIOSes. I'm on the Sapphire OC BIOS right now; it runs nice and hot for being such a slight bump. I've got the legacy and UEFI MSI BIOSes as well, and none of them allow me to adjust the VRM fan at all, which sucks. I'm wondering if the people who can adjust their VRM fan have different/better BIOSes; I've now seen two people in this thread who said they could, and I have no reason to doubt them. It's definitely a problem, because when my card throttles, it's mostly not because of the GPU temps, and it's very slight throttling. That leads me to believe the VRMs are causing the throttling, and I wonder what temp they're at. Later today I think I'll disassemble the card and put some Fujipoly Extreme on my VRMs.


----------



## axiumone

My take on this, from what I've read: the folks who can adjust the VRM fan are running a 295X2 and a 290X in TriFire. For some reason, that combination of cards may unlock the VRM fan adjustment in Afterburner.


----------



## albertokr

Quote:


> Originally Posted by *electro2u*
> 
> OMG, I had begun to wonder if it was only me. Someone else! Thank you so much, though I'm really sorry you have this problem too. It's something specific to FFXIV and perhaps some driver we're both using... what could it be?
> 
> What motherboard are you on, sir? Let's try to troubleshoot this. I have a partial workaround, but I HATE it. If you have the time or patience, try to confirm for me that changing the folder name of your game directory from "Final Fantasy XIV - A Realm Reborn" to ANYTHING else will again allow your card to run the game correctly. If it works, see whether it's a permanent fix or works only once again. I think if this single issue didn't exist, I wouldn't bother with a waterblock for my 295X2.
> 
> When I realized I was stuck with a 295X2 that didn't work with the only game I really *need* to play right now, I decided I would get a 290X for TriFire. This is stupid too, but my wife plays and she wants company; I don't even like video games that much anymore. I haven't yet gotten to try them together, but they both work. I wasn't going to be mean to these beautiful cards and run them in TriFire on air + water, so I am now the proud owner of Kryographics blocks for both the 295X2 and the 290X, and I'm slowly getting ready to put my loop together. I've also purchased a new mobo with a PLX chip on it so I can do 16x/16x, but my MSI XPower AC mobo... HATES my MSI 295X2, lol.
> 
> They don't make very many good games anymore with graphics HQ enough to satisfy me. FFXIV is gorgeous and runs like a champ for a multi-platform port, imo. I can run it on a single 1440p monitor at 120FPS, vsynced, with just a few changes to the maximum graphics settings on this card... if CF is working properly. Otherwise I get pretty much exactly half the FPS. I'm positive that if I had two 290Xs I wouldn't have this problem. I thought the PLX chip on the motherboard might help, but not so far. I have one more experiment to try: the XPower AC Z97 is supposed to use the #2 PCIe slot for single-card configurations, but it wouldn't give me any signal when I installed the card there. I got up and running with the card in the first slot, and that's the one I've been using since. I'm going to try going back to the second slot; perhaps the BIOS just needed a kick in the ass.
> 
> It's pretty much like this: half the time the board boots, it's running in slow mode, and in the BIOS the mouse moves, freezes, moves, freezes. I don't know if it's the board itself, or the board acting funny because of the 295X2.
> 
> The stock BIOS that comes with these cards is peculiar, too. It's not UEFI, but it's sort of a hybrid UEFI, so CSM (the Compatibility Support Module on your Z87 or Z97 mobo) needs to be on, and then you still get the ASUS or MSI logo instead of the Microsoft flag in 8.1. I asked MSI about a UEFI BIOS for the card, and they coughed it up quickly too; I can submit it if anyone wants it. It works without CSM, but it didn't help with FFXIV, obviously.


I'm using an MSI Z97 Gaming 5. Unfortunately, I sent the PC to the shop to get it fixed, so I can't try anything more.


----------



## electro2u

Quote:


> Originally Posted by *axiumone*
> 
> My take on this, from what I've read: the folks who can adjust the VRM fan are running a 295X2 and a 290X in TriFire. For some reason, that combination of cards may unlock the VRM fan adjustment in Afterburner.


That sounds spot on. These are the kinds of observations I bet AMD's software people could make good use of to fix these dumb issues (seriously? The VRM fan runs at 100% until you get the drivers installed, and then it's... pretty much not running). It's very quiet at the stock setting once the drivers kick in, but when the system boots, the VRM fan on my 295X2 kicks up to 100% and it's LOUD. A dynamic fan curve would definitely help with people's throttling issues...

Can the fan(s) on the rad even be controlled? I can't hear my Noctua NF-P12s on it. I have them both running off the card using a fan splitter, but I think they may be stuck at low speed.


----------



## rdr09

Quote:


> Originally Posted by *electro2u*
> 
> That sounds spot on. These are the kinds of observations I bet AMD's software people could make good use of to fix these dumb issues (seriously? The VRM fan runs at 100% until you get the drivers installed, and then it's... pretty much not running). It's very quiet at the stock setting once the drivers kick in, but when the system boots, the VRM fan on my 295X2 kicks up to 100% and it's LOUD. A dynamic fan curve would definitely help with people's throttling issues...
> 
> Can the fan(s) on the rad even be controlled? I can't hear my Noctua NF-P12s on it. I have them both running off the card using a fan splitter, but I think they may be stuck at low speed.


Let them hear it . . .

http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B5A


----------



## fireedo

Quote:


> Originally Posted by *axiumone*
> 
> My take on this, from what I've read: the folks who can adjust the VRM fan are running a 295X2 and a 290X in TriFire. For some reason, that combination of cards may unlock the VRM fan adjustment in Afterburner.


Maybe you're right; I forget how it behaved BEFORE I plugged in my R9 290 as a third GPU (TriFire). Anyway, this is my AB 3.0.1 screenshot:


----------



## electro2u

Quote:


> Originally Posted by *rdr09*
> 
> Let them hear it . . .
> 
> http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B5A


Oh, I have. I've been working on this for a while, and I've had a much better dialogue going about it with MSI. AMD officially doesn't support DirectX 9 and won't look into this problem, and Square Enix doesn't support Windows 8.1 for FFXIV, so it's sort of no one's problem but ours, really. That's why I came here. Some NVIDIA users have similar issues with the game as well, but my two GTX 780s in SLI worked really well. If I had done more research before buying the 295X2, I would have known that AMD has some issues with DX9 and probably stayed green. That said, the 295X2 is brilliant when it works for me, in Windows 7...

FWIW, I did solve the issue with my new MSI motherboard: the BIOS/ME needed to be updated before switching the card back to the second PCIe slot. Very tricky.


----------



## sugarhell

What do you mean, AMD officially doesn't support DirectX 9?


----------



## electro2u

Quote:


> Originally Posted by *sugarhell*
> 
> What do you mean, AMD officially doesn't support DirectX 9?


That's literally how they avoid troubleshooting the problem: they say DX9 isn't officially supported. If you have issues with a DX9 title on an AMD card, you're SOL. That doesn't mean it won't work; it means they don't care if it doesn't.


----------



## kalijaga

By any chance, could the people running a 295X2 and 290X in TriFire post the temperature difference when using manual fan curves in AB? I am hoping that AMD will release an update for the 295X2 via CCC or an app so that we can set our own fan curve. It would be useful for people running quadfire or living in high ambient temps.

Cheers.


----------



## pompss

FrozenCPU just shipped my Aquacomputer Kryographics Vesuvius full waterblock for the Radeon R9 295X2.
I will also replace my old rad fans with three JetFlo 120mm red LED fans, and swap the PrimoChill tubing for acrylic tubing with red UV coolant.
Tomorrow I want to start bending!!!!


----------



## Earth Dog

Quote:


> Originally Posted by *pompss*
> 
> FrozenCPU just shipped my Aquacomputer Kryographics Vesuvius full waterblock for the Radeon R9 295X2.
> I will also replace my old rad fans with three JetFlo 120mm red LED fans, and swap the PrimoChill tubing for acrylic tubing with red UV coolant.
> Tomorrow I want to start bending!!!!


Get colored tubing and ditch the dye. Dyes can clog loops... Watercooling 101 here.


----------



## Pheozero

Quote:


> Originally Posted by *Earth Dog*
> 
> Get colored tubing and ditch the dye.. Dyes can clog loops... Watercooling 101 here.


Errr... no, it's not the dyes that clog loops, it's the plasticizer.


----------



## Earth Dog

http://forums.anandtech.com/showthread.php?t=2170057

There are dozens of links just like it.

Is that all plasticizer and not dye, and is everything every website says wrong? (Not being a jerk... that is a serious question.)


----------



## Pheozero

Here's the thing: most of the time, if it's a quality dye, like something from Mayhems for example, it won't break down in the loop. All that crap you see in blocks is usually plasticizer, tinted whatever color you had in the system.

Unless it's a PrimoChill Dye Bomb. Don't ever use a PrimoChill Dye Bomb. Ever. My personal loop ran for up to 2 years without any of the pastel breaking down.


----------



## Earth Dog

There are exceptions to the rule, certainly; however, the generic advice is to stay the hell away, or if one insists, to check the loop a lot more frequently. To me, it's just easier to get colored tubing and NEVER have to worry about it.


----------



## pompss

Quote:


> Originally Posted by *Earth Dog*
> 
> There are exceptions to the rule, certainly; however, the generic advice is to stay the hell away, or if one insists, to check the loop a lot more frequently. To me, it's just easier to get colored tubing and NEVER have to worry about it.


I've been using Aurora for 6 months and never had a problem. I don't know about dye or pastel, but I'm pretty sure Mayhems products won't break down in my loop. I do agree with you about colored tubing instead of pastel and red UV dye, though, since with those you aren't able to see the coolant moving. That's why my first coolant was Aurora.


----------



## ViRuS2k

Yeah baby, just got this installed today. What a difference!

Currently running @ 1150/1625; max temps in Sleeping Dogs and Crysis 3 are 62°C loaded.

Unsure of the VRM temps, but I'm sure they're far better than stock. Damn, though, the backplate is scorching hot; I guess the heat transfer to the backplate I bought with the block is working.

EK FTW..

This is the waterblock that I installed, with its backplate. Using Mayhems Blood Red dye XI with a killcoil.

I went with the metal version, as I'm sick of copper, lol.
I also upgraded my motherboard, CPU, and PSU: from an i7 4770K @ 4.5 to a Devil's Canyon i7 4790K @ 4.8, from a Gigabyte Z87 OC to a Z97 SOC Force board, and from a PC Power and Cooling 950W to an OCZ ZX 1250W PSU.


----------



## Zaxis01

Any custom BIOSes yet?

If so, has anyone flashed their 295X2, and what are your results?


----------



## fireedo

Quote:


> Originally Posted by *ViRuS2k*
> 
> yeah baby just got this installed today what a difference
> 
> 
> 
> 
> 
> 
> 
> 
> currently running @1150/1625 max temps in sleeping dogs and crysis 3 62c loaded.
> 
> 
> 
> 
> 
> 
> 
> unsure of the vrm temps im sure there far better than the stocky but dam the backplace is scorching hot i guess the heat transfer is working on the backplate to that i bought with the block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EK FTW..
> 
> the waterblock that i installed with its backplate.. useing mayhems blood red die XI with a killcoil.
> 
> went with the metal version as im sick of copper lol
> i also upgraded my motherboard and cpu and psu from i7 4770k @4.5 to devils canyon i7 4790k @4.8 from a gigabyte z87oc to a z97soc force board and from a pc power and cooling 950w to a ocz xz 1250w psu.


Darn it, so the backplate is still hot??? Scary hot?

I've already made a purchase from EKWB too...


----------



## ViRuS2k

Quote:


> Originally Posted by *fireedo*
> 
> Darn it, so the backplate is still hot??? Scary hot?
> 
> I've already made a purchase from EKWB too...


Yup, the backplate is scary hot; you could cook an egg on it for sure, lol.
These cards run scary hot to begin with. The GPUs are cool and the front VRMs are cool; it's just the backplate that's scary hot, and there's no way to cool it other than having a fan blowing onto it. They're just designed to run hot, that's all, but the block and backplate really help keep it under control, silently.

I'm honestly not that bothered about OCing it. The benefits for me are silence, a single-slot-width card, and no throttling, as the card will never get to 75°C, etc.


----------



## fireedo

Quote:


> Originally Posted by *ViRuS2k*
> 
> Yup, the backplate is scary hot; you could cook an egg on it for sure, lol.
> These cards run scary hot to begin with. The GPUs are cool and the front VRMs are cool; it's just the backplate that's scary hot, and there's no way to cool it other than having a fan blowing onto it. They're just designed to run hot, that's all, but the block and backplate really help keep it under control, silently.
> 
> I'm honestly not that bothered about OCing it. The benefits for me are silence, a single-slot-width card, and no throttling, as the card will never get to 75°C, etc.


I got throttling using the stock cooling even though the temperature never went above 70°C, and the backplate got hot as hell; that's why I decided to go for a full custom WB.

So it's still worth it if there's no throttling, even if the backplate stays hot... am I correct?


----------



## cennis

We need something like this backplate, with airflow over it...

Or the Aquacomputer active backplate that has contact with the water.


----------



## fireedo

Quote:


> Originally Posted by *cennis*
> 
> 
> 
> We need something like this backplate, with airflow over it...
> 
> Or the Aquacomputer active backplate that has contact with the water.


Wow, that's really awesome; it looks like exactly what we need to cool the backplate.


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> 
> 
> We need something like this backplate, with airflow over it...
> 
> Or the Aquacomputer active backplate that has contact with the water.


That is what we in the air-cooled heatsink world call overkill. Lol


----------



## cennis

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That is what we in the air cooled heatsink world call overkill. Lol


It's not overkill at all for the 295X2...
Mad throttling, bro.


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> It's not overkill at all for the 295X2...
> Mad throttling, bro.


I was referring to heatsinks in general, lol. If they actually took the time to design a quality heatsink and backplate for the R9 295X2, I think they could get the temps way down. It would probably end up being a 4-slot monster, though (1 slot for the backplate heatsink, 3 for the heatsink itself).


----------



## Elmy

So just a little bit of information for anyone thinking of running 2 295X2's. I finally got some time to play with mine. I was running the cards at 1100/1400 last night playing Tomb Raider with my 4770K and was drawing 1605 watts and I tripped out my AX1500i. Time to get a new case now and dual PSU's :-(


----------



## crazygamer123

Quote:


> Originally Posted by *Elmy*
> 
> So just a little bit of information for anyone thinking of running 2 295X2's. I finally got some time to play with mine. I was running the cards at 1100/1400 last night playing Tomb Raider with my 4770K and was drawing 1605 watts and I tripped out my AX1500i. Time to get a new case now and dual PSU's :-(


Very useful information. I was planning to get the AX1500i for dual R9 295X2s, but I will get the Rosewill Hercules 1600W instead.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Elmy*
> 
> So just a little bit of information for anyone thinking of running 2 295X2's. I finally got some time to play with mine. I was running the cards at 1100/1400 last night playing Tomb Raider with my 4770K and was drawing 1605 watts and I tripped out my AX1500i. Time to get a new case now and dual PSU's :-(


Quote:


> Originally Posted by *crazygamer123*
> 
> Such useful information. I am planning to get the AX1500i for dual r9 295x2. But i will get the Rosewill Hercules 1600w instead.


That really is a setup that would be best run on dual PSUs, even if it's two Antec or Corsair 1000W units. I can't believe an AX1500i tripped out at only 1605W draw. Was that from the wall? I've seen my older (3 yrs old) AX1200 hit almost 1500W draw from the wall and it didn't bat an eye. Asking one PSU to power a quad-GPU system with the GPUs overclocked is asking a bit much, unless you're on a 230V plug so you can draw that extra power.

As far as budget with quality goes, I'd go with an Antec Platinum HCP-1300 as the primary PSU, then an Antec HCP-1000 or HCP-850 as the second power supply. They have OC Link, which makes both PSUs turn on when you start the computer without having to wire them together. They are also some of the best-reviewed, highest-quality PSUs around. And spending less than $600 on quality PSUs to power your $3k worth of graphics cards seems only fair  lol


----------



## Elmy

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That really is a setup that would be best ran on Dual PSUs. even if it's 2 Antec or Corsair 1000Ws. I can't believe an AX1500i tripped out at only 1605W draw, was that from the wall? I've seen my older (3yrs old) AX1200W hit almost 1500W draw from the wall and it didn't bat an eye. Asking one PSU to power a Quad SLi system while OCed on the GPUs is asking a bit much, unless you're on a 230V plug so you can draw that extra power.
> 
> As far as budget with quality goes, I'd go with an Antec Platinum HCP-1300W as a primary PSU, then an Antec HCP-1000 or HCP-850 as the second power supply. They have OC Link, which makes both PSUs turn on when you turn on the computer, without having to wire them together. They are also some of the best reviewed, highest quality PSUs around. And spending less than $600 for quality PSUs to power your $3k worth of graphics cards seems only fair  lol


My Kill-A-Watt was showing 1605.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Elmy*
> 
> My Kill-A-Watt was showing 1605.


You sure you didn't trip the breaker in your house for the circuit the plug was on? Unless you have a bad AX1500i, there's zero reason pulling 1605W at the wall would trip it, because with efficiency counted that means the PSU was putting out between 1400 and 1430W, depending on how efficient it was being. No way it should be tripping without even being at full draw.


----------



## DividebyZERO

Quote:


> Originally Posted by *Elmy*
> 
> My Kill-A-Watt was showing 1605.


Makes me feel a bit better about my Lepa Gs1600 then. Last night I was testing the 14.7 betas at +200mV, 1200/1600, quad 290X, and my Kill-A-Watt started beeping. I looked over, saw 1880 watts, and kinda panicked, so I cancelled the benchmark and dropped the overclocks. I've regularly run 1600+ watts through it in benching and gaming with no issues yet. I have been considering dual PSUs to be safe, but I'm not sure I need them yet.


----------



## HoneyBadger84

Quote:


> Originally Posted by *DividebyZERO*
> 
> Makes me feel a bit better about my Lepa Gs1600 then, last night i was testing 14.7 betas and +200mv 1200/1600 quad 290x and my killawatt started beeping. I looked over and saw 1880watts kinda panicked. So i cancelled the benchmark and dropped the overclocks. I've regularly run 1600+ watts to it in benching and gaming some no issues yet. I have been considering dual PSU though to be safe but not sure i need it yet.


I would go dual PSU like I'm planning to when I start OCing my 4 290Xs, once I get a 4th reference card to run.

The way it was explained to me:
Quote:


> Originally Posted by *twerk*
> 
> I'm not entirely convinced the HCP-1300 would provide enough wattage, OCP kicks in at just over 1400W from my testing so you may end up triggering it with a decent overclock on your cards and CPU.
> 
> 290X @ 1.35V/1150MHz = ~422W
> 3930k @ 1.29V/4.6GHz = ~201W
> 
> (422x4)+201=1889W
> 
> Of course, you may not be overclocking the GPU's that high but it just shows how quickly power draw goes up, and that's not even a particularly extreme overclock.


----------



## Elmy

Quote:


> Originally Posted by *HoneyBadger84*
> 
> You sure you didn't trip the breaker in your house for the circuit the plug was on? Unless you have a bad Ax1500i there's zero reason pulling 1605W at the wall would trip it, cuz with efficiency counted, that means the PSU was putting out between 1400-1430W depending on how efficient it was being. No way it should be tripoing without even being at full draw.


I am actually at PDXLAN right now on a dedicated 20 amp circuit. I didn't trip the breaker... I also know this because I am a master electrician in the State of Washington.

A 20 amp breaker won't trip until you hit about 22-23 amps for a period of time, which gives you the possibility to actually draw around 2640 watts or so before the breaker will trip.

In most modern houses all circuits in the kitchen, bathrooms and laundry rooms are on 20 amp circuits. Everything else is 15 amp which is good for around 2000 watts before the breaker will trip.
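
Those figures follow from a simple rule of thumb; here's a sketch assuming ~120V mains and a ~2A sustained overload tolerance before a thermal-magnetic breaker opens (real trip curves are time- and temperature-dependent, so treat this as illustration only):

```python
LINE_VOLTAGE = 120  # typical US mains voltage (assumption)

def approx_trip_watts(breaker_amps: int, overload_amps: int = 2) -> int:
    """Approximate sustained wall draw before a breaker trips.
    Thermal-magnetic breakers tolerate a small sustained overload
    (assumed ~2 A here) before opening; real trip curves vary with time."""
    return (breaker_amps + overload_amps) * LINE_VOLTAGE

print(approx_trip_watts(20))  # 2640 -- matches the ~2640 W figure for a 20 A circuit
print(approx_trip_watts(15))  # 2040 -- close to the ~2000 W figure for a 15 A circuit
```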

I am going to install a 1U Seasonic 500W Gold rated modular power supply in my rig and use it for the CPU/motherboard, my 14 120mm fans, and my 2 D5 pumps, to help alleviate the load on the AX1500i.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Elmy*
> 
> I am actually at PDXLAN right now on a dedicated 20 amp circuit. I didn't trip the breaker... I also know this because I am a master electrician in the State of Washington....
> 
> A 20 amp breaker won't trip until you hit about 22-23 amps for a period of of time.. which gives you the possibility to actually draw around 2640 watts or so before the breaker will trip.
> 
> In most modern houses all circuits in the kitchen, bathrooms and laundry rooms are on 20 amp circuits. Everything else is 15 amp which is good for around 2000 watts before the breaker will trip.
> 
> I am going to install a 1u Seasonic 500w Gold rated modular power supply in my rig and use that for the CPU/motherboard, my 14 120mm fans , my 2 D5 pumps to help alleviate the load on the AX1500i.


My window AC unit, rated for a 115V plug and on a 15A breaker (with nothing else on it), has tripped it twice. What would you recommend I do: put it on a larger breaker, or not worry about it since it hasn't happened in days? Could it be because I'm using an extension cord (10 gauge, about 25ft) to run it to another circuit so it's on a breaker by itself?

Sorry for the off-topic tangent.

Back on topic: it is quite weird to me that an AX1500i would give up at only 1605W at the wall though. Like I said, my AX1200 has reached 1450W from the wall and didn't miss a beat (powering 4 290Xs at stock).


----------



## Synthaxx

Well, I think there is nothing weird about the AX1500i failing at 1605 watts at the wall. When you account for ~94% efficiency, 1500 watts of output works out to around 1600 watts at the wall. The AX1200 you are talking about has a Gold efficiency rating, so with efficiency between 80 and 90%, 1200 watts of output would pull between 1330 and 1500 watts at the wall.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Well I think there is nothing weird about the ax1500i failing at 1605watts at the wall. When you calculate the ~94% efficiency you will get around 1600watts. The AX1200 you are talking about has a gold efficiency rating, so when efficiency is between 80 and 90%, you will pull between 1330 watts and 1500 watts at the wall.


Uuuuh, I calculated the efficiency of the AX1500i out in an earlier post. 1605W from the wall even at 94% efficiency is 1508W, and it shouldn't be shutting off that quickly either way at that load. Also, the 1500i has about 90% efficiency at the 1500W mark per jonnyGURU: http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story4&reid=378 so my earlier estimate that 1605W at the wall equates to under 1500W to the system was pretty darn accurate, as 90% efficiency at 1605W at the plug = 1444W to the system... see what I mean about it possibly being a bad unit if it's tripping that low?

Also, my PSU is 80+ Gold, but it's from 2 1/2 years ago, before "Platinum" was a thing. It has 92% efficiency on the cold test at 1200W and 90.7% efficiency on the hot test at 1200W: http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story3&reid=189 So me seeing ~1450W at the wall means I was indeed outputting more than 1200W with no issues (~1300W if it was ~90%, even more if it was in the 92% range).
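
The wall-draw vs. DC-output relationship being argued back and forth here is a single multiplication; a quick sketch using this thread's own numbers (the 90% and 94% figures are the efficiencies quoted in the discussion, not datasheet values):

```python
def dc_output(wall_watts: float, efficiency: float) -> float:
    """DC power delivered to the system for a given wall draw and PSU efficiency."""
    return wall_watts * efficiency

# Elmy's 1605 W Kill-A-Watt reading under two efficiency assumptions:
print(int(dc_output(1605, 0.90)))  # 1444 -- under the AX1500i's 1500 W rating
print(int(dc_output(1605, 0.94)))  # 1508 -- right at the rating
```

Either way the unit was at or below its rated output when it shut down, which is why a fault (or very aggressive OCP) seems plausible.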


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Uuuuh I calculated the efficiency of the A1500i out in an earlier post. 1605W from the wall even at 94% efficiency is 1508W. It shouldn't be shutting off that quickly either way at that load. Also the 1500i has about 90% efficiency at the 1500W mark via Johnny Guru: http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story4&reid=378 so my stated estimate of 1605W at the wall equating to under 1500W of power to the system was pretty darn accurate, as 90% efficiency at 1605W at the plug = 1444W to the system... see what I mean about it possibly being a bad your of its tripping that low?
> 
> Also, my PSU is 80+ Gold but its from 2 1/2 years ago before "Platinum" was a thing. It has 92% efficiency on cold test at 1200W of power and 90.7% efficiency on hot test at 1200W of power. So me seeing ~1450W at the wall means I was indeed outputting more than 1200W of power, with no issues (~1300W if it was ~90%, even more if it was in the 92% range)


Sorry, I didn't read any reviews on the 'real' efficiency; I used the percentage advertised on Corsair's website. If his system failed at 1605 watts at the wall, then assuming the AX1500i was delivering all 1500 watts, it was running at about 93.5% efficiency. Maybe you are right and it's a bad one... I just wanted to bring some insight


----------



## electro2u

I was able to use the onboard video from my processor to unbrick one of the bios sets on my x2.


----------



## cennis

Quote:


> Originally Posted by *sugarhell*
> 
> Guys its easy to give more volts on msi.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset is 6.25 mv in hexademical. So on decimal is :16*6.25=100 mv. For 50mv you need 8. For 200mv you need 20( 20=32 on dec. So 32 * 6.25=200mv)
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save as .bat file. Eveyrtime you start this bat file msi will start with +100mv
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1)You are close to leave the sweet spot of the ref pcb vrms efficiency
> 2)These commands add 200mv on top of the 100mv offset through AB gui.That means 300mv
> 
> By default /wi command apply to current gpu only. So if you have 2 or more gpus you must use /sg command. That means the command line is something like that
> ex:MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10


I tried applying volts with this method (with /wi6 instead of /wi4) on the 295X2, but it does not work. It has always worked for me on the 290s.
The main benefit of this method is that it allows more than the +100mV that the slider is limited to. Anyone know how it works on the 295x2?
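
For reference, the offset values in the quoted post are just hexadecimal counts of Afterburner's 6.25 mV step, which is easy to verify (this only reproduces the quoted table's arithmetic; whether /wi actually takes effect on a given card is a separate question, as this thread shows):

```python
STEP_MV = 6.25  # Afterburner voltage-offset granularity, per the quoted post

def offset_mv(hex_code: str) -> float:
    """Convert the hex argument of /wi to a millivolt offset."""
    return int(hex_code, 16) * STEP_MV

for code in ("8", "10", "14", "18", "1C", "20"):
    print(code, "->", offset_mv(code), "mV")
# 8 -> 50.0, 10 -> 100.0, 14 -> 125.0, 18 -> 150.0, 1C -> 175.0, 20 -> 200.0
```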


----------



## sugarhell

Quote:


> Originally Posted by *cennis*
> 
> I tried applying volts with this method (with /wi6 instead of /wi4 to the 295x2 but it does not work. It has always worked for me on the 290s.
> The main benefit of this method is it allows more than +100mw that the slider is limited to. Anyone know how it works on the 295x2?


This doesnt work on a 295x2...


----------



## Zaxis01

I installed the latest catalyst 14.7 drivers and the control center is showing 14.6 RC.

Is anyone else having this issue?


----------



## axiumone

Quote:


> Originally Posted by *Zaxis01*
> 
> I installed the latest catalyst 14.7 drivers and the control center is showing 14.6 RC.
> 
> Is anyone else having this issue?


Normal. AMD forgot to update that.


----------



## Satchmo0016

Update on this post:
Quote:


> Originally Posted by *Satchmo0016*
> 
> Hey I also have a Sapphire 295x2. I can boot into windows but the system just shuts off or freezes after a few seconds. No BSOD or anything. Also the radiator fan doesnt seem to be coming on at all, does it only come on when the gpu scales up to 3D speeds or should it be on all the time? Also also, the VRM fan is really really loud (like its forever at 100%). Is that normal?


Tech guy at AMD said faulty cooling loop, RMA'd it, got a new one that works.

Now this new one makes a water-trickling sound, like there are bubbles in the loop (it sounds like it's coming from the pumps). I didn't hear this noise at first, but after playing DayZ for a few hours it started doing it. The radiator is in push/pull in the rear exhaust position in my case, so it's higher than the card, with the inlet and outlet at the bottom. This is the proper configuration, right?

I also tried tipping the whole tower all around, turning it on/off in different positions, and detaching the radiator to try to free the bubbles and relocate them to the top of the radiator, but no luck. Do you guys have any advice?


----------



## Zaxis01

Quote:


> Originally Posted by *axiumone*
> 
> Normal. AMD forgot to update that.


Oh! I see. Thanks for the clarification.


----------



## mparra11

Hey guys, I have never done custom water cooling but I really want to watercool my 295X2. I'm looking at getting the EK block. Does anyone have a video or write-up on how to install it? How's the performance on the EK block?


----------



## Earth Dog

I will be reviewing the EK WB for the 295X2 here very soon... There will not be a video, but likely high-level instructions on how it was installed. That won't be ready for another month, I would expect.

I have not seen any reviews, but seeing as it also cools the VRMs, and assuming you rad it properly (2x120mm at least), it should drop temps quite a bit.


----------



## mparra11

Ok sweet. I have not pulled the trigger on the block yet because I'm making sure I can do it lol. I was planning on getting a 480mm rad since I have a 900D. Doing a custom paint job on it now


----------



## electro2u

Will be installing the EK 295x2 backplate with the Aquacomputer block this week, assuming they fit together (might need to grind the screws down a little). Should have just gone with EK blocks I guess.


----------



## crazygamer123

Quote:


> Originally Posted by *Satchmo0016*
> 
> Update on this post:
> Tech guy at AMD said faulty cooling loop, RMA'd it, got a new one that works.
> 
> Now this new one makes a water trickling sound, like there's bubbles in the loop (sounds like its coming from the pumps). I didnt hear this noise at first but after playing DayZ for a few hours it started doing it. The radiator is in push/pull in the rear exhaust position in my case, so its higher than the card with inlet and outlet at the bottom. This is the proper configuration right?
> 
> I also tried tipping whole tower all around, on/off in different positions, and detached the radiator to try to free the bubbles and relocate them to the top of the radiator, but no luck. Do you guys have any advice?


From what I can tell, you installed it correctly. I have exactly the same setup (push/pull exhaust). Does the problem affect the R9 295X2's temps?


----------



## Satchmo0016

Quote:


> Originally Posted by *crazygamer123*
> 
> So far i can tell you installed it correctly. I have exactly same setup( push/pull/exhaust). Does the problem effect R9 295x2's temp?


Maybe. It depends on what's normal for temperature, but I have nothing really to compare it to. If I run Valley, after about one full pass the card is already at 75°C and throttling to 940MHz. I haven't even tried it without push/pull, but I imagine it'd just happen sooner. It shouldn't be throttling that soon, should it?


----------



## cennis

Quote:


> Originally Posted by *Satchmo0016*
> 
> Maybe. It depends on whats normal for temperature, but I have nothing really to compare it to. If I run Valley, after about one full pass the card is already at 75C and throttling to 940mhz. I haven't even tried it not in push pull but I imagine that i'd just happen sooner. It shouldnt be throttling that soon, should it?


That's not normal, I don't think. Could be a bad TIM application/mount, or it could be the loop itself.


----------



## Satchmo0016

Quote:


> Originally Posted by *cennis*
> 
> thats not normal I dont think, could be a bad tim/mount, could be the loop itself


Hmm... This one was a replacement for a defective one from Newegg (the pumps were completely broken when I got it). I wonder if they send out crappy units or previous returns as RMA replacements?


----------



## cennis

Quote:


> Originally Posted by *Satchmo0016*
> 
> Hmm... This one was a replacement for a defective one from Newegg (pumps were completely broken when I got it). I wonder if they send out crappy units or previous returns as RMA?


I've heard some horror stories about Newegg RMAs, but I can't say for certain


----------



## crazygamer123

50/50 here. If you are lucky they will send you a good replacement (most of them are refurbished or "previous returns"); if not, you will get another crappy unit.


----------



## Satchmo0016

Yeah. I think somebody at Newegg was pretty miffed that I made such a big stink about buying a $1600 paperweight the first time, because the packing slip was literally crumpled up and shoved into the box. Lol.


----------



## Necrodeath

Hi guys, I need some quick info.

I'm planning to trade my 780 for the Vesuvius. I see in shops that the Sapphire one is priced lower than the other models. Is there anything that should stop me from buying a Sapphire 295X2 and saving some cash, or should I buy another model?


----------



## joeh4384

I don't think it matters which one you get. Doesn't the Sapphire one come in a cool briefcase?


----------



## cennis

Hi guys, I have had this MSI 295X2 for a month now.

It was always good and stable at +100mV, +50 power limit, 1150MHz core, 1625MHz memory.

No crashes, nothing. A couple of times I had "VRM throttling" down to 300MHz, but I have many fans blowing at it on my test bench.

However, last night my card started freezing at those clocks while running Valley. Temperatures are definitely under control; there is no core or VRM throttling.

I reduced my CPU to stock and it still happens; I replicated the issue 5 times. The freeze does not seem to occur at stock settings. It does not lead to a BSOD or a restart; it just freezes on one frame and stays there.

Also, when it freezes, the PSU makes a clicky noise similar to when you turn on the system (certain Seasonic designs make this noise at power-on), but the fan keeps spinning.

TL;DR: clocks were stable for a month, now it starts to freeze in Valley.


----------



## crazygamer123

Quote:


> Originally Posted by *Necrodeath*
> 
> hi gys. i need a quick info
> 
> 
> 
> 
> 
> 
> 
> 
> i'm planning to change my 780 for the vesuvius.
> i see on shops that the sapphire one are at lower price in comparison to other models.
> is there anything that should stop me to buy a sapphire 295x2 and save some back or should i buy another model?


Plenty of people have been having problems with the Sapphire R9 295X2 recently (DOA cards, fans, faulty cooling loops). If I were you, I would go with Asus or MSI.
Quote:


> Originally Posted by *joeh4384*
> 
> I don't think it matters what one you get. Doesn't the sapphire one come in a cool briefcase?


Yes it does, but only the OC version with the $1599 price tag. The other versions cost $1499 and come in a normal box. IMO, it's not worth paying an extra $100 for the briefcase; you can just overclock the card yourself anytime.


----------



## Satchmo0016

Quote:


> Originally Posted by *crazygamer123*
> 
> There are plenty of people are having problems with Sapphire R9 295x2 recently (DOA, fans, fault-cooling loop). If i were you, i would go with Asus or MSI.
> 
> ...


You know, I've had a lot of problems with Sapphire cards lately. I'm wondering if they get the lower-binned or slightly defective chips from AMD for a better deal or something? My 2nd replacement's loop (Sapphire OC version) seems like it isn't even full.


----------



## electro2u

As I understand it, Sapphire has an extremely annoying RMA procedure. I think the best warranty terms on these come from MSI, ASUS and Gigabyte, though I don't know about the Clubs. The three I mentioned have 3-year warranties. The PowerColor, Sapphire and XFX 295X2s have 2-year warranties. The VisionTeks only have a 1-year warranty.

Hmm wow, the Diamond unit says it has a 5-year warranty.

I went with the MSI unit, and you can actually pull the cooler off without voiding the warranty, which is cool.


----------



## ChrisxIxCross

Currently Newegg has a deal going on with the Diamond 295 coming with a FREE 500gb evo in case anyone is interested. http://www.newegg.com/Product/Product.aspx?Item=N82E16814103131&cm_re=r9_295x2-_-14-103-131-_-Product

Other vendors also come with an SSD but only 120gb -

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=PPSSTKDPGEAVHE&icid=266871

Also, is anyone familiar with how production of these cards works? Does AMD manufacture them and send units to the vendors, who just put their stickers on them?


----------



## axiumone

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> Currently Newegg has a deal going on with the Diamond 295 coming with a FREE 500gb evo in case anyone is interested. http://www.newegg.com/Product/Product.aspx?Item=N82E16814103131&cm_re=r9_295x2-_-14-103-131-_-Product
> 
> Other vendors also come with an SSD but only 120gb -
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=PPSSTKDPGEAVHE&icid=266871
> 
> Also is anyone familiar with how the production of these cards works? Does AMD manufacture them and send off units to the vendors that just put their stickers on them?


Sapphire produces all of the cards.


----------



## ImperialOne

Each manufacturer produces its own cards within the specs set by the designer (AMD or Nvidia). Some do a better job than others, and some use better parts (e.g. Hynix memory in the Asus) than others.


----------



## DividebyZERO

Quote:


> Originally Posted by *cennis*
> 
> Hi guys, I have had this msi 295x2 for a month now,
> 
> it was always good and stable at 100mw,50 power limit, 1150mhz, 1625mhz
> 
> no crash/ nothing, a couple times i had "vrm throttling" to 300mhz but I have put many fans blowing at it on my testbench
> 
> However last night, running valley my card freezes at those clocks. Temperatures are definitely under control, no core or vrm throttling.
> 
> reduced my cpu to stock and it happens. replicated issue for 5 times.
> freeze does not seem to occur under stock settings.
> The freeze does not go to bsod or restart, just freeze at a picture and stays there
> 
> Also, when it freeze, the psu makes a clicky noise similar to when you turn on the system (certain seasonic designs make this noise when you turn on), however the fan is still spinning.
> 
> TLDR; clocks been stable for a month, start to freeze on valley


Have you changed drivers? Maybe verify that the voltage adjustments and power limit are applying to both GPUs on the card; GPU-Z might tell you that. I've had issues with my 290X CF crapping out when applying an OC with MSI AB. It turned out it wasn't upping the vcore on the 2nd card properly, and the power limit likewise. If the PSU is clicking, I would try running Valley windowed on 1 GPU with it OC'd, if you can; see if it's stable on one GPU first. Maybe also use a voltage-monitoring program for your 12V/5V/3.3V rails and see if anything is acting weird.

What PSU do you use?


----------



## HoneyBadger84

Gotta keep in mind, folks, this card can draw upwards of 500W (up to just about 600W in some tests on several review sites) by itself, at stock, so having anything less than a 1000W PSU is really not a smart thing to do. You want a good quality Antec, Corsair, EVGA, Silverstone, Seasonic or similar unit, to ensure you have overhead and aren't over-taxing your PSU. Running your unit at or near its max rated power for regular use is not a great idea in most cases.

For those of you running QuadFire with 2 295X2s... well, you should be running 2 PSUs, plain and simple, for safety reasons among other things. *shrug* That's not just my opinion; that's coming from testing sites and the like as well.


----------



## Necrodeath

So should I avoid Sapphire? Because in my country it costs a lot less than the Asus or MSI... like 300 euros less.
Btw, I have another doubt I need to resolve. I'm upgrading from a 780, and I don't know if I should get another 780 and go SLI, or sell the 780 and pick up a 295X2. Can anyone help me?
I'm worried about the drivers for the 295X2, and now you're telling me that the Sapphire ones come with a lot of problems :\..


----------



## rdr09

Quote:


> Originally Posted by *Necrodeath*
> 
> so should i avoid sapphire? caus on my country it cost lot less than the asus or msn...like 300euro less.
> btw i have another doubt i need to resolve.
> i'm upgrading from a 780.
> i dunno if i should get another 780 and going sli or sell the 780 and pick a 295x2
> anyone can help me?
> i'm worried about the drivers for the 295x2 and now u telling me that the sapphire ones comes with a lot of problems :\..


Going SLI would be cheaper and results in similar performance. What res, and what are the rest of your components (CPU, PSU, etc.)?

I recommend another 780. Not sure how much the 295 is there, but here in the US... $1500 at least.

This particular GPU, IMO, is meant for when you are space-limited and want to run high res, or if you need the ports it provides.


----------



## Necrodeath

Quote:


> Originally Posted by *rdr09*
> 
> going sli would be cheaper and results to similar performance. what rez and what are the rest of your components (cpu, psu, etc)?
> 
> i recommend another 780. not sure how much the 295 is there but here in the US . . . $1500 at least.
> 
> This particular GPU, imo, is meant for if you are space limited and want to run high rez or if you need the ports they provide.


The Sapphire 295X2 is around 1050 euros where I live; + a new PSU (1200W) = 1300 euros = $1758.
An EVGA SC ACX like the one I have in the PC atm is around 450 euros; + a new PSU (850W) = 650 euros = $879.

My main doubt is that I think the 295X2 is far more powerful than 780 SLI...
But atm I'm more concerned about the Sapphire brand and the drivers from AMD than the price.


----------



## rdr09

Quote:


> Originally Posted by *Necrodeath*
> 
> sapphire 295x2 is aroud 1050euro where i live. +new psu (1200W)= 1300 euro =1758$
> the evga sc acx that i have on the pc atm is aroud 450 euro +new psu(850W)=650 euro= 879 $
> 
> the main doubt is that i think the *295x2 is far more powerfull than a 780 sli*....
> but atm i'm a lot concerned about the sapphire bran and drivers from amd than the price..


Nope. Whatever game one setup can max, the other can too. Save your euros and just wait for the next line of GPUs coming soon.

See that Rigbuilder link in the upper right-hand corner of the page? Please fill it out.


----------



## Necrodeath

Quote:


> Originally Posted by *rdr09*
> 
> nope. whatever game one setup can max, the other can, too. save your euro and just wait for the next line of gpus coming soon.
> 
> See that Rigbuilder link in the upper righthand corner of the page? Fill it out pls.


I'm gaming at 1440p, and one 780 struggles a bit on high settings. The 295X2 seems to perform very well...


----------



## rdr09

Quote:


> Originally Posted by *Necrodeath*
> 
> i'm gaming at 1440p and 1 780 struggle a bit on hight settings. the 295x2 seems to perform very well...


780 SLI should do just as well. Don't compare single-card performance against a card with 2 GPU cores in it like the 295.

What CPU do you have?


----------



## Necrodeath

Quote:


> Originally Posted by *rdr09*
> 
> 780 sli should do just as well. do not compare a single card performance with a card with 2 gpu cores in it - 295.
> 
> what cpu do you have?


A 3570K @ 4.2, but I was planning to get a 4790K or a 4930K; dunno which one to choose between the two.
I'm worried that the 3570K could cause a bottleneck, especially with a 295X2.


----------



## cennis

Quote:


> Originally Posted by *DividebyZERO*
> 
> Have you changed Drivers? Maybe try to verify voltage adjustments and power limit is applying to both gpu's on the card? Gpuz might tell you that, I've had issues with my 290x CF crapping out when applying OC with MSI AB. It turned out that it wasn't upping the vcore on the 2nd card properly, and power limit also. If PSU is clicking i would try running valley windowed on 1 gpu WITH it oc'd if you can. Maybe see if its stable on one GPU first. Maybe use a voltage monitor program for your 12v/5v/3.3v etc and see if it's acting wierd.
> 
> What PSU do you use?


Yeah, I'm using a 1000W Platinum XFX PSU.

It should have enough capacity. I think it only happens in Valley, around 5 mins into the run.


----------



## rdr09

Quote:


> Originally Posted by *Necrodeath*
> 
> a 3570k @4.2 but i was planning ot get a 4790k or a 4930k. dunno wich one to chose between the two.
> i'm *worried that a 3570k can cause a bottleneck expecially with a 295x2*


Depends on the game or games. In multiplayer games an i7 might help push multi-GPUs better. I only have a single 290 and I leave HT off even in MP games like BF4 and C3; if I were to add another 290, I think I'd be forced to turn HT on. OC your CPU a bit more; the 780 should handle 1440p without breaking a sweat, and there's no need for much AA at that res.

Monitor your usage (CPU and GPU). You can use a simple tool like HWiNFO64 to do that.


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> yea im using 1000w platinum XFX psu
> 
> It should have enough capacity, i think it only happens in valley, around 5~ mins into the run


If it's happening in Valley, that could be the VRMs overheating (the middle of the GPU board is full of them, hence the fan being in that area). Have you run something that's more universally stressful, like 3DMark Fire Strike (specifically the graphics and combined tests)?

Having said that, I think it's POSSIBLE your issue MIGHT be amps, but it looks like that PSU should have plenty since it's single-rail. That unit is also pretty well rated in general, including in a review by jonnyGURU.


----------



## cennis

Quote:


> Originally Posted by *HoneyBadger84*
> 
> If it's happening in Valley that could be VRMs overheating (the middle of the GPU board is full of them hence the fan being in that area). Have you ran something that's more universally stressful, like 3DMark FireStrike (specifically the graphics & combined tests)?
> 
> Having said that, I think it's POSSIBLE your issue MIGHT be amps, but it looks like that PSU should have plenty since it's a single rail. That unit is also pretty well rated in general, including on a review by Johnny Guru.


Yeah, I'm worried about the VRMs too.

However, what we currently know is that VRM overheating causes ~300MHz throttling. I've encountered that before, but now, with 120mm fans blowing at the back of the card and a "red mod" AIO on the front VRMs, that shouldn't be the problem.

Each run lasts about 5 minutes without any throttling at all, then freezes soon after.

I'll try 3DMark FireStrike later, but 3DMark11 runs fine.


----------



## Earth Dog

It's not amperage... I can run the 295x2 overclocked, along with a 4930K between 4.4 and 4.7GHz, and I hit around 890W (peak) at the wall (Heaven, Valley, and FireStrike are all in that same 890W ballpark).

I won't tell you that's on a 750W PSU either... LOL! So yeah, that 1kW single-rail monster is PLENTY.
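Worth remembering when sizing off wall readings: the meter shows AC input, and the PSU only has to deliver the DC side of that. A quick back-of-the-napkin sketch (the 0.92 efficiency figure is my assumption for a Platinum unit at this kind of load, not a measured number):

```python
# Convert an at-the-wall (AC) reading into the DC load the PSU delivers,
# then see how much headroom is left on a given rating.
# 0.92 is an ASSUMED efficiency for a Platinum unit near 80-90% load.

def dc_load_watts(wall_watts, efficiency=0.92):
    return wall_watts * efficiency

def headroom_pct(wall_watts, psu_rating_watts, efficiency=0.92):
    load = dc_load_watts(wall_watts, efficiency)
    return 100.0 * (psu_rating_watts - load) / psu_rating_watts

print(round(dc_load_watts(890)))        # 890W at the wall -> ~819W DC
print(round(headroom_pct(890, 1000)))   # ~18% headroom on a 1000W unit
```

So even at that peak, a good 1000W unit still has real margin; this is why the "it's not amperage" call makes sense.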


----------



## HoneyBadger84

3DMark11 has high power draw in some tests, but the highest draws I see are typically in 3DMark FireStrike's combined test, as it loads up the CPU decently as well as both GPUs. If it passes that, you can pretty much rule out a PSU issue completely, since as far as I know there's no way Valley draws more power than that.


----------



## electro2u

Quote:


> Originally Posted by *rdr09*
> 
> 780 sli should do just as well.


Not even close. Two 780s run Valley at 1080p at about the same FPS as the 295x2 at 1440p. The 295x2 is way more powerful, mostly because you get almost full scaling from two-way CrossFire, while two-way SLI is nowhere near fully scaled.


----------



## tsm106

Hello? The 780 Ti is killing it in Valley, and you say they are nowhere near what again?


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Hello? The 780ti is killing it in Valley and you say they are no where near what again?


I think some are confused as to whether he's talking about a 780 or a 780 Ti. He's said 780, then 780 Ti, then back to 780 himself, so I'm not even sure what he has, which is why more than one person suggested he fill out the rig builder so we know what we're dealing with.

If it's a 780, I agree: 780s in SLI will get dunked by a 295x2 in most benchmarks if not all. 780 Tis, on the other hand, will beat a 295x2 handily.


----------



## Necrodeath

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Think some are confused as to whether he's talking about a 780 or 780 Ti and he's said 780 then 780 Ti then back to 780 himself, so I'm not even sure what he has, which is why more than one person suggested he fill out the rig builder so we know what we're dealing with.
> 
> If it's a 780, I agree, 780s in SLi will get dunked by a 295x2 in most benchmarks if not all, 780 Tis on the other hand will beat a 295x2 handedly.


Sorry, I'm new, so I don't know how the rig builder works. I'm learning.
Btw, I have a 780 SC from EVGA, and the dilemma was whether to add another 780 or buy a 295x2. I'm gaming at 1440p.
Reading the forum, I saw two things that are driving me away from buying one of these cards:
the first is that Sapphire cards have problems, from what I've read;
the second is the drivers :\


----------



## HoneyBadger84

At 1440p, get the 295x2 IMO: more vRAM and better performance than regular 780s in SLI. The drivers are okay as far as game support goes; the driver issues have more to do with overclocking, which will be fixed sooner or later.


----------



## Necrodeath

Quote:


> Originally Posted by *HoneyBadger84*
> 
> at 1440p get the 295x2 imo, more vRAM & better performance than regular 780s in SLi. Drivers are okay as far as game support goes, the driver issues have more to do with Overclocking, which will be fixed sooner or later.


I see your point, but I'm still not sure about the Sapphire product.
As another user said, a lot of people here have had problems with faulty GPUs, and that worries me a bit, but I'm glad to hear that the drivers work well.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Necrodeath*
> 
> i see your point. but still not sure of sapphire product.
> as another user said lots here have got problems with fault gpu and that worry me a bit but i'm nice to head that drivers works good


If you're that concerned about it, search out two non-mining R9 290Xs on eBay and get 'em cheap. That'll run ya around $600 plus shipping & kick your GTX 780 SLI idea's performance right in the bunghole.


----------



## Necrodeath

Quote:


> Originally Posted by *HoneyBadger84*
> 
> If you're that concerned about it, search out 2 non-mining R9 290Xs on EBay and get'em for cheap. That'll run ya around $600 plus shipping & kick your GTX 780 SLi idea's performance right in the bunghole.


Ahaha XD Well, I like the idea of running a single card; that's why I like the 295x2.
Where I live, the Sapphire model also costs nearly as much as two 290Xs.
Do you think my 3570K @ 4.2 would be a bottleneck with either two 290Xs or one 295x2?


----------



## HoneyBadger84

Unlikely, except in CPU-bound titles.

Another thing you could consider if you're willing to shell out $1500 or whatever it is for an R9 295x2 there, look at the PowerColor Devil 13 R9 290X II card, it's 2 R9 290Xs on a single PCB, 3 slot card, but it only has the outputs of a single 290X. It is air cooled, but runs quiet & cool, check out reviews on it. Just a note on that beast, it does require FOUR 8-pin PCI-E plugs to run it, recommended PSU is 1000W minimum.


----------



## Necrodeath

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Unlikely, except in CPU-bound titles.
> 
> Another thing you could consider if you're willing to shell out $1500 or whatever it is for an R9 295x2 there, look at the PowerColor Devil 13 R9 290X II card, it's 2 R9 290Xs on a single PCB, 3 slot card, but it only has the outputs of a single 290X. It is air cooled, but runs quiet & cool, check out reviews on it. Just a note on that beast, it does require FOUR 8-pin PCI-E plugs to run it, recommended PSU is 1000W minimum.


I will check it out!
ATM I'm considering the 295x2, with maybe a future upgrade of my 3570K to a better option. TBH, I can't find anything that tells me whether it's enough to support this GPU.
It would be really bad to spend $1500 on a GPU and be CPU limited.


----------



## electro2u

Quote:


> Originally Posted by *tsm106*
> 
> Hello? The 780ti is killing it in Valley and you say they are no where near what again?


I didn't say Ti, did I? Nope.


----------



## fireedo

Finally got my block for the 295x2 from EKWB.

Now I think it will take some time before it's up and running, since I've never built a custom water cooling loop before.


----------



## crazygamer123

Quote:


> Originally Posted by *fireedo*
> 
> finally got from EKWB for my 295x2 block
> 
> now I think it will take time before up n running since I never have experience building a custom water cooling before


Awesome kit, let's see a picture when it's done.


----------



## shadow85

Are there going to be any more factory-overclocked 295x2s coming out? And what happened to the dual-Hawaii Devil 13 290X GPU? It's been a while since I heard anything about that.


----------



## crazygamer123

Quote:


> Originally Posted by *shadow85*
> 
> Are their going to be anymore factory overclocked 295x2's coming out? And what happend to the dual hawaii devils13 290x gpu? Been awhile since i heard anything on that


Yes, Asus is going to release the Ares III, which is an R9 295x2 with a custom EK block. Not sure about its clocks ATM. The Devil 13 performs slightly faster than the R9 295x2, but it won't support 4x DP 1.2.


----------



## HoneyBadger84

Quote:


> Originally Posted by *shadow85*
> 
> Are their going to be anymore factory overclocked 295x2's coming out? And what happend to the dual hawaii devils13 290x gpu? Been awhile since i heard anything on that


http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584 is out, and since only 250 were made it's probably almost sold out, I'd assume anyway. I was gonna get one, but then I got three 290Xs new on eBay for $880.


----------



## vonalka

Add me to the club, I just put the R9 295x2 into my build






More pics here if you're interested:
http://www.overclock.net/t/1254106/cosmos-ii-i7-3960x-build/50#post_22587792


----------



## fireedo

Quote:


> Originally Posted by *crazygamer123*
> 
> Awesome kit, lets see a picture when its done.


Well, it's done







...after spending hours (since I'd never done custom water cooling before), with no leaks whatsoever. It's not a beautiful loop, but it's worth every penny.









And NOW it never throttles anymore!!!







In a room without air conditioning (ambient at 31-33°C), playing games and benchmarking, it never exceeds 66°C. (Maybe in your country, with ambient around 25°C, my result looks like garbage, but the temperature here is really hot, so this result is amazing.)








And another plus: now I have a big space between the two cards.


----------



## Satchmo0016

So is there any way to stop the 295x2 from throttling at 75°C under normal use without a custom water kit? I've tried different fans, push, pull, and push/pull, and they all throttle after 7-10 minutes of gaming.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Satchmo0016*
> 
> So is there any way to stop the 295x2 from throttling at 75c under normal use without a custom water kit? I've tried different fans, push, pull, push/pull and all throttle after 7-10 min of gaming.


It's throttling because of temps, yeah? Maybe try underclocking it slightly. Is it really warm there right now? If you're running push/pull with it blowing out of the case (so you're not filling your case with hot air), it should run decently cool. Wish I could get my hands on one of these to really test it and see what kind of temps I can get out of it, since I'm typically really good at getting cards to run cool and then passing the info along.


----------



## Satchmo0016

Quote:


> Originally Posted by *HoneyBadger84*
> 
> It's throttling because of temps yeah? Maybe try underclocking it slightly? Is it really warm there right now, if you're running push/pull with it blowing out of the case (So you're not filling your case with hot fire-air) it should run decently cool. Wish I could get my hands on one of these to really test it and see what kinda temps I can get out of it since I'm typically really good at getting cards to run cool then passing the info along.


Yeah, the only way I've had it set up is as exhaust from the case. Tried all the fan combinations too. I had to drop it to about 900MHz to stop the thermal throttling, and that's pretty unacceptable IMO.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Satchmo0016*
> 
> Yeah the only way I've had it set up is exhaust from the case. Tried all fan combinations too. I had to drop it to about 900mhz to stop throttling from temperature, and that's pretty unacceptable IMO.


I would say that's not bad at all, but I've only read a couple of reviews on this card. Maybe undervolting it slightly and setting the core speed to 1000MHz or 975MHz would lessen or eliminate the throttling altogether... not sure how much voltage you'd want to take off for that small a drop. Probably 10-20mV is safe to say.


----------



## Jpmboy

what's the best card to tri-fire a 295x2 with?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jpmboy*
> 
> what's the best card to tri-fire a 295x2 with?


290X. A 295x2 is basically a pair of 290Xs on one card, so it'd work almost exactly the same as TriFire with three 290Xs:


See the spec sheet above (just ignore the Devil 13's stuff).


----------



## crazygamer123

Quote:


> Originally Posted by *fireedo*
> 
> well its done
> 
> 
> 
> 
> 
> 
> 
> .... after spending hour(s) since I never do custom water cooling before and no leaking whatsoever, not a beautifull one but it is really worth it every penny
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and NOW never throttle anymore !!!
> 
> 
> 
> 
> 
> 
> 
> in a room without Air Cond (ambient @ 31-33 C ) ... playing game(s), Benchmarking never exceed 66 C (maybe on your country with ambient @ 25 C ish my result is a garbage, but here temperature is really hot, so this result is amazing
> 
> 
> 
> 
> 
> 
> 
> )


Sounds like that huge 360 radiator really did a good job.
Quote:


> Originally Posted by *Satchmo0016*
> 
> So is there any way to stop the 295x2 from throttling at 75c under normal use without a custom water kit? I've tried different fans, push, pull, push/pull and all throttle after 7-10 min of gaming.


Yes, consider moving your rig into an air-conditioned room. I have a push/pull/exhaust setup plus air-con, and the temp never goes higher than 67°C. Also, swap the stock fan for a better one. But my electric bill this month came in $50 higher than last month =.=


----------



## Satchmo0016

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I would say that's not bad at all, but I've only read a couple reviews on this card. Maybe trying something like undervolting it slightly & setting the core speed to 1000MHz or 975MHz would lessen or eliminating the throttling all together... not sure how much voltage you'd wanna take off for that small of a drop. Probably 10-20mV is safe to say.


I'll try that.

Quote:


> Originally Posted by *crazygamer123*
> 
> Yes, consider moving your rig into the air-con room. I have push/pull/exhaust setup plus air-con temp never go higher than 67 C. Also exchange the stock fan with better one. But electronic bill this month gone higher 50$ last month =.=


It's normally ~23°C in my house. I have it in push/pull with NF-F12 fans and it still gets to 75°C and throttles. I even tested it with the radiator removed from the case to make sure it's getting air-conditioned air. Is it possible that the loop isn't full or something? I can feel heat blowing off the radiator, so the cores are transferring heat to it, but for some reason the radiator stays really hot; it's measuring 65°C via infrared thermometer. I can also hear water sloshing around in it when I move it.

edit: 65°C at the inlet and about 54°C at the outlet.
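For what it's worth, that inlet/outlet spread actually implies the radiator is shedding a lot of heat. A rough estimate with Q = mdot · c · ΔT (the 0.7 L/min flow rate is purely my assumption for a small AIO pump, so treat the result as ballpark only):

```python
# Heat carried out of the coolant: Q = mdot * c * deltaT
# Flow rate of 0.7 L/min is an ASSUMED figure for a small AIO pump;
# c is the specific heat of water, deltaT from the 65C in / 54C out reading.

WATER_C = 4186.0    # J/(kg*K), specific heat of water
WATER_RHO = 1.0     # kg/L (close enough for coolant)

def radiator_watts(flow_l_per_min, t_in_c, t_out_c):
    mdot = flow_l_per_min * WATER_RHO / 60.0   # mass flow in kg/s
    return mdot * WATER_C * (t_in_c - t_out_c)

print(round(radiator_watts(0.7, 65, 54)))   # ~537 W being dumped
```

If a number in that range is right, the block-to-coolant side is working fine and the coolant itself is just running hot, which points at radiator capacity (or airflow through it) rather than a dry loop.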


----------



## crazygamer123

Quote:


> Originally Posted by *Satchmo0016*
> 
> I can feel heat blowing off the radiator so the cores are transferring heat to it, but for some reason the radiator is staying really hot, its measuring 65C via infrared thermometer. I can also hear water sloshing around in it when I move it around.
> 
> edit: 65C at the inlet and about 54C at the outlet.


I haven't noticed any water sound yet. What game are you playing when it throttles?


----------



## Satchmo0016

Quote:


> Originally Posted by *crazygamer123*
> 
> I haven't noticed any water sound yet, what game are you playing when it got throttle?


I'm seeing it happen in any game, really, even if GPU usage isn't flat-out 100%. But I've been using Heaven and Valley to test fan configs.


----------



## crazygamer123

Quote:


> Originally Posted by *Satchmo0016*
> 
> I'm seeing it happen in any game really, even if gpu usage isnt flat out 100%. But I've been using heaven and valley to test fan configs.


Might be the VRMs getting too much heat. I have a 200mm fan blowing cool air directly at the card. Also, it's very unlikely that your card throttles in every game unless there's a problem with the card itself. Are you using the fan controller that comes attached to the card? I play Payday 2 and Dark Souls 2 at temps <57°C, and Battlefield 4 (Mantle), Metro Last Light, and Thief (Mantle) at temps <67°C.


----------



## Jpmboy

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 290X. A 295x2 is basically a set of 290Xs on one card, so it'd work almost the exact same as TriFire of 3 290Xs:
> 
> 
> 
> See specsheet above (just ignore the Devil 13's stuff)


thx! Maybe I can get an old miner for cheap?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jpmboy*
> 
> thx! Maybe I can get an old miner for cheap?


IF you go after a miner, make sure you grill the seller about what temperatures it was operated at (try to find one that was kept at or below 80°C), and if it has any BIOS modifications, see if they're willing to put the stock BIOS back on.

I've had a few former mining cards that were mistreated but ran fine for me; I just had to put the stock BIOS back on them.


----------



## electro2u

So I'm still having this issue with Final Fantasy XIV and the 295x2 (in Win 8.1 only), but for anything that isn't DX9 it works fantastically... so I'm about to slap the AC block on and see if the EK backplate will fit. If anyone wants to know more about the FFXIV issue, I'm offering a $64 reward for a solution in this thread.


----------



## Jpmboy

Quote:


> Originally Posted by *HoneyBadger84*
> 
> IF you go after a miner make sure you grill the seller to see what temperatures it was operated at (try to find one that was kept at or below 80C) and if it has any sort of BIOS modifications see if they're willing to put the stock BIOS back on.
> 
> I've had a few former mining cards and they were mistreated but ran fine for me, juat had to put the stock BIOS back on them.


yeah - I'd likely tweak bios to match the base clocks on my 295x2 (flashed to OC bios).


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jpmboy*
> 
> yeah - I'd likely tweak bios to match the base clocks on my 295x2 (flashed to OC bios).


You looking for a stock reference card you can just liquid cool? One of the mods here (dizzz) has two for sale that are barely used... oh right, you already posted in his thread lol derp


----------



## caste1200

How do you know if your card is throttling?
I'm having problems with BF4; my GPU usage is all over the place.

Is that normal? I'm getting very poor performance in BF4, and when using a single GPU it runs at 100% load the whole time with almost the same FPS; only about a 10% improvement with the second GPU active.

Help!

I'm running the game on DX11 since Mantle is so unstable...


----------



## fireedo

Quote:


> Originally Posted by *caste1200*
> 
> how do you know if your card gets throttle?
> I have Im having problems with bf4, my gpu usage is very random..
> 
> is that normal? im getting a very poor performance on bf4... and when using a single gpu its running 100% of the time at 100 load and almost the same FPS... only like 10% improvement with the second gpu active..
> 
> help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> im running the game on dx11 since mantle is so unstable...


Those seem like throttling symptoms you have there... try uninstalling your graphics driver first and see what happens, but I think your card is throttling.

It throttles when the GPUs reach 74-75°C; I don't know the VRM threshold, maybe around 90-100°C when they start to throttle.


----------



## caste1200

I'm trying the latest 14.7 drivers; we'll see if that changes anything... if not, a defective VRM?


----------



## caste1200

Here are the results... max temp 65°C.

I also get BSODs sometimes, with an atikmpag.sys error.


----------



## cennis

Quote:


> Originally Posted by *caste1200*
> 
> 
> 
> here are the results..max temp 65C°
> 
> I also get BSOD sometimes with an atikmpag.sys error..


Did you disable ULPS in MSI AB?


----------



## caste1200

ULPS is disabled.


----------



## shadow85

Hmm, the Devil 13 runs hotter and uses more power than the 295x2, and from the benchmarks I've seen it's only 1-2 FPS faster in some games at 4K.


----------



## Satchmo0016

Quote:


> Originally Posted by *crazygamer123*
> 
> Might be the VRM was getting too much heats. I have a 200mm fan blow cool-air directly to the card. Also its very unlikely that your card get throttle at every games unless there is a problem with the card itself. Are you using the fan controller which attach with the card?. I play payday2, dark souls 2, temp <57 C , Battlefield 4 mantle, Metro last light, Thief mantle, temp < 67 C.


I don't think it's the card or software, but rather the cooling system. For some reason, no matter what it's doing, it always gets up to 75 degrees within a few minutes and starts throttling. Heat is getting to the radiator, because it's also really hot, but it can't get rid of it fast enough. Maybe my 290X chips are exceptionally hot for some reason? Performance before they overheat seems appropriate, though.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *vonalka*
> 
> Add me to the club, I just put the R9 295x2 into my build
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> More pic here if you are interested
> http://www.overclock.net/t/1254106/cosmos-ii-i7-3960x-build/50#post_22587792


Accepted!


----------



## caste1200

Anybody here with a 295x2 who plays Battlefield 4: can you show me how your GPU load behaves and what kind of performance you're getting?
Just so I can compare against mine and tell whether my card has a problem or the game just isn't optimized yet.

Cheers!


----------



## NavDigitalStorm

Quote:


> Originally Posted by *caste1200*
> 
> anybody here with an 295x2 here play battlefield 4 that can show me how their GPU load behaves? and what kind of performance you are getting?
> just to compare to mines if my card has a problem or is just not optimized yet..
> 
> cheers!


I'm usually getting 100% usage on both GPUs. However, after the last patch I've noticed that my second GPU fluctuates from 10% to 30%, never 100%.


----------



## caste1200

But do you have one GPU at full load? What about frame rates? In my case, when one card is at 70% the other is at 30%... and it's never stable, it's always fluctuating.

Why???????


----------



## NavDigitalStorm

Quote:


> Originally Posted by *caste1200*
> 
> but you have one GPU on fully load? what about frame rates? in my case when one card is at 70% the other one is at 30% ...and its never stable its always fluctuating..
> 
> why???????


I noticed that you have to run in full-screen to make sure you get 100% on both GPUs.

The first card is always at 100%; I'm getting about 60 FPS in BF4 at Ultra with no AA.


----------



## caste1200

.


----------



## caste1200

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I noticed that you have to run in full-screen to make sure you get 100% on both GPUs.
> 
> First card is always on 100%, getting about 60 FPS on BF4 at Ultra no AA.


I always run my games in fullscreen at 5760x1080 on an Eyefinity setup... annoying... very poor performance.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *caste1200*
> 
> I always run my games on fullscreen at 5760x1080 eyefinity setup.......annoying... very poor performance..


I haven't messed with Eyefinity. I use a single 4K display, which scaled marvelously. I'll test again tonight.


----------



## caste1200

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> I haven't messed with eyefinity. I use a single 4K display which scaled marvelously. Ill test again tonight.


Eyefinity is sweet.

Could you take a screenshot of your GPU usage during BF4 gameplay?
Cheers!


----------



## GunnzAkimbo

Has anyone tried popping off the 120mm rad and replacing it with a 140mm?

Keep the coolant, and add distilled water to top it up.

Peg the hoses shut during the swap.


----------



## HoneyBadger84

Quote:


> Originally Posted by *shadow85*
> 
> Hmm the devils 13 runs hotter than and uses more power than the 295x2, n frm the benchies i seen only has 1-2 fps better in some games at 4k.


You need more airflow then. With the vRAM speed advantage the Devil 13 has, it will get a small edge over the 295x2, but your real advantage is no throttling, unless you're letting it get too hot (I think its cap is set at 85°C).


----------



## HoneyBadger84

Quote:


> Originally Posted by *caste1200*
> 
> I always run my games on fullscreen at 5760x1080 eyefinity setup.......annoying... very poor performance..


Eyefinity resolutions are pretty harsh in some games; BF4 might be one of them (I can't test for myself as I don't have the game).


----------



## caste1200

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Eyefinity resolutions are pretty harsh in some games, might be BF4 is one of them (can't test for myself as I don't have the game)


Yeah it is, but the performance I'm getting out of this card is very bad... it feels like my old 7970 GHz Edition 6GB card...


----------



## Satchmo0016

Does anybody who has added their 295x2 to a custom loop still have the stock radiator/hoses, and maybe saved the fluid? I'm looking into whether I could set up two radiators in parallel to help with the overheating without having to go the custom-block route.


----------



## SaLX

Quote:


> Originally Posted by *4K-HERO*
> 
> the rm1000 is an excellent power supply. It can handle 83a of current. I had it running a 295x2 with a 290 and the rest of my computer. It did it without a hiccup. Unless you have a faulty one you should be more than fine with that PSU.


I'm checking out PSUs ATM... so a very good 1000W PSU (i.e. the Corsair RM1000) ought to do it for a single 295x2? I'll be OCing my CPU on an i5 board, but nothing punishing.

Many thanks for any replies.


----------



## soulwrath

Quote:


> Originally Posted by *SaLX*
> 
> I'm checking out PSU's atm..... so a very good 1000w PSU (ie the Corsair RM 1000) ought to do it for a single 295 x2? I'll be OC'ing my CPU on an i5 board, but nothing punishing.
> 
> Many thanks for any replies.


It would be a bit overkill, but it would supply more than enough.

Honestly, you'd just need 750-850W, and that's with a buffer of about 100-200W more than you need.


----------



## HoneyBadger84

Quote:


> Originally Posted by *SaLX*
> 
> I'm checking out PSU's atm..... so a very good 1000w PSU (ie the Corsair RM 1000) ought to do it for a single 295 x2? I'll be OC'ing my CPU on an i5 board, but nothing punishing.
> 
> Many thanks for any replies.


I'd recommend looking at something more reasonably priced, IMO, but the RM1000 isn't too bad price-wise. Corsair units are typically severely overpriced; I wouldn't buy one these days. The only reason I got my AX1200 back when I did was because it was the best at the time and the only single-rail 1200W unit available, which is what I wanted (3 years ago).


----------



## SaLX

Thanks for both of your replies, Badger and Wrath. I've always used Seasonic PSUs without a problem, but I've had friends' various brand-name power supplies fail over the years. Corsair used rebadged Seasonics at one time, so I happily told my friends to use them (OK, veering off topic) and they in turn had no problems.

With such a good graphics card as the 295, would you guys then happily recommend most of the well-known brands (EVGA, Cooler Master, Enermax, Silverstone, et al.), not only for design and metrics, but with reliability in mind too?

Any go-to PSUs in the 850W range?


----------



## HoneyBadger84

Quote:


> Originally Posted by *SaLX*
> 
> Thanks for both of your replies, Badger and Wrath. I've always used Seasonic PSU's without a problem, but have had friends various brand name power supplies fail over the years. Corsair used rebadged Seasonics at one time, so I happily told my friends to use them (ok veering off topic) and in turn they had no problems.
> 
> With such a good graphics card as the 295, would you guys then happily recommend most of the well known brands - EVGA, Coolemaster, Enermax, Silverstone et al, not only for design and metrics, but with reliability in mind too?


With Cooler Master and Enermax it really depends on the PSU, as some are high quality and some are trash. Silverstone is pretty much all high quality, as is EVGA. Antec is another great one.

For a 295x2, if you're never planning on QuadFire, I'd go with a strong 850-1000W unit from one of those brands and be done with it.


----------



## caste1200

Quote:


> Originally Posted by *soulwrath*
> 
> i
> would be a bit overkill but it would supply it more than enough
> 
> 
> 
> 
> 
> 
> 
> you would honestly just need a 750-850W and that is with a buffer of about +100W - 200W than you need


I've been at almost 900 watts while playing BF4 with my 295x2.


----------



## pompss

VRM max temp?
I hit 60°C; is that safe?


----------



## ImperialOne

Yes!


----------



## crazygamer123

Hi,

What software are you guys using to monitor the GPUs in-game? (full screen)


----------



## HoneyBadger84

MSI Afterburner is the only program I know of with an effective, easy-to-read in-game On-Screen Display that you can configure to show GPU temperature, load, and core/memory clocks.


----------



## caste1200

Quote:


> Originally Posted by *ImperialOne*
> 
> Yes!


Quote:


> Originally Posted by *pompss*
> 
> vrms max temp.?????/
> i hit 60 c its that safe ???


How do you get VRM temps?

Thanks


----------



## papi4baby

Quote:


> Originally Posted by *caste1200*
> 
> anybody here with an 295x2 here play battlefield 4 that can show me how their GPU load behaves? and what kind of performance you are getting?
> just to compare to mines if my card has a problem or is just not optimized yet..
> 
> cheers!


Mine keeps both GPUs at 1018MHz constant, unless there's a map change.


----------



## rdr09

Quote:


> Originally Posted by *caste1200*
> 
> yeah it is, but the performance im getting out of this card is very bad..I feel like its just my old 7970 GHZ editio 6gb card....


Check your CPU usage as well. I use HWiNFO or MSI AB. Here's an example with a single 290 and my i7 @ 4.5 with HT off in BF4 MP...


I may have to turn HT on if I add another 290.


----------



## electro2u

Been trying to track down some stuttering issues I've been having in regular applications like Chrome, just scrolling around (at 120Hz/120fps; Chrome uses up to 144fps). This was mostly happening after waking from sleep mode. I finally tracked it down: it was simply the PCIe Link State Power Management setting in the Windows power options. Turned it off and the stutter went away.


----------



## caste1200

Quote:


> Originally Posted by *rdr09*
> 
> check your cpu usage as well. i use Hwinfo or MSI AB. here is an example with a single 290 and my i7 @ 4.5 HT off in BF4 MP. . .
> 
> 
> 
> may have to turn on HT if i add another 290.


I use AB. I already posted a screenshot after an hour of gameplay: it only shows 30-70% load, something like GPU 1 at 70% and GPU 2 at 30%, never at 100% and never holding a stable load.


----------



## rdr09

Quote:


> Originally Posted by *caste1200*
> 
> I use AB, I already posted a screenshot after an hour of gameplay, only uses 30-70% load and something like GPU 1 : 70% GPU 2 : 30%, never at 100%, never keeping a stable load..


GPU load is half of the story; the other half is the CPU load.


----------



## caste1200

Quote:


> Originally Posted by *rdr09*
> 
> gpu load is half of the story. the other half is the cpu load.


Oh sorry, I didn't realize it was a C and not a G. I'll keep an eye on it. Usually all 8 of my cores are running around 80-100% while playing 64-player BF4.


----------



## caste1200

Quote:


> Originally Posted by *rdr09*
> 
> gpu load is half of the story. the other half is the cpu load.


Yup... all 8 of my cores are at 60-85% the whole time. FX-8350 at 4.5GHz.


----------



## ChrisxIxCross

Hey guys, my 295 is coming in tomorrow! I want to run mine in a push/pull config. Can anyone here confirm that you can hook two fans up with a splitter directly into the card's fan header, or did you use a fan controller?


----------



## Satchmo0016

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> Hey Guys my 295 is coming in tomorrow! I want to run mine in a push pull config, can anyone here confirm that you can hook up two fans with a splitter directly into the fan header of the card or did you use a fan controller?


It's a DC signal from the card to the fan, and I don't know what amperage the header is rated for, but one extra 12 V fan is a pretty small load, so I wouldn't worry about it. I tried it and didn't have any issues.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *ChrisxIxCross*
> 
> Hey Guys my 295 is coming in tomorrow! I want to run mine in a push pull config, can anyone here confirm that you can hook up two fans with a splitter directly into the fan header of the card or did you use a fan controller?


We prefer to have both fans connected to the 295X2's fan header, as it lets the card regulate them as needed and keeps them both running at the same speed.


----------



## romp23

Hi, I'm also very curious about using 2 fans in a push/pull config, but unsure which fans to use, and whether to use the 2-pin header or not.
I would prefer to use Noctua fans because my system is already populated with Noctua fans, but I'm not sure which spec of fan to use...

Any help would be much appreciated...


----------



## crazygamer123

Quote:


> Originally Posted by *romp23*
> 
> Hi, I'm also very curious about using 2 fans in push pull config, but unsure which fans to use, and if to use the 2 pin header or not.
> I would prefer to use Noctua fans because my system is already populated with Noctua fans but not sure which spec fans to use...
> 
> Any help would be much appreciated...


I'm using Corsair SP120 High Performance Edition fans in push/pull: one is connected directly to the R9 295X2's fan header, and the CPU_OPT fan plug powers the other (when the CPU gets hot, its fan speed ramps up too). You need a 3-pin header to connect to the R9 295X2. However, I only noticed a slight difference in temps, about 1-2 °C.


----------



## HoneyBadger84

I'd use two new high static pressure fans (Corsair SP series) on the radiator and mount the one it comes with somewhere else in your case, still hooked up to the card, so it doesn't think it has no fan attached.


----------



## Earth Dog

Here
Quote:


> Originally Posted by *NavDigitalStorm*
> 
> We prefer to have both fans connected to the 295X2's connector as it allows the card to regulate the need and has them run both at the same speed.


I used a fan controller and put both fans on the same channel; that way they both run at the same speed. I left the stock fan connected and cooling the rear of the card...


----------



## Synthaxx

Well look what I found today! These are my new 295X2s, the Sapphire OC'd version.


Spoiler: Warning: Spoiler!















Can I join the club?









Ps: sorry if I messed up the post, I really should learn to use forums


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> Well look what I found today! These are my new 295x2s, they are the Sapphire OC'ed version.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ps: sorry if I messed up the post, I really should learn to use forums


Nice! Congrats! What kind of monitor set up are you planning to run with them?


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> Nice! Congrats! What kind of monitor set up are you planning to run with them?


I ordered the PB287Q.
Maybe if I can find three 2560x1440 monitors I can buy those as well for a triple setup, but I'm quite strict when it comes to bezels...


----------



## crazygamer123

Quote:


> Originally Posted by *Synthaxx*
> 
> Well look what I found today! These are my new 295x2s, they are the Sapphire OC'ed version.
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ps: sorry if I messed up the post, I really should learn to use forums


Nice kit. You might want to consider a 1500-1600w power supply to feed these cards.


----------



## Synthaxx

Quote:


> Originally Posted by *crazygamer123*
> 
> Nice kit. You might want to consider a 1500-1600w power supply to feed these cards.


Thanks! Yeah, I'm still looking for a good PSU. I'm currently deciding between the (upcoming) EVGA 1600 G2 and the LEPA Max Platinum 1700 W (Europe only)...
The LEPA is available, but the EVGA still isn't available here.


----------



## crazygamer123

Quote:


> Originally Posted by *Synthaxx*
> 
> Thanks! Yeah i'm still looking for a good PSU. I'm currently doubting between the (upcoming) EVGA 1600 g2 or the Lepa max platinum 1700w (europe only)...
> The Lepa is available, but the evga is still not available here


The G2 is out in the US now. It costs less than the Corsair 1500 W and provides more amps.


----------



## Synthaxx

Quote:


> Originally Posted by *crazygamer123*
> 
> The G2 is out in the US now. It costs less than the corsair 1500w and provide more amps.


Yeah, the 1600 G2 is really good for the price, but I currently live in Belgium, which is in Europe, so... it's not available here yet.









On the other hand, the LEPA 1700 W is available here, with Platinum efficiency and 100 watts more for the same price.

Which one would be better for powering the 295X2 quadfire? I think the EVGA is single-rail and the LEPA is multi-rail... I really don't know much about the difference... I only know the LEPA has 30 A rails and the 295X2 requires 28 A.


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> Yeah the 1600 g2 is really good for the price, but I currenly live in Belgium which is in Europe, so ... It's not available yet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On the other hand, the lepa 1700w is available here, with platinum efficiency and 100 watts more for the same price.
> 
> Which one would be better for powering the 295x2 quadfire? I think the evga is single-rail and the lepa is multi-rail... I really don't know much about the difference ... I only know the lepa has 30amp rails and the 295x2 requires 28amps


Ah... that's not a good thing. I wouldn't trust a multi-rail PSU with my 295X2s in CrossFire. I'm pretty sure these cards can spike over 28 A of current and trip the overcurrent protection on the PSU, resulting in a random shutdown/restart.


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> Ah... that's not a good thing. I wouldnt trust a multi rail psu with my 295x2 in crossfire. I'm pretty sure that these cards can spike over 28a of current and trip the over current protection on the psu, resulting in a random shutdown/restart.


Yeah, I think you're right, and that's what I thought: 28 amps is pretty close to 30 amps, isn't it? And overclocking the 295X2 will also increase the amps, no?
I read somewhere that the overcurrent protection on the 1700 W triggers at 40-50 amps.


----------



## crazygamer123

Quote:


> Originally Posted by *Synthaxx*
> 
> Yeah, i think you're right and That's what I thought, 28 amps seems close to 30 amps isn't it? And I think overclocking the 295x2 will also increase the amps, no?
> I read somewhere that there overcurrent protection on the 1700w triggers at 40-50 amps


I would go for single rail.

One R9 295X2 needs 50 A (per AMD's recommendation), so two of them need about 100 A, plus roughly 25 A for your OCed CPU and the rest: about 125 A at peak. With some extra headroom, you'd want a PSU that can deliver at least 135 A on the +12 V rail.
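That arithmetic as a quick sketch; the 50 A per card is the AMD recommendation quoted above, while the ~8% headroom factor is my own assumption:

```python
def required_12v_amps(cards, amps_per_card=50, cpu_and_rest=25, headroom=1.08):
    """Estimate the combined +12 V amperage a PSU should supply.

    Returns (peak_amps, recommended_amps_with_headroom)."""
    peak = cards * amps_per_card + cpu_and_rest
    return peak, round(peak * headroom)

peak, recommended = required_12v_amps(cards=2)
print(peak, recommended)  # 125 A peak, 135 A with headroom
```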


----------



## DividebyZERO

Not trying to argue power supply mechanics, but I can speak for my LEPA GS1600. I use it daily on my 290X quadfire setup. I have hit 1880 watts at the wall on a Kill A Watt meter. It's never shut down or anything for me; it's been solid as a rock, so I wouldn't dismiss the LEPA so quickly. To me it's by far the best power supply I've owned.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Thanks! Yeah i'm still looking for a good PSU. I'm currently doubting between the (upcoming) EVGA 1600 g2 or the Lepa max platinum 1700w (europe only)...
> The Lepa is available, but the evga is still not available here


My suggestion would be dual Antec HCP PSUs. They have OC Link, which lets them both turn on when you start your computer without an adapter, and they're 80+ Platinum certified. Two 1000 W units, or one 1300 W and one 850 W like I'm getting, should serve you very well. It's worth the extra cost to not have to worry about popping a PSU, IMO.

Think of it this way: you spent how much on the cards? ~$550 on PSUs to power them well is worth it.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> My suggestion would be dual Antec HCP PSUs. They have OC Link which enables them to both turn on when you start your computer without an adapter, and are 80+ Platinum certified. Two 1000Ws or 1 1300W & 1 850W like I'm getting should serve you very well. Its worth the extra cost to not have to worry about popping a PSU IMO.
> 
> Think of it this way, you spent how much on the cards? ~$550 on PSUs to power them well is worth it.


Well, that is an idea, but I'd like to keep everything internal; two PSUs would require me to modify my case a lot, so I'd rather buy one 1600-1700 W (or even bigger) power supply.
I replied to a post a while back about an AX1500i failing to power two 295X2s (overclocked?) plus the system. So would 1600 W be enough?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Well that is an idea, but I'd like to keep everything internal. 2 PSUs require me to modify my case a lot, so I'd rather buy 1 1600-1700 (or even more ) power supply.
> I replied to a post a while back, about a ax1500i failing to power 2 295x2s (overclocked?) + the system. So would 1600 be enough?


If you don't plan on OCing the cards at all, then as long as the PSU has adequate amperage for them, it should be fine, yeah.

I'm going dual PSU specifically because I plan to OC test with QuadFire, purely for benchmark numbers though.

Fortunately my case has native dual PSU support:

http://www.enermax.com/home.php?fn=eng/product_a1_1_1&lv0=2&lv1=22&no=179 I have the 4x180mm side panel version.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> If you don't plan on OCing the cards AT ALL as long as the PSU has adequate amperage for them, should be fine yeah.


But the AX1500i only failed when the 295x2s were overclocked to 1100/1400. So there should be some OC headroom, no?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> But the AX1500i only failed when the 295x2s were overclocked to 1100/1400. So there should be some OC headroom, no?


1100/1400 isn't much headroom at all if you look at the stock clocks (1018/1250 if I remember right?). Amperage is more important than wattage is what I was trying to convey: a quality 1500-1600 W unit will still croak if it doesn't have the amperage to properly power those beasts... The AX1500i has plenty of amperage if memory serves; I was just making a general statement for those looking at PSUs for a single 295X2. 850 W can be plenty, as long as the PSU has the 50 A the card calls for on the rails you'll be using for it.


----------



## NavDigitalStorm

Quote:


> Originally Posted by *Synthaxx*
> 
> Well look what I found today! These are my new 295x2s, they are the Sapphire OC'ed version.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can I join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ps: sorry if I messed up the post, I really should learn to use forums


Accepted!


----------



## DividebyZERO

On my LEPA GS1600 I run 4x 290X @ 1100/1600 up to 1200/1600, anywhere from +100 mV to +200 mV depending on what I'm doing or benching. I can't see two 295X2s using more power unless you get some crazy extreme overclock. I get the idea of being more satisfied with dual PSUs, but a LEPA GS1600 or 1700 Platinum should be plenty, especially if you don't want to modify your case.

If you're going to do extreme benching, maybe dual PSUs are the better route. If you're just going to be gaming with a mild to decent OC, a LEPA will serve you well.

For an idea, I normally run my system around 1200-1600 W for extended periods of gaming. Benching, as I've said, I have reached 1800+ watts. Good luck with your decision.


----------



## HoneyBadger84

Quote:


> Originally Posted by *DividebyZERO*
> 
> On my lepa gs1600 I run 4x290x @ 1100/1600 up to 1200/1600 anywhere from +100mv /200+mv depending on what I am doing or benching. I cannot see two 295x2 using more power unless getting some crazy extreme overclock. I get the idea of being more satisfied with dual PSU's, however a lepa gs1600 or 1700 platinum should be plenty. If you don't want to change your case also.
> 
> If your going to do extreme benching maybe dual psu is a better route. If your going to just be gaming with mild to decent OC a Lepa will serve you well.
> 
> For an idea I normally run my system around 1200-1600w for extended periods of gaming. Benching as ive said I have reached 1800+watts. Good luck with your decision.


1800 at the wall? Cuz with the LEPA's efficiency 1800 at the wall is about 1600 actual, give or take a few dozen watts. So that's not too bad at all.
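The wall-to-DC conversion is just the meter reading times the PSU's efficiency. A sketch, assuming ~89% efficiency (a plausible Platinum-class figure at that load, not a measured spec):

```python
def dc_load_watts(wall_watts, efficiency=0.89):
    """DC power actually delivered to the components,
    given the AC draw measured at the wall."""
    return wall_watts * efficiency

print(round(dc_load_watts(1800)))  # roughly 1600 W delivered
```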


----------



## fireedo

With my TriFire configuration I never see above 1100 W according to my watt meter, so it's safe to say that for quadfire a LEPA 1600 W is more than enough.


----------



## HoneyBadger84

Quote:


> Originally Posted by *fireedo*
> 
> at my trifire configuration I never see above 1100 watt according to my watt meter, so save to say that if using quadfire then LEPA 1600watt is more than enough


I've seen ~1400 W draw at the wall with mild OCs on TriFire, and that was during something that's only GPU intensive, not really CPU intensive: http://www.3dmark.com/fs/2446772 . That's ~1180 W actual draw. So I'd say 1600 W is borderline if you're planning to OC the two 295X2s; you've also got to keep in mind the extra power draw of the fans on the radiators and the pumps in the closed loops.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I've seen ~1400W draw at the wall with mild OCs on TriFire, and that was during something that's only GPU intensive not really cpu intensive: http://www.3dmark.com/fs/2446772 that's ~1180W actual draw. So I'd say 1600W if your planning to OC the 2 295x2s is borderline, you gotta keep in mind the extra power draw of the fans on the radiator and pump on the closed loops too.


I calculated my recommended power supply with this calculator.
It took into account the overclocked processor: with all components at 100% load, it recommends a 1592 W PSU, with a minimum of 1542 W.
With all components at 90% load (the recommended option), I would need 1388 W minimum, so a 1438 W PSU is recommended.

When I say I'd overclock the cards, I won't go over the top with it, just a decent OC.
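For what it's worth, both pairs of figures above differ by a flat 50 W, so the calculator apparently just adds a fixed margin to the estimated load. A sketch of that inferred logic (the 50 W margin is deduced from the numbers quoted, not documented anywhere):

```python
def recommended_psu(min_load_watts, margin=50):
    """Recommended PSU rating = estimated minimum load plus a flat margin."""
    return min_load_watts + margin

# The two scenarios from the calculator: 100% load and 90% load
print(recommended_psu(1542), recommended_psu(1388))  # 1592 1438
```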


----------



## MA-Maddin

Hey all,

as you all know, the common install of fans is this:



but how about that:



wouldn't it be "cooler" for the Radeon if the radiator gets cooled by the 25°C / 77°F outer air instead of 50°C / 122°F in-case air?


----------



## romp23

Quote:


> Originally Posted by *MA-Maddin*
> 
> Hey all,
> 
> as you all know, the common install of fans is this:
> 
> 
> 
> but how about that:
> 
> 
> 
> wouldn't it be "cooler" for the Radeon if the radiator gets cooled by the 25°C / 77°F outer air instead of 50°C / 122°F in-case air?


The first configuration will work best. More fans don't necessarily mean better temps; the direction of airflow matters more.


----------



## Jeronbernal

Someone needs to buy my AX1200i and a set of Corsair sleeved black cables @_@
so I can hurry and pop this 295X2 into my ITX build ~__~ I'm dying over here lol


----------



## HoneyBadger84

Quote:


> Originally Posted by *MA-Maddin*
> 
> Hey all,
> 
> as you all know, the common install of fans is this:
> 
> 
> 
> but how about that:
> 
> 
> 
> wouldn't it be "cooler" for the Radeon if the radiator gets cooled by the 25°C / 77°F outer air instead of 50°C / 122°F in-case air?


You would then have the hot devil fire air coming out of the 295X2's radiator blowing INTO your system... so that's a no.

See here: http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/65973-amd-radeon-r9-295x2-performance-review-17.html

Make no mistake, the air coming out of the radiator is not warm, its HOT.


----------



## CriticalHit

Quote:


> Originally Posted by *MA-Maddin*
> 
> Hey all,
> 
> as you all know, the common install of fans is this:
> 
> 
> 
> but how about that:
> 
> 
> 
> wouldn't it be "cooler" for the Radeon if the radiator gets cooled by the 25°C / 77°F outer air instead of 50°C / 122°F in-case air?


if you want the rad to take in cooler air, mount it outside of your case...


----------



## HoneyBadger84

And whose case air is at 50 °C anyway? Yikes.

You could just configure your case like I have mine. Every fan you can see except the back-mounted one is intake, along with two more on the front you can't see











Edit: yes yes yes side fans needed cleaning, old picture.


----------



## Jeronbernal

Hey guys, so currently I've been sitting on a Diamond brand R9 295X2; I haven't been able to use it yet because my PSU is too large for this build.

Anyone have any idea if Diamond is a decent GPU brand? Should I bring it back and switch to an XFX? The XFX 295X2 I would have to have shipped from another store.

any input would be great.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jeronbernal*
> 
> Hey guys, so currently i've been sitting on a Diamond Brand R9 295x2, haven't been able to use it yet, due to my PSU being too large for this build.
> 
> anyone have any idea if Diamond is a decent brand GPU? should i bring it back to switch for a XFX? the XFX 295x2 i would have to have shipped from another store.
> 
> any input would be great.


Pretty sure all r9 295x2s are in reality made by the same manufacturer, so there shouldn't be any difference. I know XFX has a good warranty, not sure about Diamond. Never heard anything bad about them though.


----------



## Synthaxx

Quote:


> Originally Posted by *Jeronbernal*
> 
> Hey guys, so currently i've been sitting on a Diamond Brand R9 295x2, haven't been able to use it yet, due to my PSU being too large for this build.
> 
> anyone have any idea if Diamond is a decent brand GPU? should i bring it back to switch for a XFX? the XFX 295x2 i would have to have shipped from another store.
> 
> any input would be great.


Quote:


> Originally Posted by *HoneyBadger84*
> 
> Pretty sure all r9 295x2s are in reality made by the same manufacturer, so there shouldn't be any difference. I know XFX has a good warranty, not sure about Diamond. Never heard anything bad about them though.


Yeah, I guess the same. Same manufacturer, different sticker.
Just go to the brand you like the most (warranty, brand name, ...)


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Yeah, I guess the same. Same manufacturer, different sticker.
> Just go to the brand you like the most (warranty, brand name, ...)


If I remember right, all 295x2s are assembled by Sapphire. Or maybe that was just the first batches.


----------



## Jeronbernal

I wish that $1499 sapphire 295x2 + 500gb Samsung Evo Ssd deal was local =/


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jeronbernal*
> 
> I wish that $1499 sapphire 295x2 + 500gb Samsung Evo Ssd deal was local =/


Diamond, but yeah, just saw that. If I didn't have seven cards total (6 290Xs once the last one gets here, and 1 290) laying here (three of which I've got to resell), I'd jump on that. Just to try it out and keep the free SSD for resale.


----------



## CrisInuyasha

I bought an R9 295X2 OC from Sapphire last Friday, but I had to turn on CSM on my motherboard for it to be recognized on my current Windows installation. Is this card not fully UEFI compatible, or is there another way to make it work?


----------



## electro2u

Quote:


> Originally Posted by *CrisInuyasha*
> 
> I bought an R9 295x2 OC from Sapphire last friday, but I had to turn on the CSM on my motherboard for it to be recognized on my current Windows installation. Is this card not fully UEFI compatible, or there are any other way to make it work?


Yes, that was the first sign there was trouble in paradise! Idk about Sapphire, but MSI emailed me a UEFI BIOS and it works fine. Flashing it is a major pain, especially since the instructions they sent along actually bricked my BIOS when I followed them. That said, it was painless to un-brick it from the onboard video, but you can't really do it from the other BIOS switch. Or I couldn't: it would try, but say it couldn't read the BIOS ROM when I tried to flash the bricked BIOS from the 295X2 itself. Not sure if that makes sense or not.

I can post the BIOS if you like, but I would want a BIOS from my card's vendor out of OCD-ness. I did flash my MSI card with the Sapphire OC BIOS though, and that worked fine. AFAIK the 295X2 cards are all essentially identical hardware.


----------



## CrisInuyasha

Quote:


> Originally Posted by *electro2u*
> 
> Yes that was the first sign there was trouble in paradise! Idk about sapphire but MSI emailed me a uefi bios and it works fine. Flashing it is a major pain especially since the instructions they sent along actually bricked my bios when I followed them. That said, it was painless to un-brick it from the onboard video but you can't really do it from the other bios switch. Or I couldn't. It would try but say it couldn't read the Bios rom when I tried to flash the bricked bios from the 295x2 itself. Not sure if that makes sense or not.
> 
> Can post the bios if you like but i would want a bios from my cards vendor out of OCD-ness. I did flash my MSI card with the sapphire OC bios tho and that worked fine. Afaik the 295x2 cards are all essentially identical hw


I contacted their support and just got a response with the updated bios for it. I will give it a try and report the results here after I'm done.


----------



## CrisInuyasha

They sent me a UEFI-enabled BIOS, but for the original card, not the OC model. I asked if they have a BIOS for the OC card; now I'm waiting for the response. Anyway, if anyone is interested in the BIOS I received for the Sapphire, here is the link (btw, they only sent me the BIOS and nothing more, but I just used ATI WinFlash and it worked):

https://dl.dropboxusercontent.com/u/20405147/sapphire_r9_295x2_uefi.zip

 

C67301MU.101 for the master card
C67301SU.101 for the slave card

Edit: I noticed I lost the ability to monitor / edit voltages with this UEFI bios when using MSI Afterburner


----------



## HoneyBadger84

The fact that it's labeled ATI instead of Sapphire (like the one below) would've made me message them right away. If a card is not labeled with the specific vendor when they're known to label their BIOSes with their given brand, that's a sign of issues waiting to happen... Right now I have a Sapphire 290X that's not showing as a Sapphire, still waiting to get a proper Sapphire BIOS & flash it on to the card.



That's what it should say in the SubVendor once you get the BIOS switched over to what they send you...

My current one looks like this X_X:



I feel your pain man...

I really wanna get my hands on a R9 295x2, specifically to see how well it folds if nothing else







Maybe once I resell the 2 290Xs & 1 290 I'm not keeping for regular use... we shall see ^_^ That Diamond deal with the free 500GB Samsung EVO SSD is darn good, hopefully it'll last til I can get it.


----------



## Jazam

Hello







doing my first *full* pc build towards the end of this week. I'll be running a r9 295x2 woooo


----------



## xer0h0ur

Another lurker here posting for the first time. I just wanted to say thanks to whoever posted the Newegg offer for the Diamond 295X2 with the free 500GB SSD. I jumped all over that offer. Should be getting it in today, and hopefully I don't run into any issues dropping it into my system. I was just a bit confused about what someone mentioned about needing to enable CSM on their motherboard to get it to work? Not familiar with that acronym or what enabling it does. Other than that, are there any software/driver tweaks you fellow 295X2 owners can pass on down?

I am jumping from a 270 to this beast so I am already expecting to encounter issues but I presume that other than uninstalling the drivers and re-installing I should be okay right? Or are there any issues with uninstalling drivers I don't know about?


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> Another lurker here posting for the first time. I just wanted to say thanks to whomever had posted the newegg offer for the Diamond 295X2 with the free 500GB SSD. I jumped all over that offer. Should be getting it in today and hopefully I don't run into any issues dropping it into my system. I was just a bit confused about what someone just mentioned about needing to enable CSM on their motherboard to get it to work? Not familiar with that acronym or what enabling that does. Other than that are there any software/driver tweaks you fellow 295X2 owners can pass on down?
> 
> I am jumping from a 270 to this beast so I am already expecting to encounter issues but I presume that other than uninstalling the drivers and re-installing I should be okay right? Or are there any issues with uninstalling drivers I don't know about?


Find & Download DDU, use it between uninstalling and reinstalling the drivers. Also what are your specs? Hope you've got a beefy PSU as these things are monsters.


----------



## xer0h0ur

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Find & Download DDU, use it between uninstalling and reinstalling the drivers. Also what are your specs? Hope you've got a beefy PSU as these things are monsters.


It's an Alienware Aurora R4 ALX, but I ditched the 850 W PSU because I had zero confidence in it having strong enough 12 V rails. I couldn't find the actual specs on it anywhere to see if it was dishing out the 28 A per 8-pin connector, or the 50 A AMD recommends across both connectors.

I am running Windows 7 Ultimate with a core i7 4930K @ 4.1GHz, 32GB 4x8 Mushkin quad channel 1600MHz 7-8-8-24, 256GB Micron M550 SSD, 1TB WD VelociRaptor, ditched the POS Asetek/Alienware closed loop cooler for the Thermaltake Bigwater 760 Pro, ditched the Dell 850W PSU for an OCZ ZX1250W.


----------



## HoneyBadger84

Should be good then...


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> I was just a bit confused about what someone just mentioned about needing to enable CSM on their motherboard to get it to work? Not familiar with that acronym or what enabling that does.


CSM is the Compatibility Support Module for non-UEFI devices. Many of the 295X2s come with a legacy, non-UEFI BIOS. If your 270 was UEFI and you plug in the 295X2, the board should adjust your settings for you. You're on Windows 7, though, and generally people don't install Win7 in UEFI mode anyway; you'd probably know what CSM was if you were running Win7 in UEFI mode. It's not important, and I had an MSI tech tell me the legacy BIOS was recommended by AMD, which is why it's common. I use a UEFI BIOS on my MSI 295X2 with no issues.


----------



## xer0h0ur

Requesting permission to come aboard


----------



## crazygamer123

Quote:


> Originally Posted by *xer0h0ur*
> 
> Requesting permission to come aboard


Nice rig. What is your CPU water cooling?


----------



## NavDigitalStorm

Quote:


> Originally Posted by *xer0h0ur*
> 
> Requesting permission to come aboard


Accepted. What brand are they?


----------



## xer0h0ur

Quote:


> Originally Posted by *NavDigitalStorm*
> 
> Accepted. What brand are they?


Its Diamond 295X2 from the Newegg offer with the 500GB SSD
Quote:


> Originally Posted by *crazygamer123*
> 
> Nice rig. What is your CPU water cooling?


Thanks. It's the Thermaltake Bigwater 760 Pro. As you can tell, this case isn't exactly large enough to accommodate two 120mm radiators, so I ditched the Asetek/Alienware closed-loop cooler for that system. It actually works better than I expected, since there isn't a hell of a lot of space between the case's top vents and the radiator.


----------



## CrisInuyasha

Quote:


> Originally Posted by *xer0h0ur*
> 
> Another lurker here posting for the first time. I just wanted to say thanks to whomever had posted the newegg offer for the Diamond 295X2 with the free 500GB SSD. I jumped all over that offer. Should be getting it in today and hopefully I don't run into any issues dropping it into my system. I was just a bit confused about what someone just mentioned about needing to enable CSM on their motherboard to get it to work? Not familiar with that acronym or what enabling that does. Other than that are there any software/driver tweaks you fellow 295X2 owners can pass on down?
> 
> I am jumping from a 270 to this beast so I am already expecting to encounter issues but I presume that other than uninstalling the drivers and re-installing I should be okay right? Or are there any issues with uninstalling drivers I don't know about?


Just for reference, this is the message it will give when you plug it in and have a fully UEFI enabled and active system:


----------



## HoneyBadger84

I might be finally getting my hands on one of these soon, trying to work a deal for one. If so I'd be resellin' 1-2 more of my 290Xs after it gets here, assuming I like it enough to keep







hoping I can get'er for under $1300, but I doubt it lol


----------



## HoneyBadger84

lol Seller was all "Well one just sold for $1375 a few weeks ago so your offer is unreasonable." my reply: "Look again, that item was relisted because of non-paying bidder & ended up selling last week for $1175." And it was new in box







I keep way too good of track of these things & 290Xs on EBay... I want one of these but this guy is being a moron so I may just wait & buy one new in a couple weeks.


----------



## xer0h0ur

The only thing that sucked about the installation was that the screws Diamond gives you for the radiator are freakin miniature. I don't have any idea what the engineers were thinking on that one "Hey guys, these screws should work. I mean everyone has paper thin cases." Seems legit. So I was tearing my room apart looking for screws that were long enough to mount the radiator and only managed to find two lol. I need to buy some long screws to mount the 2nd fan on the radiator for the push/pull configuration.


----------



## electro2u

Quote:


> Originally Posted by *HoneyBadger84*
> 
> lol Seller was all "Well one just sold for $1375 a few weeks ago so your offer is unreasonable." my reply: "Look again, that item was relisted because of non-paying bidder & ended up selling last week for $1175." And it was new in box
> 
> 
> 
> 
> 
> 
> 
> I keep way too good of track of these things & 290Xs on EBay... I want one of these but this guy is being a moron so I may just wait & buy one new in a couple weeks.


I had mine up for right around your price but I have no rep here. Then I figured no one wanted it so I put a block on it and I'm having a shop put it under water....

Here's the thing though: I think I might have figured out why I couldn't get FFXIV or Unigine in DirectX 9 to work in CrossFire with it. I have a 290 here and it was doing weird stuff too. I figured out what it was after almost sending it back to Newegg.

I had Chrome set as my default browser and was using a plugin called Chromium wheel scrolling smoother.

When I uninstalled that plugin and switched my default browser to Firefox the strange behavior all of these cards have been having went away.

The issue is erratic frame rates with low-power applications like Firefox, Internet Explorer, and Chrome. I've tested a 290, a 290X, and my 295x2; all of them do this. With the 295x2, turning off PCIe link state power saving in the Windows power settings fixed the issue.

With single card configurations this fix doesn't work... But uninstalling that chromium wheel plugin did.

Could anyone verify this problem with chrome wheel smoothing addon? Preferably with a 120hz monitor. It is very smooth when scrolling web pages and then suddenly it will look like frame rate went to 15 or 30.

I've had this plugin set to auto load when I install chrome for longer than I've had any of these cards.


----------



## CrisInuyasha

Quote:


> Originally Posted by *electro2u*
> 
> I had mine up for right around your price but I have no rep here. Then I figured no one wanted it so I put a block on it and I'm having a shop put it under water....
> 
> Here's the thing though: I think I might have figured out why I couldn't get FFXIV or Unigine DirectX9 to work in crossfire with it. I have a 290 here and it was doing weird stuff too. Well, I figured out what it was after almost sending it back to Newegg.
> 
> Had chrome set to default browser and using a plugin called Chromium wheel scrolling smoother.
> 
> When I uninstalled that plugin and switched my default browser to Firefox the strange behavior all of these cards have been having went away.
> 
> The issue is erratic frame rates with low-power applications like Firefox, Internet Explorer, and Chrome. I've tested a 290, a 290X, and my 295x2; all of them do this. With the 295x2, turning off PCIe link state power saving in the Windows power settings fixed the issue.
> 
> With single card configurations this fix doesn't work... But uninstalling that chromium wheel plugin did.
> 
> Could anyone verify this problem with chrome wheel smoothing addon? I've had it set to auto load when I install chrome for longer than I've had any of these cards.


Did you try disabling the ULPS? Maybe things will get better. I used MSI Afterburner to do it, but you can do it using regedit:

http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html
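The regedit route in that guide boils down to walking the display-adapter subkeys and zeroing every `EnableUlps` value. Here's a minimal sketch of that logic using a plain dict as a stand-in for the registry tree (the real values live under `HKLM\SYSTEM\CurrentControlSet\Control\Class\{display class GUID}\00NN`); the key paths below are illustrative, not the actual ones on your machine:

```python
# Sketch of the ULPS fix: find every adapter subkey with EnableUlps set
# and flip it to 0. A dict stands in for the registry so this runs anywhere.

def disable_ulps(registry):
    """Set every EnableUlps value found to 0; return the keys that changed."""
    changed = []
    for key_path, values in registry.items():
        if values.get("EnableUlps") == 1:
            values["EnableUlps"] = 0
            changed.append(key_path)
    return changed

# Mock registry tree: two display-adapter subkeys, one with ULPS enabled.
mock_registry = {
    r"Class\{4d36e968}\0000": {"DriverDesc": "R9 295X2", "EnableUlps": 1},
    r"Class\{4d36e968}\0001": {"DriverDesc": "R9 295X2", "EnableUlps": 0},
}

print(disable_ulps(mock_registry))  # only the first subkey needs flipping
```

Against the real registry you'd do the same search-and-set with Python's `winreg` module (Windows only), or just follow the Tom's Hardware steps by hand; MSI Afterburner's ULPS toggle does the equivalent for you.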


----------



## electro2u

Quote:


> Originally Posted by *CrisInuyasha*
> 
> Did you try disabling the ULPS? Maybe things will get better. I used MSI Afterburner to do it, but you can do it using regedit:
> 
> http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html


I have a $64 reward for a solution over at [H]. The thread explains everything. I tried everything you can possibly imagine. Here's the link: http://hardforum.com/showthread.php?t=1826754&highlight=295x2
Entries are closed though, because I don't have the 295x2 to test with until next week. Also, I think I either solved it myself or the two guys who were looking at the registry with me had it about figured out.

DX9 games use Windows-on-Windows (WoW64) to emulate a 32-bit platform on a 64-bit OS. Add to that, they apparently hook into Internet Explorer to do it... start messing with the default browser or with browser extensions and add-ons, and you see where I'm going with this?


----------



## xer0h0ur

So in other words it's ghetto-rigged. Sounds about right. Thanks for the heads up about disabling ULPS though. Going to have to do that when I get off work.


----------



## xer0h0ur

I wasn't aware there was a 295X2 that came waterblocked from the factory with a warranty but I just saw this https://www.visiontekproducts.com/index.php/graphics-cards/visiontek-cryovenom-r9-295x2-detail#specifications


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> I wasn't aware there was a 295X2 that came waterblocked from the factory with a warranty but I just saw this https://www.visiontekproducts.com/index.php/graphics-cards/visiontek-cryovenom-r9-295x2-detail#specifications


Yep, I think PowerColor is going to come out with one as well, if they don't already have it out. PowerColor & VisionTek typically come out with liquid-blocked cards about as fast as EVGA does on Nvidia's side of things, i.e. pretty quickly or even at launch in most cases.


----------



## vonalka

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yep, I think PowerColor is going to come out with one as well, if they don't already have it out. PowerColor & VisionTek typically come out with liquid-blocked cards about as fast as EVGA does on Nvidia's side of things, i.e. pretty quickly or even at launch in most cases.


You mean like this:







I installed it just recently


----------



## NavDigitalStorm

^

That is the reference AIO. I believe he means the custom loop editions with waterblocks by Swiftech or EK.


----------



## Pheozero

Huh, $36 give or take to get a card with a waterblock already installed.

Still can't afford it


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> I wasn't aware there was a 295X2 that came waterblocked from the factory with a warranty but I just saw this https://www.visiontekproducts.com/index.php/graphics-cards/visiontek-cryovenom-r9-295x2-detail#specifications


EK nickel plated block with a giant Visiontek logo on the backplate... $1750 with a 1 year warranty. They must be smoking crack.

You can remove the stock cooler from the MSI 295x2 and put a block on without voiding the 3 year warranty.


----------



## HoneyBadger84

Yeah, it'd be cheaper and smarter to get one that comes with a free SSD at $1499, buy the block, and install it yourself.

I wish they made air cooled versions of this card for cheaper. I'd buy it


----------



## Jeronbernal

they did, didn't they?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131479

**EDIT
my bad this one

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584

the free mouse makes it like 2 bucks cheaper if you count the awesome mouse bundle deal XD


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jeronbernal*
> 
> they did, didn't they?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131479
> 
> **EDIT
> my bad this one
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584
> 
> the free mouse makes it like 2 bucks cheaper if you count the awesome mouse bundle deal XD


Not the same. I almost got one of those though. It doesn't have the 6 Eyefinity ports. That one has more power capability etc., but it had issues with OCing software, at least in the launch reviews. Only about 250 of those were made.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jeronbernal*
> 
> they did, didn't they?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131479
> 
> **EDIT
> my bad this one
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584
> 
> the free mouse makes it like 2 bucks cheaper if you count the awesome mouse bundle deal XD


It's all fun and games until you take away my 4 mini DisplayPorts


----------



## HoneyBadger84

X_x got a counter offer on a 295x2 for $1150 + shipping. No box though. That makes me worry about damage in shipping. Refreshing. I'd have to quickly resell and get paid for the 3 cards I need to ditch then I could easily afford it. Gonna try to do that today...


----------



## electro2u

Quote:


> Originally Posted by *HoneyBadger84*
> 
> X_x got a counter offer on a 295x2 for $1150 + shipping. No box though. That makes me worry about damage in shipping. Refreshing. I'd have to quickly resell and get paid for the 3 cards I need to ditch then I could easily afford it. Gonna try to do that today...


Sounds almost too good to be true. Still a lot of money but if you wanted one and there's seriously nothing wrong with it then that's a steal.


----------



## HoneyBadger84

Quote:


> Originally Posted by *electro2u*
> 
> Sounds almost too good to be true. Still a lot of money but if you wanted one and there's seriously nothing wrong with it then that's a steal.


Well basically the dude said he A: doesn't game enough & B: doesn't game at high enough resolution to justify having the beast so he's downgrading. He has a picture of it working in his computer, so should be good... think I'm gonna pass & let him know if he's willing to wait a month or so I may buy it off him then. I gotta get all my ducks in a row & get these cards that're sitting in boxes not being used gone ASAP.


----------



## ViRuS2k

Q for you guys.

According to the information I've gathered, all R9 295X2s are the same reference hardware.

I currently have a 1018/1250 BIOS on my card and want to use a 1030/???? BIOS, and it must have working UEFI integrated into the BIOS.

Does anyone have one? Every little bit of extra speed helps.









If someone has the 1030 clocks and is sure it has UEFI integrated, can you post it so I can flash it to my EK watercooled card?








The reason I don't just overclock to or past those clocks is that it messes with PowerPlay and makes the card run at 3D voltages constantly.


----------



## CrisInuyasha

Quote:


> Originally Posted by *ViRuS2k*
> 
> Q for you guys.
> 
> According to the information I've gathered, all R9 295X2s are the same reference hardware.
> 
> I currently have a 1018/1250 BIOS on my card and want to use a 1030/???? BIOS, and it must have working UEFI integrated into the BIOS.
> 
> Does anyone have one? Every little bit of extra speed helps.
> 
> If someone has the 1030 clocks and is sure it has UEFI integrated, can you post it so I can flash it to my EK watercooled card?
> 
> The reason I don't just overclock to or past those clocks is that it messes with PowerPlay and makes the card run at 3D voltages constantly.


A few minutes ago they sent me an OC BIOS with UEFI enabled, but it has an issue: the GPU fan is locked at max speed. I'm trying to work out with them whether it's something on my end or the new BIOS. Also, the temp limit showing in AMD OverDrive is 0C instead of 75C, so I'm guessing that must be the problem (and I can't change it either).


----------



## xer0h0ur

Quote:


> Originally Posted by *ViRuS2k*
> 
> Q for you guys.
> 
> according to my information i gathered all R9295x2s are the same hardware / reference.


I vaguely remember reading that the Devil 13 has its own custom-designed PCB, but other than that I am pretty sure the designs are identical thus far. It kinda sucks if it's a different arrangement, because with its 4 8-pin connectors I expected it to be a better overclocker if it were waterblocked. The only problem is that if it's a unique PCB layout, you're up **** creek without a paddle trying to get a waterblock, since no company in their right mind would make a custom block for a card limited to 250 units, I believe, including all of the cards given to reviewers.

On a totally different note, my system's only weak link now is my monitor. I don't even have an HD monitor much less a 4K monitor so my next step is grabbing a 4K monitor to push this beast. I was recommended the Samsung U28D590D. Is this a solid monitor? I can't exactly afford to break the bank.


----------



## ViRuS2k

Well, if someone has the Devil 13 BIOS I would give it a try, though the reference 1030 BIOS would be great.


----------



## electro2u

Quote:


> Originally Posted by *ViRuS2k*
> 
> well if someone has the devil 13 bios i would give it a try though the ref 1030 bios would be great.


The reference Sapphire OC BIOS is here.
There are 2 BIOS files for each dip-switch position: a master and a slave. Instructions for flashing the BIOS are at the bottom of the page.

The Devil 13 BIOS is most likely not going to be compatible with the reference cards; the fan stuff is all different. All the reference-card BIOSes *should* be interchangeable, and the cards are easy to unbrick from onboard video.
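To make the master/slave dance concrete, here's a hedged sketch of the flashing sequence as a command plan. Each dip-switch position has a master ROM and a slave ROM, and each GPU on the card shows up as its own adapter to atiflash; the adapter indices and filenames below are assumptions, so check `atiflash -i` output on your own system (and the instructions on the linked page) before flashing anything:

```python
# Sketch only: builds the atiflash command lines for one dip-switch
# position. Adapter numbers 0/1 and the ROM filenames are assumptions.

def flash_plan(master_rom, slave_rom, master_adapter=0, slave_adapter=1):
    """Return backup + flash command lines for one dip-switch position."""
    return [
        # Always save the existing BIOSes first so the card can be restored.
        f"atiflash -s {master_adapter} backup_master.rom",
        f"atiflash -s {slave_adapter} backup_slave.rom",
        # Then program the new master and slave images.
        f"atiflash -p {master_adapter} {master_rom}",
        f"atiflash -p {slave_adapter} {slave_rom}",
    ]

for line in flash_plan("oc_master.rom", "oc_slave.rom"):
    print(line)
```

Flip the dip switch, reboot, and repeat the same plan with that position's ROM pair. Having the backups is what makes the "easy to unbrick from onboard video" part painless.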


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> On a totally different note, my system's only weak link now is my monitor. I don't even have an HD monitor much less a 4K monitor so my next step is grabbing a 4K monitor to push this beast. I was recommended the Samsung U28D590D. Is this a solid monitor? I can't exactly afford to break the bank.


If you play games 4k isn't recommended unless you need massive viewing space like for strategy games. For pretty much any other type of gaming to make use of your card with a single monitor you want a 120Hz 1440p Overclocker like a Qnix QX2710 or an Overlord Tempest OC. I use a Yamakasi Catleap 2B OC and it's pretty much the same as the Overlord.


----------



## xer0h0ur

The thing is that I am shooting for 4K not 1440P and I don't really care about it being a 60Hz monitor. The way I see it, if you manage to get 60 FPS in 4K you're doing good. I presume I could always just scale back to 2560 x 1440 on a 4K monitor anyways if I wanted to game at a lower resolution. Or is that assumption wrong? I am after all skipping a generation or two of displays so I don't know about how that works.


----------



## electro2u

You're right about the scaling thing. It's just that the monitor you are looking at is a TN, but you seem like you already know what you want. If you want a 4k monitor for cheap that's pretty much what there is. If you want the best possible 27" monitor you can have it for cheap.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> The thing is that I am shooting for 4K not 1440P and I don't really care about it being a 60Hz monitor. The way I see it, if you manage to get 60 FPS in 4K you're doing good. I presume I could always just scale back to 2560 x 1440 on a 4K monitor anyways if I wanted to game at a lower resolution. Or is that assumption wrong? I am after all skipping a generation or two of displays so I don't know about how that works.


Why the Samsung? Isn't the Asus PB287Q exactly the same monitor but with more features?


----------



## xer0h0ur

I am not set on that monitor, just intent on going 4K. I was just recommended that one since its a decent price for a 60Hz 4K screen using displayport. I have not looked at that Asus at all so thanks for that suggestion.


----------



## evoll88

I bought the Asus 4K TN last week, and overall it's not too bad (colors look good and the resolution looks great), but I am waiting on a 4K IPS monitor to replace it.


----------



## ViRuS2k

Quote:


> Originally Posted by *electro2u*
> 
> The reference Sapphire OC BIOS is here.
> There are 2 BIOS files for each dip-switch position: a master and a slave. Instructions for flashing the BIOS are at the bottom of the page.
> 
> The Devil 13 BIOS is most likely not going to be compatible with the reference cards; the fan stuff is all different. All the reference-card BIOSes *should* be interchangeable, and the cards are easy to unbrick from onboard video.


Thanks for the info. As for the fan stuff on the Devil 13 being incompatible, that doesn't bother me since I don't use the fan on my card anyway; it's fitted with an EK waterblock. It's the higher clock speeds on the Devil 13 I'm more interested in... if you find the BIOS, let me know, thanks.


----------



## electro2u

The Devil 13 is actually slower, and if you want to up your clocks just use AMD OverDrive, MSI Afterburner, or RadeonPro (I like RP a lot).


----------



## HoneyBadger84

Devil 13 is about the same in reality, as it has faster vRAM but a slower core; it also throttles less because it has a higher "throttle temp", from what reviews I've read.

Right now I'm putting my timeline at possibly getting one of these as at least 4 weeks out, once the dust settles on my resales & I get to a point where I have the money, along with half the current hole I'm in for some other bills paid off (shouldn't be too hard; it just sucks I gotta wait).







I'll be recouping about half the card cost on the back end by reselling 2 of the cards it'll be replacing, if I like it enough to keep, so there's that at least.

Does anyone have some temperature readouts from a high airflow/low ambient setup similar to mine so I can get an idea of what I'd be dealing with, preferably with the radiator setup in a push/pull config?


----------



## axiumone

I'm not really low ambient. Usually around 25-29c in the house. I'm hovering around 70-72c on both cards in push/pull at 2100rpm on the rad fans. Take a look at the sig rig for pics.

I think having control over the VRM fan would help a ton, but AMD, in their infinite wisdom, thinks users don't need to complicate themselves with such silly nonsense.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> I'm not really low ambient. Usually around 25-29c in the house. I'm hovering around 70-72c on both cards in push/pull at 2100rpm on the rad fans. Take a look at the sig rig for pics.
> 
> I think having control over the VRM fan would help a ton, but AMD, in their infinite wisdom, thinks users don't need to complicate themselves with such silly nonsense.


Did you find a solution to the jitter frame/jerks you had a video of a while back?


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> Did you find a solution to the jitter frame/jerks you had a video of a while back?


Nope, after extensive communications with AMD, a solution doesn't exist. The second 295x2 is just sitting in my case as very expensive decoration.

I'm supremely disappointed with their driver team. As much as I love the value usually represented by going with amd, I don't think I'll be getting any more of their products.


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> Nope, after extensive communications with AMD, a solution doesn't exist. The second 295x2 is just sitting in my case as very expensive decoration.
> 
> I'm supremely disappointed with their driver team. As much as I love the value usually represented by going with amd, I don't think I'll be getting any more of their products.


So eyefinity is still a disaster?


----------



## Komatose84

I got myself a 295x2 as well, plus the Samsung 4K monitor. I had a lot of trouble with drivers for a few days but finally managed to get it running well on 14.7. It performs well, but in some games (Final Fantasy XIV mainly) I get stuttering and FPS drops, and full-screen videos in Chrome don't really run smoothly either. Clocks hold at 1030/1250 and temps are around 66-68, so I'm not really sure; I'm just putting it down to DX9 in Final Fantasy or dodgy drivers, unless you guys can shed some light.

I'm running an i7 4770K (no OC), 8GB of RAM, and a 1300W EVGA SuperNOVA PSU.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> So eyefinity is still a disaster?


Not for me, but I'm running the other kinda QuadFire.


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> So eyefinity is still a disaster?


With two 295x2 in crossfire, yes, it's a disaster.


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> With two 295x2 in crossfire, yes, it's a disaster.


So there will never be a solution for Eyefinity? Weird...


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> So there will never be a solution for eyefinity? weird ...


Here's the thing. The plx chip on these cards is causing some communication issues between other 290/295 family cards. Which in turn causes this strange frame latency / tearing effect.

From my communications, they've been able to replicate the issue. In order to fix it, it requires a complete driver rewrite. I guess it's just not a priority for them, because I got quoted a time frame of "before the end of the year".

Now, I've done my due diligence here. I've tried probably close to 50 different ways to approach this problem, lots and lots of hours and different components. It boils down to their drivers are broken. My evidence of this is - I hacked a set of 13.12 whql drivers to support the 295x2 in crossfire and they worked! They worked well enough for testing to prove that something that was changed in the 14.x branch is the root of the problem. However, the 13.12 driver is over seven months old now and doesn't support the majority of newer titles very well. That, plus a number of small annoyances prevent me from using that driver every day.


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> Here's the thing. The plx chip on these cards is causing some communication issues between other 290/295 family cards. Which in turn causes this strange frame latency / tearing effect.
> 
> From my communications, they've been able to replicate the issue. In order to fix it, it requires a complete driver rewrite. I guess it's just not a priority for them, because I got quoted a time frame of "before the end of the year".


That really, really sucks. Instead of focusing on their low-end cards, they should focus on the current fastest card available.
I mean, they made the best card (which is indeed very good for marketing) and then got lazy and didn't make great drivers... they should be using this card to lure more customers to them.

But a "before the end of the year" time frame means there is still hope they will fix it.


----------



## HoneyBadger84

Quote:


> Originally Posted by *axiumone*
> 
> Here's the thing. The plx chip on these cards is causing some communication issues between other 290/295 family cards. Which in turn causes this strange frame latency / tearing effect.
> 
> From my communications, they've been able to replicate the issue. In order to fix it, it requires a complete driver rewrite. I guess it's just not a priority for them, because I got quoted a time frame of "before the end of the year".
> 
> Now, I've done my due diligence here. I've tried probably close to 50 different ways to approach this problem, lots and lots of hours and different components. It boils down to their drivers are broken. My evidence of this is - I hacked a set of 13.12 whql drivers to support the 295x2 in crossfire and they worked! They worked well enough for testing to prove that something that was changed in the 14.x branch is the root of the problem. However, the 13.12 driver is over seven months old now and doesn't support the majority of newer titles very well. That, plus a number of small annoyances prevent me from using that driver every day.


Betting it's something to do with Mantle, since it was released in the 14.1s and onwards. Could be some of the coding they put in to passively support Mantle knocked the PLX interfacing outta whack.

So you tested a 295x2 with 290s and it went wonky with any drivers except the pre-14s (fixed to work in the 13.12s), yeah? If so, that's a deterrent to this card for me, as that would be my plan: either Folding@Home on it while I game on the 290Xs, or gaming on it while I fold on the 290Xs, then QuadFire when I need it for eventual 4K or gaming in 3K (3x1080p) without as much heat actively being spewed out the back of the case.


----------



## axiumone

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Betting it's something to do with Mantle since it was released in the 14.1s and onwards. Could be some of the coding they put in them to passively support Mantle resulted in something that bumped the PLX interfacing outta whack.
> 
> So you tested a 295x2 with 290s and it went wonky with any drivers except the pre-14s (fixed to work in the 13.12s), yeah? If so, that's a deterrent to this card for me, as that would be my plan: either Folding@Home on it while I game on the 290Xs, or gaming on it while I fold on the 290Xs, then QuadFire when I need it for eventual 4K or gaming in 3K (3x1080p) without as much heat actively being spewed out the back of the case.


Yeah, I tried the 295x2+290x. It didn't have the exact same issue as two 295x2. The problem with the 295x2+290x was frame time latency. It didn't tear, but it stuttered like crazy. Although my FPS would show over 150, the games felt like they were running at 20 FPS. I've tried every single driver revision, as well as frame pacing on/off and it behaved the same with all of them.

Although, from all of the reviews I've read, 4k on single monitor works great with 295x2 in crossfire or with 295x2+290x.


----------



## HoneyBadger84

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, I tried the 295x2+290x. It didn't have the exact same issue as two 295x2. The problem with the 295x2+290x was frame time latency. It didn't tear, but it stuttered like crazy. Although my FPS would show over 150, the games felt like they were running at 20 FPS. I've tried every single driver revision, as well as frame pacing on/off and it behaved the same with all of them.
> 
> Although, from all of the reviews I've read, 4k on single monitor works great with 295x2 in crossfire or with 295x2+290x.


Oooooh so we're talking about Eyefinity only is the issue, or does this happen for you with single screen as well?


----------



## axiumone

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Oooooh so we're talking about Eyefinity only is the issue, or does this happen for you with single screen as well?


It is in fact an eyefinity only issue. Single screen works absolutely fine.


----------



## ViRuS2k

What we really need is an R9-compatible BIOS editor :/


----------



## axiumone

Quote:


> Originally Posted by *ViRuS2k*
> 
> what we really need is a R9 compatible bios editor :/


Amen! I can't believe we still don't have a bios editor for this family of cards!


----------



## xer0h0ur

HoneyBadger84 mentioned before that if GPU-Z showed the Subvendor as ATI that it was not a good thing. Can someone expand on that because I just saw that my Diamond 295X2 says "ATI (1002)". Should I be worried? How would I be able to get a proper Diamond BIOS if so?


----------



## electro2u

Quote:


> Originally Posted by *Komatose84*
> 
> I got myself a 295x2 as well, plus the Samsung 4K monitor. I had a lot of trouble with drivers for a few days but finally managed to get it running well on 14.7. It performs well, but in some games (Final Fantasy XIV mainly) I get stuttering and FPS drops, and full-screen videos in Chrome don't really run smoothly either. Clocks hold at 1030/1250 and temps are around 66-68, so I'm not really sure; I'm just putting it down to DX9 in Final Fantasy or dodgy drivers, unless you guys can shed some light.
> 
> I'm running an i7 4770K (no OC), 8GB of RAM, and a 1300W EVGA SuperNOVA PSU.


Hello. I've been looking for someone with a 295x2 that plays FFXIV to help me troubleshoot it. I find it very interesting you mention Chrome.
I'm guessing you are on windows 8.1? And I'm guessing you have chrome as your default browser?

Read this: http://hardforum.com/showthread.php?t=1826754&highlight=reward&page=4 especially the last page.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> HoneyBadger84 mentioned before that if GPU-Z showed the Subvendor as ATI that it was not a good thing. Can someone expand on that because I just saw that my Diamond 295X2 says "ATI (1002)". Should I be worried? How would I be able to get a proper Diamond BIOS if so?


No no, it's not that it's a bad thing. It's only a bad thing if you see that on R9 290/290Xs that should be showing "Sapphire/PCPartners" or "Asus" etc... Most SubVendor readouts on R9 295X2s are going to be straight ATI, as they most likely all come from the same manufacturer and are just "labeled" by whichever brand sells them to the consumer. It's perfectly fine in this case. It's when you get one that should be branded in the 290/290X category and isn't that it's scary/a problem, as it indicates BIOS switching may have been done by a miner, in the case of used cards. It shouldn't ever be a problem for cards bought new, especially.

I don't believe Diamond brands their BIOSes at all, so it should read out as ATI (1002).


----------



## xer0h0ur

Okay thanks for clearing up the confusion.

I was gaming last night at 1600x1200 with every setting maxed out on Hitman Absolution and I noted that the cores reached around 70C which to me is too close to the 75C limit. I know eventually pushing 4K resolution will only make it worse so I am probably jumping ship sooner than expected into slapping a water block on it. I was trying to see if there were any reviews comparing the four blocks available but didn't see any. Anyone have a recommendation?


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Okay thanks for clearing up the confusion.
> 
> I was gaming last night at 1600x1200 with every setting maxed out on Hitman Absolution and I noted that the cores reached around 70C which to me is too close to the 75C limit. I know eventually pushing 4K resolution will only make it worse so I am probably jumping ship sooner than expected into slapping a water block on it. I was trying to see if there was any reviews comparing the four blocks available but didn't see any. Anyone have a recommendation?


I just got the XSPC Razor ones. Haven't got them installed just yet because I get my PSU on Monday and I want to test the cards first before opening them up. I think EK and XSPC are really close in performance; it's just a matter of taste I guess.


----------



## xer0h0ur

The meatball in me looks at the nickel plated Aquacomputer block with the acrylic window and drools but I also love the full coverage of the EK block and backplate. I don't understand why someone hasn't attempted to fully cool the card by making a water cooled backplate since by all accounts it gets extremely hot under 4K gaming loads. The only reason I don't pull the trigger on the Aquacomputer WB is because I don't see a backplate for it and I don't feel like mismatching it with an EK backplate.


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, I tried the 295x2+290x. It didn't have the exact same issue as two 295x2. The problem with the 295x2+290x was frame time latency. It didn't tear, but it stuttered like crazy. Although my FPS would show over 150, the games felt like they were running at 20 FPS. I've tried every single driver revision, as well as frame pacing on/off and it behaved the same with all of them.
> 
> Although, from all of the reviews I've read, 4k on single monitor works great with 295x2 in crossfire or with 295x2+290x.


I am able to reproduce almost the exact issue you're having, though I currently have 290X cards. I noticed mine more when I moved from a PCIe 3.0 to a 2.0 platform. I am about to build up a Xeon X5650 on X58 with 4 290s and see if I can recreate it there also; then I will go back to my X79 and check it again on PCIe 3.0. I don't have a YouTube account or anything, but I will make one to reproduce the same issues if I must. The issue becomes noticeable when I go from 2 to 3 cards and gets really bad @ 4 cards. I am surprised this isn't affecting others as much; there must not be many people running tri-fire R9 series and Eyefinity. I thought I had it fixed for a while but I found it's back. My only workarounds are not entirely 100%: I've lowered my refresh rate in Eyefinity to 100, which reduced it dramatically.

The differences in our issues are:

Different R9 series crossfire configurations.
Mine affects only 2 screens of 3 vs. all 5 of your screens.
Mine is tied to refresh rate and resolution, and the problem increases with more GPUs.

Not sure if that covers it, but I too am distressed by this issue.


----------



## axiumone

Quote:


> Originally Posted by *DividebyZERO*
> 
> I am able to reproduce almost the exact issue your having. i have currently 290x cards though. I noticed mine more when i moved from a pcie 3.0 to 2.0 platform. I am about to build up a Xeon x5650 on X58 with 4 290's and see if i can recreate it there also. then i will go back to my X79 and check it again on pcie 3.0. I dont have a youtube account or anything but i will make one to reproduce the same issues if i must. The issue becomes noticeable when i go from 2 to 3 card and really bad @ 4 cards. I am surprised this isn't affecting others as much. There must not be many people running tri-fire R9 series and eyefinity. I thought i had it fixed for a while but i found its back. My only ways to work around it are not entirely 100%. I've lowered my refresh rate in eyefinity to 100, its reduced it dramatically.
> 
> The differences on our issues are:
> 
> Different R9 series crossfire configurations.
> Mine affects only 2 of my 3 screens vs. all 5 of yours having it.
> Mine is tied to refresh rate and resolution, and gets worse with more GPUs.
> 
> Not sure if that covers it, but I too am distressed by this issue.


I can tell you why lowering the refresh rate to 100 Hz helps so much: I've discovered that if you can keep the game's frame rate ABOVE your refresh rate, the issue doesn't exist. I've tested this at 60 Hz, 100 Hz, and 120 Hz, and it's the same for all refresh rates. I actually discovered this when I had a quadfire 290 system. The thing is, I have these 120 Hz displays for a reason; I'm not interested in gaming at 60 Hz, even though a frame rate of 60+ is easier to attain.


----------



## Satchmo0016

Quote:


> Originally Posted by *DividebyZERO*
> 
> I am able to reproduce almost the exact issue you're having, though I currently have 290X cards. I noticed mine more when I moved from a PCIe 3.0 to a PCIe 2.0 platform. I am about to build up a Xeon X5650 on X58 with four 290s and see if I can recreate it there too, then I will go back to my X79 and check it again on PCIe 3.0. I don't have a YouTube account or anything, but I will make one to reproduce the same issues if I must. The issue becomes noticeable when I go from two to three cards and gets really bad at four cards. I am surprised this isn't affecting others as much; there must not be many people running tri-fire R9 series and Eyefinity. I thought I had it fixed for a while, but I found it's back. My workarounds aren't entirely 100%: lowering my refresh rate in Eyefinity to 100 Hz reduced it dramatically.
> 
> The differences on our issues are:
> 
> Different R9 series crossfire configurations.
> Mine affects only 2 of my 3 screens vs. all 5 of yours having it.
> Mine is tied to refresh rate and resolution, and gets worse with more GPUs.
> 
> Not sure if that covers it, but I too am distressed by this issue.


Just adding: I notice the stutter you're talking about in a few games with Eyefinity + trifire, particularly in PlanetSide 2. I haven't had the chance to play a lot of games on it, but I'm also using 3x 27" 1440p screens, so it's kind of hard to get games above 60 fps at max settings at 7680x1440.


----------



## xer0h0ur

I pulled the trigger on the EK WB in the nickel finish and the black backplate. Can't wait to get it in hand.


----------



## electro2u

There's a thread over at [H] saying the Aquacomputer nickel blocks may have problems, but it might be EK trying to make them look bad. I went copper/black.


----------



## derickwm

How many of you guys are custom water cooling your 295X2s?


----------



## fireedo

Quote:


> Originally Posted by *derickwm*
> 
> How many of you guys are custom water cooling your 295X2s?


me








Using the block from EK, and the performance is amazing since I put the fans in intake positions (pulling air from outside the case in); the highest temperature in a non-AC room with 30-33C ambient is about 56-57C.








really worth it

(Now I need to watercool my motherboard







)


----------



## electro2u

Quote:


> Originally Posted by *derickwm*
> 
> How many of you guys are custom water cooling your 295X2s?


I am having to, because it throttles just a wee bit in long-term 1440p stress testing without vsync on.

I'm using EK backplates on my cards because Aquacomputer has not yet made a 295X2 backplate, but I'm using Aquacomputer blocks because they are kind of "artsy".

Fwiw, my comment above was tongue-in-cheek. I'm new to watercooling, and I've never seen an industry with more trust issues than the car business.


----------



## Komatose84

Quote:


> Originally Posted by *electro2u*
> 
> Hello. I've been looking for someone with a 295x2 that plays FFXIV to help me troubleshoot it. I find it very interesting you mention Chrome.
> I'm guessing you are on windows 8.1? And I'm guessing you have chrome as your default browser?
> 
> Read this: http://hardforum.com/showthread.php?t=1826754&highlight=reward&page=4 especially the last page.


Yeah, I've read through that thread. I actually switched to Chrome from Firefox because Firefox was causing my PC to bluescreen when fullscreening videos; however, I don't have any extensions on Chrome, and I didn't notice any performance change after the switch. The only troubles I'm having with the game are random fps drops and lag. I tried switching the game to 1440p, which did make it run a bit smoother, but then it wouldn't let me switch back to 4K.

I'm sure a card like this should have no problem running FFXIV at 4K. I've tried 14.4, 14.6 and 14.7 and don't see much difference between them, apart from 14.7 making the card run a few degrees hotter. I'm really stumped with this one.









Edit: Forgot to say I'm on Windows 7 64-bit.


----------



## electro2u

Quote:


> Originally Posted by *Komatose84*
> 
> I tried switching the game to 1440p, which did make it run a bit smoother, but then it wouldn't let me switch back to 4K.
> 
> Edit: Forgot to say I'm on Windows 7 64-bit.


Game's seriously possessed, isn't it? No one has the exact same issue I do except a guy with 4 TITANs! /facepalm.

It's something about Windows on Windows (WoW64), DX9, and AMD. I haven't solved it, I probably won't ever solve it, and I quit paying SE months ago. Currently looking for a new game to play, but enjoying iRacing.com plenty in the meantime.


----------



## Komatose84

Quote:


> Originally Posted by *electro2u*
> 
> Game's seriously possessed, isn't it? No one has the exact same issue I do except a guy with 4 TITANs! /facepalm.
> 
> It's something about Windows on Windows (WoW64), DX9, and AMD. I haven't solved it, I probably won't ever solve it, and I quit paying SE months ago. Currently looking for a new game to play, but enjoying iRacing.com plenty in the meantime.


Yeah, it's a pain. I'm just glad I can say it's the game and not a hardware problem. I knew what I was getting into when I got a 295 and then a 4K monitor, but I didn't know it would be this bad... do you suggest switching to 8.1, or sticking with 7? It seems both have their own little problems.


----------



## electro2u

Quote:


> Originally Posted by *Komatose84*
> 
> Yeah, it's a pain. I'm just glad I can say it's the game and not a hardware problem. I knew what I was getting into when I got a 295 and then a 4K monitor, but I didn't know it would be this bad... do you suggest switching to 8.1, or sticking with 7? It seems both have their own little problems.


As of right now, I can't get the card to work in crossfire on 8.1 with FFXIV. It works once, and then you have to rename the game's folder after exiting or it will only use one GPU. Strangest behavior I've ever seen from a game.


----------



## xer0h0ur

Quote:


> Originally Posted by *derickwm*
> 
> How many of you guys are custom water cooling your 295X2s?


No less than a handful of the listed owners are going the custom WB route from following this thread. Myself included.


----------



## Synthaxx

Quote:


> Originally Posted by *derickwm*
> 
> How many of you guys are custom water cooling your 295X2s?


Most likely fewer than 10 in this thread. But I bought a waterblock








I really want to know the temps when using a waterblock!


----------



## Skinnered

These 295X2s are throttling like crazy now with the warmer weather here; the stock radiator simply isn't enough for these things.
I've got two R9 295X2s, and the heat these things put out is huge (that's a good thing, as it means it's being taken away from the GPU cores).







Also, my 1600W LEPA isn't enough to feed these things (I may have to review my lines, though, in case anything is shared). I'll probably go to two PSUs.


----------



## HoneyBadger84

1600W should be plenty; it's probably a rail-mix issue. You need to make sure each plug is on a different rail: one plug on rail 3, one on rail 4, one on rail 5, and one on rail 6 for the cards. That way each card has 60A available (30A per plug), while they're SUPPOSED to need only 25A per plug (50A total) to run normal clocks without any issues.

I hate that the LEPA unit doesn't have the plugs labeled with which rail they're on... love that Antec did that on their newer HCP units; it's very nice to easily know which rail you're using.
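A minimal sketch of that rail budget, using the figures from the post (roughly 25 A worst-case per 8-pin plug, 30 A per 12 V rail on the LEPA; rail numbering is just illustrative):

```python
# Rail-budget check: no 12 V rail should be asked for more amps than it supplies.
RAIL_CAPACITY_A = 30   # amps each 12 V rail can deliver (per the post)
PLUG_DRAW_A = 25       # worst-case draw per 8-pin PCIe plug (per the post)

def rails_ok(plugs_per_rail):
    """plugs_per_rail maps a rail number to the count of 8-pin plugs on it."""
    return all(n * PLUG_DRAW_A <= RAIL_CAPACITY_A
               for n in plugs_per_rail.values())

# One plug per rail (rails 3-6): each rail keeps 5 A of headroom.
print(rails_ok({3: 1, 4: 1, 5: 1, 6: 1}))   # True
# Both plugs of each card on a single rail: 50 A asked of a 30 A rail.
print(rails_ok({3: 2, 4: 2}))               # False
```

The spread-one-plug-per-rail recommendation falls straight out of the arithmetic: two plugs on one 30 A rail can demand 50 A, while one plug per rail never exceeds 25 A.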


----------



## Skinnered

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 1600W should be plenty; it's probably a rail-mix issue. You need to make sure each plug is on a different rail: one plug on rail 3, one on rail 4, one on rail 5, and one on rail 6 for the cards. That way each card has 60A available (30A per plug), while they're SUPPOSED to need only 25A per plug (50A total) to run normal clocks without any issues.
> 
> I hate that the LEPA unit doesn't have the plugs labeled with which rail they're on... love that Antec did that on their newer HCP units; it's very nice to easily know which rail you're using.


I hope so, I have to check this first.

I came across this


----------



## HoneyBadger84

So basically you want 2 where it's showing you, then the other 2 in the middle top and right top, via that diagram:



That gives you one connector in each of the 30A rails.


----------



## Skinnered

I'm gonna try that, thanks!


----------



## xer0h0ur

I am going to be routing all of the heat from the EK WB to an Alphacool NexXxoS Monsta 120mm radiator with a Noctua NF-F12 PWM fan @ 1500 RPM. If that turns out to be insufficient then I will just use the original fan from the Asetek radiator.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am going to be routing all of the heat from the EK WB to an Alphacool NexXxoS Monsta 120mm radiator with a Noctua NF-F12 PWM fan @ 1500 RPM. If that turns out to be insufficient then I will just use the original fan from the Asetek radiator.


The Noctua is a really good fan. But will you route just the 295X2 to the Monsta, or the whole system?


----------



## xer0h0ur

It's not going to be a dual loop. I am just extending the loop, so it's the CPU and GPU WBs with two radiators in the loop; there will be a radiator between each block and the pump.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> It's not going to be a dual loop. I am just extending the loop, so it's the CPU and GPU WBs with two radiators in the loop; there will be a radiator between each block and the pump.


So you would be using a 120mm Monsta (cooling capacity of roughly 350 watts) plus the 120mm aluminum radiator (mixing metals = corrosion) that comes with the Thermaltake Bigwater 760 Pro?
Personally, I don't think that would be enough to efficiently watercool the 295X2 + CPU. I think you will still run into temp problems.


----------



## xer0h0ur

Really? I am pretty damn restricted space-wise in this case; I don't exactly have space to put more radiators anywhere else. I figured going with a significantly thicker radiator would manage the heat of the 295X2 much better. The radiator Thermaltake uses is 144.3mm x 120mm x 33mm, and it's a P500 pump with a 500 L/h flow rate. Based on the specs Thermaltake shows, it's an aluminum radiator and a plated copper block. I am just not sure whether the radiator has a copper core or not.

I have been using EK clear coolant, which I thought was made for mixed-metal loops anyway. Oh yeah, also a silver kill coil. I am certainly no water cooling pro; been learning as I go.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Really? I am pretty damn restricted space-wise in this case; I don't exactly have space to put more radiators anywhere else. I figured going with a significantly thicker radiator would manage the heat of the 295X2 much better. The radiator Thermaltake uses is 144.3mm x 120mm x 33mm, and it's a P500 pump with a 500 L/h flow rate. Based on the specs Thermaltake shows, it's an aluminum radiator and a plated copper block. I am just not sure whether the radiator has a copper core or not.


The pump itself is alright, nothing fancy but good enough. Don't forget you have a 4930K, which has a whopping 130-watt TDP; you overclocked it slightly, so make it 150. That totals about 650 watts of heat you have to cool. I did a quick calculation, and with both the Monsta and the Thermaltake you would be able to cool about 480 watts optimally. Of course, it wouldn't be a problem if you go this route; your temps will just rise.

You will probably not like to hear what I would do in your situation...
Since your case is 'the bottleneck' here, I would sell the case and buy one that supports more radiators. If you go all out on the case, get the 900D: hands down the best watercooling case (to my knowledge).








There are of course more economical solutions, but I can't think of another case at the moment.
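The heat-budget arithmetic above can be laid out explicitly. Every figure here is a rough forum estimate, not a measurement; the 500 W GPU number is simply the quoted 650 W total minus the 150 W CPU, and the thin radiator's share is the quoted 480 W combined minus the Monsta's 350 W:

```python
# Rough heat-budget check for the proposed CPU + 295X2 loop.
CPU_HEAT_W = 150          # 4930K: 130 W TDP plus a mild overclock (estimate)
GPU_HEAT_W = 500          # implied by the quoted ~650 W total load
HEAT_LOAD_W = CPU_HEAT_W + GPU_HEAT_W

MONSTA_120_W = 350        # thick 120 mm Monsta, per the post
THERMALTAKE_120_W = 130   # thin stock 120 mm rad (480 W combined - 350 W)
CAPACITY_W = MONSTA_120_W + THERMALTAKE_120_W

deficit_w = HEAT_LOAD_W - CAPACITY_W
print(f"{HEAT_LOAD_W} W of heat vs ~{CAPACITY_W} W of optimal radiator "
      f"capacity: {deficit_w} W over budget, so temps climb")
```

A loop over its optimal capacity still works; coolant and component temperatures just settle higher, which matches the "your temps will just rise" conclusion.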


----------



## xer0h0ur

Well, that is a buzzkill. I was hoping to at least rein in the GPU temps into the 60s at full load. Current performance (CPU cooling system) under 100% load for an hour using AIDA64's stress test gives me temps in the 75-77C range; under "normal" gaming loads it's more like mid-to-high 50s / low 60s. So I should expect those temps to go up?


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, that is a buzzkill. I was hoping to at least rein in the GPU temps into the 60s at full load. Current performance (CPU cooling system) under 100% load for an hour using AIDA64's stress test gives me temps in the 75-77C range; under "normal" gaming loads it's more like mid-to-high 50s / low 60s. So I should expect those temps to go up?


Well, you might be able to hit the 60s, but then you have to ask yourself whether it was worth it to watercool.
Maybe you can use external watercooling, like fitting a radiator outside the case? Maybe someone else has a better idea.

The general rule of thumb is a minimum of 120mm of radiator per component, and I think we can all agree the 295X2 counts as two.







So 2x 120mm for the GPU and 1x 120mm for the CPU makes a 360mm minimum.
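The rule of thumb above is simple enough to write down; treating the 295X2 as two components, per the post:

```python
# "Minimum 120 mm of radiator per component" rule of thumb.
RAD_PER_COMPONENT_MM = 120

def min_radiator_mm(component_weights):
    """component_weights: how many 'components' each block counts as
    (a dual-GPU card like the 295X2 counts as 2)."""
    return sum(component_weights) * RAD_PER_COMPONENT_MM

# R9 295X2 counts as 2, CPU as 1 -> at least 360 mm of radiator.
print(min_radiator_mm([2, 1]))   # 360
```

This is only a sizing heuristic; actual dissipation also depends on radiator thickness, fin density, fan speed, and ambient temperature.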


----------



## ViRuS2k

I have the EK waterblock and backplate on my R9 295X2, and it's not very efficient at cooling my card when overclocked.

At 1120/1690 @ 1.3V, it's 69C on the first GPU, and 74-75C makes the 2nd GPU throttle.

I'm talking about cores here, guys, as I have one card: core 1 / aka GPU 1, etc.

The strange thing is I was expecting massive temp drops with water and they're just not there... even at default speeds (1030/1300) the 2nd GPU throttles. It seems the 2nd GPU runs 4-5C hotter, causing it to hit the 75C thermal limit.

I was hoping this would not happen with watercooling, and I am using 2x thick 360mm rads in my case with an EK DC 4.0 pump/res combo...

The only thing I can think of is that the block is uneven, as in one core is getting cooled better than the other. I was thinking of pulling it all apart and using some Liquid Pro on the dies, but it ends up being risky. It's just like my 4790K: I delidded it and used Liquid Pro, and it caused massive temp drops









I might just varnish the ICs as well, like I did on my 4790K, so that if the Liquid Pro does touch them it will be isolated by the varnish.


----------



## Synthaxx

Quote:


> Originally Posted by *ViRuS2k*
> 
> I have the EK waterblock and backplate on my R9 295X2, and it's not very efficient at cooling my card when overclocked.
> 
> At 1120/1690 @ 1.3V, it's 69C on the first GPU, and 74-75C makes the 2nd GPU throttle.
> 
> I'm talking about cores here, guys, as I have one card: core 1 / aka GPU 1, etc.
> 
> The strange thing is I was expecting massive temp drops with water and they're just not there... even at default speeds (1030/1300) the 2nd GPU throttles. It seems the 2nd GPU runs 4-5C hotter, causing it to hit the 75C thermal limit.


Have you tried reinstalling the waterblock? What rads do you have?


----------



## xer0h0ur

Holy crap. If you can't properly cool it, then I am up a creek without a paddle. My only remaining choice would be modding the side panel to accommodate a third rad mounted directly on the panel, exhausting outwards. I was already planning on cutting into the panel for a small window, but it's no biggie to instead go the third-radiator route with some quick disconnects.


----------



## xer0h0ur

I am a fan of using Prolimatech PK-3 since it consistently ranks among the best non-conductive TIMs.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am a fan of using Prolimatech PK-3 since it consistently ranks among the best non-conductive TIMs.


I also ordered PK-3 for my build


----------



## xer0h0ur

Quote:


> Originally Posted by *Synthaxx*
> 
> I also ordered PK-3 for my build


----------



## ViRuS2k

Quote:


> Originally Posted by *Synthaxx*
> 
> Have you tried reinstalling the waterblock? What rads do you have?


Next on my list is to remove the block and install it again, though I'm waiting until I know what thermal paste to use.

I currently have MX-4 on the block, but I might buy some Gelid Extreme; it's meant to be the best non-conductive stuff.


----------



## gumby0406

Hello, I am starting my first build in a long time. I have ordered two 295X2s, and I was wondering if anyone could tell me whether the PSU I ordered will work. I ordered the Rosewill Hercules 1600. It came as a bundle on Newegg, and then I decided to do quadfire. The single 12V rail has 110A, but AMD's site says 30A per 8-pin connector, or 50A combined. So do I need 120A or 100A for both?


----------



## gumby0406

Also, I saw a video on YouTube where they had a quadfire setup with the Hercules and said it runs them fairly easily. So I'm not sure if I need something like a LEPA G1600 or if the Hercules will be OK.


----------



## Synthaxx

Quote:


> Originally Posted by *gumby0406*
> 
> Hello, I am starting my first build in a long time. I have ordered two 295X2s, and I was wondering if anyone could tell me whether the PSU I ordered will work. I ordered the Rosewill Hercules 1600. It came as a bundle on Newegg, and then I decided to do quadfire. The single 12V rail has 110A, but AMD's site says 30A per 8-pin connector, or 50A combined. So do I need 120A or 100A for both?


Quote:


> Originally Posted by *gumby0406*
> 
> Also, I saw a video on YouTube where they had a quadfire setup with the Hercules and said it runs them fairly easily. So I'm not sure if I need something like a LEPA G1600 or if the Hercules will be OK.


The Hercules should normally be enough. I think you'll need 100A for the cards alone.








According to a review, it has 130A total.
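The back-of-envelope 12 V budget behind that answer, using AMD's 50 A combined figure per card and the review's 130 A total for the Hercules 1600:

```python
# 12 V amperage budget for two R9 295X2s on one PSU.
CARD_DRAW_A = 50     # AMD's spec: 50 A combined per card on the 12 V side
N_CARDS = 2
PSU_12V_A = 130      # Hercules 1600 total 12 V rating, per the review

cards_a = CARD_DRAW_A * N_CARDS      # 100 A for the two cards alone
headroom_a = PSU_12V_A - cards_a     # amps left for CPU, board, drives, fans
print(f"GPUs need {cards_a} A; {headroom_a} A (~{headroom_a * 12} W) "
      f"of 12 V headroom remains")
```

So the answer to "120A or 100A?" is 100 A for the cards, leaving roughly 30 A (about 360 W) for the rest of the system on this unit.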


----------



## gumby0406

Thank you. It has two 12V rails: one is 110A, the other is 50A, but it says the 50A one is ESP. Not sure what that means.
Quote:


> Originally Posted by *Synthaxx*
> 
> The Hercules should normally be enough. I think you'll need 100A for the cards alone.
> 
> 
> 
> 
> 
> 
> 
> 
> According to a review, it has 130A total.


----------



## gumby0406

EPS, sorry


----------



## Synthaxx

Well, Newegg has a video where they test two of these cards with the Hercules 1600. It would be strange if it didn't work for you.


----------



## gumby0406

That was my thinking. Just trying to double and triple check. Thank you very much


----------



## HoneyBadger84

This Rosewill Hercules PSU has two rails, so basically what you want to do is make the rails share the load of the cards. Ideally, you want BOTH cards split between the two rails so they can draw as many amps as they need: card #1 and card #2 each plugged into one rail for one plug and the other rail for the other plug. That should prevent any power issues the R9 295X2s might have in quadfire.


----------



## xer0h0ur

The mildest of overclocks on this card managed to throttle it. Same scenario as before: 1600x1200, maxed-out ultra settings in Hitman: Absolution. Within minutes the cores were in the 70s. It was the 2nd core that throttled, and it's also the core that got the hottest. This was at 1030/1600 clocks, so I went back to 1018/1500.


----------



## xer0h0ur

Quote:


> Originally Posted by *ViRuS2k*
> 
> I have the EK waterblock and backplate on my R9 295X2, and it's not very efficient at cooling my card when overclocked.
> 
> At 1120/1690 @ 1.3V, it's 69C on the first GPU, and 74-75C makes the 2nd GPU throttle.
> 
> I'm talking about cores here, guys, as I have one card: core 1 / aka GPU 1, etc.
> 
> The strange thing is I was expecting massive temp drops with water and they're just not there... even at default speeds (1030/1300) the 2nd GPU throttles. It seems the 2nd GPU runs 4-5C hotter, causing it to hit the 75C thermal limit.
> 
> I was hoping this would not happen with watercooling, and I am using 2x thick 360mm rads in my case with an EK DC 4.0 pump/res combo...
> 
> The only thing I can think of is that the block is uneven, as in one core is getting cooled better than the other. I was thinking of pulling it all apart and using some Liquid Pro on the dies, but it ends up being risky. It's just like my 4790K: I delidded it and used Liquid Pro, and it caused massive temp drops
> 
> I might just varnish the ICs as well, like I did on my 4790K, so that if the Liquid Pro does touch them it will be isolated by the varnish.


It just occurred to me, since you mentioned you have thick radiators: are you sure your radiator fans create enough static pressure to be effective on your rads? Also, do you use gaskets and/or spacers between the fan and the radiator? Spacers can help eliminate the motor dead spot in the center of the fan, and gaskets can create a better seal, forcing airflow through the radiator. Some gaskets can do both for you.


----------



## HoneyBadger84

So uh... either I'm seeing things or...

You can get an R9 295X2... and the very PSU I want for my system... in a combo that gives you $215 off... while STILL getting the free 500GB SSD and the 3-free-games certificate... holy crap. If I had $1600 right now, the amount of spam I'd be putting on my mouse button to press Order would be epic.


----------



## gumby0406

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So uh... either I'm seeing things or...
> 
> You can get a R9 295x2... and the very PSU I want for my system... in a combo that gives you $215 off... while STILL getting the free 500GB SSD and the 3 free games certificate... holy crap.  if I had $1600 right now the amount of spam I'd be putting on my mouse button to press order would be epic.


When I did mine it was $1585, the SSD was a 120GB, and the combo worked with either the Hercules or a LEPA G1600. But I had read some bad customer reviews on the LEPA, so I went with the Hercules.


----------



## HoneyBadger84

Quote:


> Originally Posted by *gumby0406*
> 
> When I did mine it was 1585 and the SSD was a 120gb and the combo would work with either the Hercules or a lepa g1600. But I had read some bad customer reviews on the lepa so I went with the Hercules.


I'm very irritated I don't have the money available. I'd do it in a heartbeat, both for the SSD and for the PSU, let alone the card itself. That PSU is the exact unit I need to run my current quadfire configuration at stock on the GPUs with one PSU. The SSD would let me reinstall Windows and put ALL of my stuff on one drive as far as games and vital programs go... and of course I want to play with that card very badly









I'll have to hope it's still available in this combo in about 2 weeks, when I can get together the rest of the money I'd need... right now, after my upcoming truck payment and other bills, I'm gonna be about $400-$500 short, but my entire next paycheck is free. X_X Such time. Much grrr.


----------



## xer0h0ur

Now that is what you call an offer. I am still not sure whether I want to keep or sell the 500GB SSD. Too bad I didn't know about that offer when I bought my PSU off Amazon.


----------



## gumby0406

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm very irritated I don't have the money available. I'd do it in a heartbeat, both for the SSD & for the PSU, let alone the card itself. That PSU is the exact unit I need to run my current QuadFire configuration at stock on the GPUs, with one PSU. The SSD would allow me to reinstall Windows & put ALL of my stuff on one drive as far as games & vital programs go... and of course I want to play with that card very badly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll have to hope it's still available in this combo in about 2 weeks when I can get the rest of the money together I'd need to get it... right now after my upcoming truck payment & other bills, I'm gonna be about $400-$500 short, but my entire next paycheck is free. X_X Such time. Much grrr.


I know how that is. The only reason I was able to order this system is that my wife and I had to cancel the two-week vacation we had been planning and saving for all year. The circumstances meant we couldn't reschedule at all. So we split the money: she got some things she wanted and agreed to let me use the majority of the money to build my dream system.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> Now that is what you call an offer. I am still not sure if I want to keep or sell the 500GB SSD. Too bad I didn't know about that offer when I bought my PSU off amazon.


It's a great SSD. I really hope they keep it in stock and set up in that exact combo til I can afford to purchase. If I like the 295X2, I'll keep it and resell two 290Xs; if I don't like it, I'll resell the 295X2 and keep the PSU and SSD for the grand total of $84 I paid for them







lol Plus the free games, I REALLY want Thief X_X and Hitman: Absolution... and the third game I'd haveta think about. lol


----------



## crazygamer123

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So uh... either I'm seeing things or...
> 
> You can get a R9 295x2... and the very PSU I want for my system... in a combo that gives you $215 off... while STILL getting the free 500GB SSD and the 3 free games certificate... holy crap.  if I had $1600 right now the amount of spam I'd be putting on my mouse button to press order would be epic.


Thanks for this. But it's $1500, not $1600, for the whole set. Plus you get a mail-in rebate card after you purchase the combo. I ordered mine; don't know about the Diamond brand, but this is such a good deal


----------



## HoneyBadger84

Quote:


> Originally Posted by *crazygamer123*
> 
> Thanks for this. But it's $1500, not $1600, for the whole set. Plus you get a mail-in rebate card after you purchase the combo. I ordered mine; don't know about the Diamond brand, but this is such a good deal


It's $1595 shipped for the combo with the PSU I'd want with it (Antec HCP-1300W) and the 3-day shipping.







If it's still up in any way shape or form in 2 weeks, I'm 100% buying it unless something ridiculous happens that costs me over $300 to fix.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814103131 ~ go there, scroll down, view more combos, the HCP-1300W is the 5th PSU listed, at least for me, and it keeps the combo with the SSD & game certificate intact... gaaaaawd I wants.

Edit: I swear Antec & AMD should pay me for the amount of free advertising I give the R9 295x2 & that PSU cuz of how bad I want the PSU for overall regular use & the GPU to try it out


----------



## xer0h0ur

Quote:


> Originally Posted by *gumby0406*
> 
> I know how that is. The only reason I was able to order this system is my wife and I had to cancel our 2 week vacation that we had been planning and saving for all year. The circumstances caused us to not be able to reschedule at all. So we split the money and she got some things she wanted. And agreed to let me use the majority of the money to build my dream system.


Keep her.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> It's a great SSD. I really hope they keep it in stock & setup in that exact combo til I can afford to purchase. If I like the 295x2 I'll keep it & resell 2 290Xs, if I don't like it, I'll resell the 295x2 & keep the PSU & SSD for the grand total of $84 I paid for it
> 
> 
> 
> 
> 
> 
> 
> lol Plus the free games, I REALLY want Thief X_X and Hitman: Absolution... and the third game I'd haveta think about. lol


I took Thief and Tomb Raider with the Silver gift from the 270, and with the Gold gift I took Hitman: Absolution, Sleeping Dogs, and Alan Wake. Thief is a good game; I was disappointed I managed to finish it fairly quickly, but it's fun.


----------



## gumby0406

Quote:


> Originally Posted by *xer0h0ur*
> 
> Keep her.
> I had taken Thief and Tombraider with the silver gift from the 270 and with the gold gift I took Hitman Absolution, Sleeping Dogs and Alan Wake. Thief is a good game. I was disappointed I managed to finish it fairly quickly but its fun.


Lol I plan on it. I just received my code for the games about an hour ago. Not sure what they have for my selection. I haven't received any of my parts so I am waiting until I get the system built.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> Keep her.
> I had taken Thief and Tombraider with the silver gift from the 270 and with the gold gift I took Hitman Absolution, Sleeping Dogs and Alan Wake. Thief is a good game. I was disappointed I managed to finish it fairly quickly but its fun.


Alan Wake is pretty great imo; it managed to scare me decently a few times, which horror games rarely do. It's decently impressive graphically, even though it's pretty dated by today's standards, and the storyline is pretty darn good if you like thriller/jumper-type stuff, which I do. The gameplay/mechanics can be a bit cantankerous at times, but manageable... I kinda like it, though; it makes you feel like you have to put effort into doing certain things instead of just "okay, click this, hit that".

I'm really lookin' forward to Thief just cuz I love games that are graphically intense and immersive. Hitman: Absolution is equal parts for the benchmarking ability the game presents and the game itself, cuz every now and then I just wanna play a shooter with a decent plot, which I'm hoping it has.

Tomb Raider I have... on a console I don't even use, cuz I only bought it for GTA V (literally that, and the Halo games I got an itch to replay).


----------



## xer0h0ur

Yeah, Alan Wake seemed right up my alley since it came out, but I just never got around to buying it; plus my system had become grossly outdated by the time that game was released, so I doubt it would have handled it with an ATI X1800 XT. I have always been a fan of horror games like Silent Hill, Doom 3, F.E.A.R. and whatnot.


----------



## crazygamer123

I got Thief, Payday 2 and Murdered: Soul Suspect. Thief is pretty amazing with Mantle support; you can get 68-75 fps at 4K resolution. I finished the game with 3-4 random crashes, not a big deal.


----------



## xer0h0ur

I considered getting Murdered: Soul Suspect until I saw a review on YouTube saying the gameplay is very linear and dumbed down, so I avoided it. Thief only crashed on me once, but that was my own fault for alt-tabbing a lot during gameplay.


----------



## axiumone

Quote:


> Originally Posted by *CrisInuyasha*
> 
> They sent me a UEFI-enabled BIOS, but for the original card, not the OC model. I asked if they have a BIOS for the OC card; now I'm waiting for the response. Anyway, if anyone is interested in the BIOS I received for the Sapphire, here is the link (btw, they only sent me the BIOS and nothing more, but I just used ATI WinFlash and it worked):
> 
> https://dl.dropboxusercontent.com/u/20405147/sapphire_r9_295x2_uefi.zip
> 
> 
> 
> C67301MU.101 for the master card
> C67301SU.101 for the slave card
> 
> Edit: I noticed I lost the ability to monitor / edit voltages with this UEFI bios when using MSI Afterburner


Wanted to thank you for posting these. I had a chance to update my cards to this BIOS this weekend and the boot times are simply incredible.

I haven't looked at Afterburner yet to see if I have the same voltage issue as you.


----------



## electro2u

We can TriFire a 295X2 with a non-X 290 if we want, right? I read somewhere the 290's clocks would be independent but still work in games. I can't find the reference.


----------



## ImperialOne

Quote:


> Originally Posted by *electro2u*
> 
> We can trifire a 295x2 with a 290 nonx if we want right? I read somewhere the clocks of the 290 would be independent but still work in games. I can't find the reference.


True, as long as the card is a 290 or 290X, it's fine. HardOCP published a TriFire performance article on the R9 295X2 in May 2014.


----------



## HoneyBadger84

Quote:


> Originally Posted by *electro2u*
> 
> We can trifire a 295x2 with a 290 nonx if we want right? I read somewhere the clocks of the 290 would be independent but still work in games. I can't find the reference.


You can TriFire pretty much any card with any card from the current generation, but it's ideal to do it with a 290X or 290. Similar shader core counts work best. Ideally you want a 290X, because then the specs match exactly.


----------



## xer0h0ur

It's on like Donkey Kong. Just waiting on some hosing and clamps. Frozencpu.com doesn't play around when they say rush processing. It was lightning fast, and I ordered Friday night.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its on like Donkey Kong. Just waiting on some hosing and clamps. Frozencpu.com doesn't play around when they say rush processing. It was lightning fast and I ordered Friday night.


The guy that bought 2 of my 290Xs off me is probably thinking the same thing... he paid Friday, I shipped Saturday:



I work fast, and so does the Post Office... when it's only shipping about 1000mi. lol


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its on like Donkey Kong. Just waiting on some hosing and clamps. Frozencpu.com doesn't play around when they say rush processing. It was lightning fast and I ordered Friday night.


EVGA suddenly moved my order (1600 G2) to Wednesday, so I can't power my cards yet. If you get all these parts installed, let us know what temps you see.








I'm also busy drilling holes in my Cosmos 2 case for my radiators, since few of the mounting holes line up properly.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> EVGA suddenly moved my order (1600 G2) to wednesday so I can't power my cards. If you got all these parts installed let us know what you get as temps
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also busy drilling holes in my Cosmos 2 case for my radiators since not many of them properly line up.


I know the Cosmos 2 & my case as well both aren't exactly new, but what is up with cases having 120/140mm fan holes, supposedly designed to be liquid-cooling compatible (at least in my case), and then you put a 280mm radiator (H110) up to it and guess what doesn't line up...







The fan screw holes. GRRR!


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I know the Cosmos 2 & my case as well both aren't exactly new, but what is up with cases having 120/140mm fan holes, supposedly designed to be liquid cooling compatible (at least in my case) and then you put a 280mm radiator up to it (H110) and guess what doesn't line up
> 
> 
> 
> 
> 
> 
> 
> The fan screw holes. GRRR!


Exactly, the holes rarely line up ...









In the bottom I was able to fit 2 240mm UT60s without too much trouble, but the 120 at the exhaust fan and the 360 at the top are annoying to install...
The 900D would have been a much wiser pick. However, I still like the Cosmos' look more.


----------



## HoneyBadger84

Rofl, not mine but random picture, dat radiator setup for the cards doe!


----------



## 4K-HERO

Can some of the guys with waterblocks post their temps, please, along with the brand? I was hoping to pick my waterblock based on your experience.


----------



## SaLX

I've got to ask... what kind of shelf life do you expect this card to have as the very best graphics solution - a year maybe?


----------



## ViRuS2k

Quote:


> Originally Posted by *xer0h0ur*
> 
> It just occurred to me since you mentioned you have thick radiators, are you sure that your radiator fans create enough static pressure to be effective on your rads? Also do you use gaskets and/or spacers between the fan and the radiator? Spacers can help eliminate the motor dead spot from the center of the fan and gaskets can create a better seal forcing airflow through the radiator. Some gaskets can do both for you.


Nah, I don't have any of those gaskets etc. Got a link to the gaskets I need? And yeah, it's possible I need high-RPM fans with HIGH static pressure that stay quiet at max speed.
Recommendations would be great.


----------



## axiumone

Quote:


> Originally Posted by *SaLX*
> 
> I've got to ask... what kind of shelf life do you expect this card to have as the very best graphics solution - a year maybe?


Well, we are already 4 months into the life cycle of this card, so I'm guessing another 6 months or so.


----------



## HoneyBadger84

Quote:


> Originally Posted by *SaLX*
> 
> I've got to ask... what kind of shelf life do you expect this card to have as the very best graphics solution - a year maybe?


GTX 880 won't beat it. AMD's 390, whenever it comes, won't beat it. I don't know if Nvidia or AMD plan a delayed dual-GPU release like they did with the current series of cards... TBH I'd expect this thing, and possibly a re-release of it (if that happens, with higher core counts/better cooling), to be on top for at least a year. Its actual shelf life as a very high-quality/high-end card will probably be 3-4 years unless you're playing at 4K.


----------



## xer0h0ur

Quote:


> Originally Posted by *ViRuS2k*
> 
> nah dont have any of those gaskets ect got a link to the gaskets i need and yeah its possible i need QUIET high rpm fans with HIGH static pressure but are quiet at max speed.
> recommendations would be great.


Do you have a preference for your fans? PWM or non-PWM? Size or thickness? The gaskets I am now using are these: http://www.frozencpu.com/products/12588/ex-rad-218/Phobya_Shroud_and_Decoupling_Gasket_120mm_7mm_thickness.html They are thick enough to make a good seal even when the surface isn't perfectly flat around the radiator or fan. I just use spacers to make sure the screws push down hard enough to squeeze the rubber flush against all edges. They are made for different fan sizes and in varying thicknesses, so you shouldn't have a problem finding them.


----------



## Jpmboy

Quote:


> Originally Posted by *Synthaxx*
> 
> EVGA suddenly moved my order (1600 G2) to wednesday so I can't power my cards. If you got all these parts installed *let us know what you get as temps*
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also busy drilling holes in my Cosmos 2 case for my radiators since not many of them properly line up.


With the Koolance block on my 295X2, both GPUs stay under 55C after some REALLY hard use at 1110/1600 with +50mV on AB. I placed a rubber-footed 90mm fan directly on top of the VRMs (sitting on the backside of the PCB), which helps cool them down a bit. Go with Fujipoly Ultra or Extreme on the VRM plate of the water block. Koolance and EK pads are marginal at best.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jpmboy*
> 
> with the Kooloance block on my 295x2 both GPUs stay under 55C after some REALLY hard use at 1110/1600 with +50mV on AB. I placed a rubber-footed 90mm fan directly on top of the VRMs (sitting on the backside of the PCB) which helps cool them down a bit. Go with Fuji poly ultra or extreme on the vrm plate of the water block. Koolance and EK pads are marginal at best.


How do you know that the pads that come with the EK block are marginal at best? Not saying I doubt you; I am just trying to maximize the thermal transfer to the block myself. Is there a comparison/review somewhere of these EK pads versus the Fuji pads you're talking about?


----------



## Jpmboy

Well - I replaced the EK ones on two Kingpins (so far - 3rd one tomorrow) and the VRMs stay a lot cooler (by IR thermometer). Did the same for 2 Titans, 2 780 Ti Classifieds, and a total of 6 Kingpins (of which I still have three). The Fuji Ultra Extreme is simply the best, but very expensive. The 11 W/mK pads are excellent too. Best to use Gelid GC Extreme on the GPUs. Will lower temps quite a lot.









For comparison, the stock Koolance 0.7mm pad is 1.5 W/mK thermal conductivity. Ultra Extreme is 17.

http://www.frozencpu.com/cat/l3/g8/c487/s1290/list/p1/Thermal_Interface-Thermal_Pads_Tape-Thermal_Pad_-_10mm_-Page1.html

edit - if you ask the question on the Titan or 780 threads it will be unanimous.
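
Those W/mK numbers translate directly into how much heat the pad holds back. A rough back-of-envelope sketch in Python - the 10 W load and the 20x20mm contact patch are hypothetical figures for illustration, not measurements; only the 0.7mm thickness and the 1.5 vs 17 W/mK ratings come from the posts above:

```python
# Steady-state conduction across a thermal pad: dT = P * t / (k * A)
def pad_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature drop (C) across a pad at a given heat load."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

P = 10.0         # hypothetical VRM heat load, watts
t = 0.7e-3       # 0.7 mm pad thickness
A = 0.02 * 0.02  # hypothetical 20 x 20 mm contact patch

print(pad_delta_t(P, t, 1.5, A))   # stock Koolance pad (1.5 W/mK): ~11.7 C drop
print(pad_delta_t(P, t, 17.0, A))  # Fujipoly Ultra Extreme (17 W/mK): ~1.0 C drop
```

Same load and same thickness, so the roughly 10x jump in conductivity shrinks the temperature drop across the pad by about the same factor - which is why the expensive pads matter most on the VRMs.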


----------



## xer0h0ur

That isn't even close. That is exponentially better. My only problem then would be getting the right pad thicknesses. Apparently EK's block calls for 0.5mm, 1mm, and 1.5mm pads, most of which are the 0.5mm pads used on the RAM with the block and backplate. Do the Fujipoly Ultra Extreme pads come in those thicknesses? As for the GPU TIM, I had already bought PK-3.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Do the Fujipoly Ultra Extreme pads come in those thicknesses?


http://www.frozencpu.com/search.html?mv_profile=keyword_search&mv_session_id=BShGyvKG&searchspec=fujipoly+ultra&go.x=0&go.y=0


----------



## ViRuS2k

Quote:


> Originally Posted by *xer0h0ur*
> 
> Do you have a preference as for your fans? PWM or non-PWM? Size or thickness? The gaskets I am now using are these: http://www.frozencpu.com/products/12588/ex-rad-218/Phobya_Shroud_and_Decoupling_Gasket_120mm_7mm_thickness.html They are thick enough to make a good seal even when the surface isn't perfectly flat around the radiator or fan. I just use spacers to make sure the screws are pushing down hard enough to squeeze the rubber flush against all edges. They are made for different size fans and in varying thicknesses so you shouldn't have a problem finding them.


Thanks for the information








I will probably go with THICK fans or THIN fans, whatever gives the best performance while staying relatively quiet.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> That isn't even close. That is exponentially better. My only problem then would be getting the right thickness pads. Apparently EK's block calls for 0.5mm, 1mm, and 1.5mm pads. Most of which are the 0.5mm pads used on the RAM with the block and backplate. Do the Fujipoly Ultra Extreme pads come in those thicknesses? As for the GPU TIM I had already bought PK-3.


All the sizes you need are available at FrozenCPU. PK-3 is just as good as Gelid GC Extreme; I use it quite a lot too.









http://www.frozencpu.com/cat/l2/g8/c487/list/p1/Thermal_Interface-Thermal_Pads_Tape.html


----------



## Synthaxx

Quote:


> Originally Posted by *Jpmboy*
> 
> with the Kooloance block on my 295x2 both GPUs stay under 55C after some REALLY hard use at 1110/1600 with +50mV on AB. I placed a rubber-footed 90mm fan directly on top of the VRMs (sitting on the backside of the PCB) which helps cool them down a bit. Go with Fuji poly ultra or extreme on the vrm plate of the water block. Koolance and EK pads are marginal at best.


Very interesting! Which and how many radiators do you use?
Unfortunately, the Fuji Poly is only available here in huge sheets and I would have to buy 3. That would make like $1000 in thermal pads .....


----------



## xer0h0ur

Can someone with a fresh set of eyes take a look at these **** job installation guides EK has for their WB and backplate? There are typos and mistakes everywhere, and now I am confused about whether I even need 1mm-thick pads or not.

http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869079.pdf
http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869093.pdf


----------



## rdr09

Quote:


> Originally Posted by *xer0h0ur*
> 
> Can someone with a fresh set of eyes take a look at these **** job installation guides EK has for their WB and backplate? There are typos and mistakes everywhere and now I am confused if I even need 1mm thick pads or not.
> 
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869079.pdf
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869093.pdf


1. Doing one side at a time: the side of the PCB where the block goes - the No. 2s - uses 1mm.



2. The side where the backplate goes does not use any 1mm.


----------



## xer0h0ur

Quote:


> Originally Posted by *rdr09*
> 
> doing one side at a time, the side of the pcb where the block goes - the No. 2s - use 1mm.


That is what one would think by looking at the diagram with the three pads and the lines showing where they mount on the PCB, but directly above that it says "Replacement thermal pads:2x Thermal Pad A - 0.5mm (16x100mm),2x Pad D - 0.5mm (8xRAM) , 1x Pad C - 1.5mm (25x50mm)", as if both the #1 and #2 pads are 0.5mm. That is why I am confused.


----------



## rdr09

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is what one would think by looking at the diagram with the three pads and the lines going to the PCB showing where they are mounted but directly above that it says "Replacement thermal pads:2x Thermal Pad A - 0.5mm (16x100mm),2x Pad D - 0.5mm (8xRAM) , 1x Pad C - 1.5mm (25x50mm)" as if both #1 and #2 pads are 0.5mm. That is why I am confused.


I see what you mean. It is confusing, but I think it is just stating the quantity of each pad that is included.

You might want to wait for more responses; surely you are not the first.

btw, not to get you more confused . . . have you heard of the Fujipolys?

edit: jp talked about them in post #1678.


----------



## electro2u

I think that's confusing enough I would shoot an email off to EK. They are pretty good about responding quickly.


----------



## axiumone

Quote:


> Originally Posted by *axiumone*
> 
> Wanted to thank you for posing these. I had a chance to update my cards to this bios this weekend and the boot times are simply incredible.
> 
> I haven't look at afterburner yet to see if I have the same voltage issue as you.


Quote:


> Originally Posted by *CrisInuyasha*
> 
> They sent me an UEFI enabled bios but for the original card, not the OC model. I asked if they have a bios for the OC card, now I'm waiting for the response. Anyway, if anyone is interested in the BIOS I received for the Sapphire, here is the link (btw they only sent me the bios and nothing more, but I just used ATI Winflash and it worked):
> 
> https://dl.dropboxusercontent.com/u/20405147/sapphire_r9_295x2_uefi.zip
> 
> 
> 
> C67301MU.101 for the master card
> C67301SU.101 for the slave card
> 
> Edit: I noticed I lost the ability to monitor / edit voltages with this UEFI bios when using MSI Afterburner


A follow-up - I also lost the ability to adjust voltage in Afterburner with this BIOS. It doesn't bother me.


----------



## xer0h0ur

Quote:


> Originally Posted by *rdr09*
> 
> i see what you mean. it is confusing but i think it is just stating the amount of each pads that are included.
> 
> you might want to wait for more responses. surely you are not the first.
> 
> btw, not to get you more confused . . . have you heard of the Fujipolys?
> 
> edit: jp talked about them in post #1678.


That is exactly why I am asking, since these Fujipoly pads are very expensive and the RAM pads are 1mm too narrow, forcing me to buy the larger, more expensive pads and cut them to size. I can get 15 pieces out of one 60x50 sheet but need 32 total, lol. Then there's the 1mm-thick pad if necessary, and the 1.5mm pads are the most expensive and I need two of those. Basically looking at about $140 without shipping. Thassalotta mulah for some damn thermal pads.
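
A quick sanity check on that cutting math in Python. The 20x10mm piece size is a hypothetical that happens to reproduce the 15-per-sheet figure; only the 60x50 sheet and the 32-pad count come from the post:

```python
import math

def pieces_per_sheet(sheet_w, sheet_h, piece_w, piece_h):
    """Rectangular pieces from a simple grid cut (single orientation, no rotation)."""
    return (sheet_w // piece_w) * (sheet_h // piece_h)

per_sheet = pieces_per_sheet(60, 50, 20, 10)  # hypothetical 20 x 10 mm pieces
sheets = math.ceil(32 / per_sheet)            # 32 pads needed
print(per_sheet, sheets)                      # prints: 15 3
```

So even with efficient cutting it's three full sheets, before the 1mm and 1.5mm pads are even counted.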


----------



## rdr09

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is exactly why I am asking since these Fujipoly pads are very expensive and the RAM pads are 1mm too narrow so it forces me to buy the larger more expensive pads and cut them to size. Can get 15 pieces out of one 60x50 sheet but need 32 total lol. Then the 1mm thick pad if necessary and the 1.5mm pads are the most expensive and I need two of those pads. Basically looking at about $140 without shipping. Thassalotta mulah for some damn thermal pads.


i only have a 290 and i used this for the VRAM . . .

http://www.frozencpu.com/products/16878/thr-164/Fujipoly_ModRight_Extreme_System_Builder_Thermal_Pad_Blister_Pack_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html

and have enough left over for one or even two more GPUs. Yeah, the 17 W/mK is more expensive, but for the VRAM even the stock pads will work. The VRMs are the ones that need the 17.

edit: I measured what's left over for my VRAM and I still have 100 x 80 after just one GPU - enough for 2 more GPUs.

edit: BTW, my VRM temps are always lower than my core, and I used CLU for the core.


----------



## xer0h0ur

Yeah, I was going for the Ultra Extreme pad, which is not helping me out cost-wise. The thing is that I am already balls deep into this, so after spending $1500 on the video card, $262 on the waterblock + backplate + fittings, and $107 on the radiator + fittings + fan gasket + extra coolant... I am not about to go mid-range with the pads. If I am going full ****** then by god I am going way off the deep end.

Sadly, though, if Synthaxx's calculations are correct, I might end up spending more money on a third radiator for the loop plus some quick disconnects, and then having to cut into the side panel to mount it.


----------



## rdr09

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I was going for the Ultra Extreme pad which is not helping me out cost-wise. The thing is that I already am balls deep into this so after spending $1500 on the video card, $262 on the waterblock + backplate +fittings, $107 on the radiator + fittings + fan gasket + extra coolant...I am not about to go mid-range with the pads. If I am going full ****** then by god I am going way off the deep end.


Just go with the 11 W/mK; the 17 is for extreme benchers.









edit: my leftover for the VRMs after one application is enough for at least ten more GPUs.


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> i only have a 290 and i used this for the VRAM . . .
> http://www.frozencpu.com/products/16878/thr-164/Fujipoly_ModRight_Extreme_System_Builder_Thermal_Pad_Blister_Pack_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html
> and have enough left over for one more or even two more gpus. yah, the 17W/mk is more expensive but for the VRAM, even the stock will work. *The VRMS are the ones that need the 17.
> *
> edit: i measured what's left over for my VRAM and i still have 100 X 80 after just one GPU. enough for 2 more GPUs.
> edit: BTW, m*y VRM temps are always lower* than my core and i used CLU for the core.


Or, like you said, the 11s are fine. Main concern is the VRMs; the memory will not be a problem with stock EK pads.

@xer0h0ur - the 1mm "short" is not a problem for the memory T-pads (you can cut them to fit).


----------



## xer0h0ur

It's too late, bruh. You just dangled greatness in front of me and I'll be damned if I don't swipe at it.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its too late bruh. You just dangled greatness in front of me and I'll be damned if I don't swipe at it.


lol.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its too late bruh. You just dangled greatness in front of me and I'll be damned if I don't swipe at it.


This... I had to order some myself.


----------



## cennis

http://www.bit-tech.net/hardware/2014/08/06/water-cooling-the-amd-radeon-r9-295x2/1

The review I was waiting for.

EDIT: no mention of VRM temps at all, which is more of a concern here...


----------



## ImperialOne

Quote:


> Originally Posted by *cennis*
> 
> http://www.bit-tech.net/hardware/2014/08/06/water-cooling-the-amd-radeon-r9-295x2/1
> 
> review that I was waiting for.
> 
> EDIT: no mention about vrm temps at all which is more of a concern here...


Bottom Line: markedly improved noise and thermals, a 1% improvement in performance under a strenuous Unigine benchmark.

No thanks. More kudos to AMD.


----------



## cennis

Quote:


> Originally Posted by *ImperialOne*
> 
> Bottom Line: markedly improved noise and thermals, a 1% improvement in performance under a strenuous Unigine benchmark.
> 
> No thanks. More kudos to AMD.


Like I mentioned, it's not all about core temps.

Even at stock clocks some people throttle due to VRM temps. Worse when OC'd.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ImperialOne*
> 
> Bottom Line: markedly improved noise and thermals, a 1% improvement in performance under a strenuous Unigine benchmark.
> 
> No thanks. More kudos to AMD.


Quote:


> Originally Posted by *cennis*
> 
> Like I mentioned, its not all about core temps.
> 
> even at stock clocks some people throttle due to VRM temps. Worse for oc'd


That review also doesn't do it justice because of the tiny radiator used. That thing should have a 360/420mm radiator on it imo.


----------



## xer0h0ur

I seriously am sick of seeing so many reviews where these morons test in unrealistic conditions. How many reviews have you guys seen where these schmucks use open bench rigs that are barely a frame holding a bunch of components? Rarely have I seen reviews that actually bothered to keep the graphics card inside a closed case AND push it at 4K or other high-stress resolutions such as 1440p. As already mentioned, they also rarely even speak of VRM temperatures and throttling.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> I seriously am sick of seeing so many reviews where these morons are using them in unrealistic conditions. How many reviews have you guys seen where these schmucks are using bench rigs where its barely a frame holding a bunch of components. Rarely have I seen reviews where they actually bothered to keep the graphics card inside of a closed case AND push the card at 4K or other high stress resolutions such as 1440p. As already mentioned they also rarely even speak of the VRM temperatures and throttling.


You're right.

I have an Aquacomputer waterblock and a fan over the VRMs, and the temperature never goes over 45 C for the VRMs, VRAM and core, but when I play games the card throttles from 100% to 5-60%, and in some games down to 20%, all the time in almost every game.

It's really annoying, and really bad that none of the reviews mention this issue.
On a $1,500 card this shouldn't happen.


----------



## xer0h0ur

I have read your issues and really your problems have been the worst I have heard of. I don't know of anyone else getting throttling on the level you have. Yours really is an odd case.


----------



## cennis

Quote:


> Originally Posted by *pompss*
> 
> You right.
> 
> I have acquacomputer waterblock and a fan over the vrm's and temperature never goes over 45 C for vrm . vram and core but when i play games the card is throttling from 100% to 5-60% and some games to 20% down all the time in almost all the games.
> 
> this is really annoying and really bad that none it's mention this issues in the reviews.
> On an $1500 card this shouldn't happen.


how are you getting vrm temps?

IR gun or probe?


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> how are you getting vrm temps?
> 
> IR gun or probe?


Probably GPUz.


----------



## xer0h0ur

Except the VRMs don't have temperature diodes on this card, which makes no sense unless they were ashamed of people finding out how hot they get, or of the fact that their liquid cooling solution doesn't cool them.


----------



## cennis

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Probably GPUz.


GPUz does not work for this card


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> Except the VRM's don't have temperature diodes on this card. Something which makes no sense unless they were ashamed of people finding out how hot they get or the fact their liquid cooling solution doesn't cool them.


Not sure if there's no diode or just no software that can monitor it?

If there's no diode, it should not throttle when the VRMs overheat.


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> GPUz does not work for this card


I recall someone saying several readouts did work, if you have a 3rd card in your system (290, 290X, 280X, something that it can crossfire with) *shrug* could be wrong, I don't own it so I have no idea for sure.


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> not sure if no diode or just no software that can monitor it?
> 
> If theres no diode, it should not throttle when vrm overheats


You have a point there. It's just odd to me that my measly 270 showed the diode temperatures in Aida64 Extreme but this card doesn't.









I thought the TriFire advantage was unlocking the fan control on the 295X2 - well, other than the performance, obviously. I don't know about the diode readouts, though.


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> You have a point there. Its just odd to me that my measly 270 showed me the diode temperatures in Aida64 Extreme but this card doesn't.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I thought the tri-firing advantage was unlocking the fan control on the 295x2. Well other than the performance obviously. I don't know about the diode readouts though.


try hwmonitor?


----------



## xer0h0ur

Can't say I have used hwmonitor. I have only used Aida64 Extreme and Afterburner. I could try hwmonitor tonight though.


----------



## xer0h0ur

Anyone have any experience using quick disconnects on their loop? Any suggestions? How about making it easier to drain the loop for maintenance purposes? I figure if I am about to change it up I might as well get this all done at once.


----------



## pompss

Quote:


> Originally Posted by *cennis*
> 
> how are you getting vrm temps?
> 
> IR gun or probe?


I have a temperature sensor which is placed on the VRMs and gives me the temp on my fan controller.
Works pretty well.


----------



## cennis

Quote:


> Originally Posted by *pompss*
> 
> i have temperature sensor witch is place on the vrms and give me the temp. on my fan controller.
> Works pretty good


If this is a temp probe directly on the VRM chips, then that sounds quite low. Maybe your card is not running a high load since you are throttling often, which would explain the low VRM temps.

Even fully stressed, water-cooled 290Xs run close to 60C on the VRMs.


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> if this is a temp probe directly on the vrm chips then it sounds quite low. Maybe your card is not running a high load since you are throttling often, which would explain these low vrm temps.
> 
> even fully stressed water cooled 290x run close to 60c vrm


Weird, I don't think I've ever seen my VRMs get over 60C unless I'm running them OC'd in Quad or TriFire sandwiched configs, and even then it's just barely over 60, with GPU core temps in the mid-70s. But then again, I do run my fans at 100% @ 60C on the GPU core, so there's that...

It's very odd that there are no VRM software readouts for these cards; maybe the programs in question just need an update. There's no way there's physically no diode on the VRMs - AMD wouldn't do something that stupid on a card this expensive.


----------



## xer0h0ur

Did you ever even get your card replaced pompss?


----------



## cennis

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Weird, I don't think I've ever seen my VRMs get over 60C unless I'm running them OCed in Quad or TriFire sandwiched configs, even then it's just barely over 60, with GPU core temps in the mid 70s. But then again I do run my fans at 100% @ 60C on the GPU core, so there's that...
> 
> It's very odd that there's no VRM software readouts for these cards, maybe it's just a software update needed by the programs in question. There's no way there's physically no diode on the VRMs, AMD wouldn't do something that stupid on a card this expensive.


I'm just making a comparison with 290 temps, since he said his 295X2 VRMs are <45C.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you ever even get your card replaced pompss?


No, but I think I will send it back to XFX soon because I get too much throttling.


----------



## pompss

Quote:


> Originally Posted by *cennis*
> 
> if this is a temp probe directly on the vrm chips then it sounds quite low. Maybe your card is not running a high load since you are throttling often, which would explain these low vrm temps.
> 
> even fully stressed water cooled 290x run close to 60c vrm


Yes, it hits 60 C if I remove the fan from the backplate.
With the fan on the VRMs I get 43-45 C in games at 1100 MHz core and 1350 MHz on the memory.

I would suggest putting a fan on the backplate to cool the VRMs down, because the waterblock is not enough.


----------



## xer0h0ur

Yeah I would have punted that card already by now with the kind of throttling you're getting.


----------



## ViRuS2k

I don't get any throttling @ stock 1030/1250, but @ 1150/1690 @ 1.3V I get throttling on the 2nd GPU core.

68C/74C...

It's a shame, because my EK waterblock and backplate do a great job of killing the heat, and the backplate is egg-cooking hot lol, so I guess the heat transferring to the backplate is doing its job.
I might need to invest in some fan gaskets and also some Liquid Metal Pro for the GPU cores, or invest in better RADS. Thought 2x 360 rads would kick it, but not with this card lol


----------



## electro2u

I don't think we're supposed to use any electrically conductive TIM on waterblocks. no?


----------



## HoneyBadger84

Quote:


> Originally Posted by *electro2u*
> 
> I don't think we're supposed to use any electrically conductive TIM on waterblocks. no?


Probably not a good idea, especially with full-coverage blocks like the ones made for 295X2s.


----------



## ViRuS2k

Quote:


> Originally Posted by *electro2u*
> 
> I don't think we're supposed to use any electrically conductive TIM on waterblocks. no?


You can use whatever thermal paste you want on water blocks; they're no different than heatsinks... though generally people tend to go for non-conductive thermal pastes. lol
Would you want to fry an R9 295X2?... Though I am willing to risk it if it's going to knock at least 5C off load temps. I saw how much I knocked off by delidding my Devil's Canyon 4790K - 20+C off load temps.
Though I know GPU cores are different when it comes to heat, so I'm not holding my breath for that much, but 5-8C off would be sweet, and it might stop the card throttling when overclocked.


----------



## Jpmboy

Quote:


> Originally Posted by *cennis*
> 
> http://www.bit-tech.net/hardware/2014/08/06/water-cooling-the-amd-radeon-r9-295x2/1
> 
> review that I was waiting for.
> 
> EDIT: n*o mention about vrm temps* at all which is more of a concern here...


yeah that was a surprise. Beautiful block tho!


----------



## xer0h0ur

Quote:


> Originally Posted by *ViRuS2k*
> 
> you can use what ever thermal paste you want on water blocks there no different than heat sinks... though generally people tend to go for none condictive thermal pastes. lol
> would you want to fry a R9 295x2.... though i am willing to risk it. if its going to knock off at least 5c on load temps. i seen how much load temps i knocked off deliding my devils canyon 4790k 20+c off load temps.
> though i know GPU cores are different when it comes to heat so im not holding my break for that much but 5-8c off would be sweet. and stop the card throttling when overclocked.


You sir are much braver than I. I am willing to kill a warranty to mount a waterblock but not willing to risk a $1500 paperweight.


----------



## Jpmboy

Coollaboratory Liquid Ultra (CLU) liquid metal is great stuff... but a PIA to use more than once.


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> try hwmonitor?


CPUID HWMonitor Pro kept crashing on me like crazy but when it was running it only showed one temperature without saying what it was for. I presumed it was for the first core. I removed it immediately. The non-Pro version was stable but once again only shows a single temperature reading named TMPIN0 for the video card.


----------



## Synthaxx

I finally received my 1600 G2 power supply so I could test my cards out!

After some testing and installing, I noticed that one card gets a quite a bit higher score in Unigine Heaven (2350 vs 1750), with all settings kept identical. I didn't notice any throttling when looking at the graphs.
The lower-scoring card also kind of... stuttered between scenes, taking 2 seconds to start the next scene, while the higher-scoring card ran everything much smoother.

Would that one card be 'bad'?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> I finally received my 1600 G2 power supply so I could test my cards out!
> 
> After some testing and installing, I noticed that 1 card gets quite a bit higher benchmark in Unigine heaven (2350 vs 1750), while keeping all settings identical. I didn't notice any throttling when looking at the graphs.
> The lowest score card also kindoff... stuttered between scenes, taking 2 seconds to start the next scene while the highest score card had everything much smoother.
> 
> Would that 1 card be 'bad'?


Did you reinstall drivers when swapping cards? Sometimes even slight BIOS differences from card to card can cause issues with the drivers if they're not reinstalled... which is why I've done about 15000 driver reinstalls in the last 3 months, since I started swapping 290Xs in and out of my computer case like they were pancakes.

That might help, hope so. If not, I'd say the lower-scoring card may have some issues, yep. Did you check in GPU-Z to see if they have the exact same BIOS versions as each other (assuming they're the same brand)?


----------



## xer0h0ur

Dang, I never thought of a BIOS difference causing problems in quadfire. I will keep that in mind if I ever end up tri-firing another 290X with the 295X2.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dang, I never thought of a BIOS difference causing problems in quadfire. I will keep that in mind if I ever end up tri-firing another 290X with the 295X2.


HoneyBadger is saying that when you have a card installed and you swap it out for another card without reinstalling the drivers, the cards might be identical but have different BIOS versions, and the driver installation can become damaged somehow. I find this to be spot on. Having the same BIOS on all your cards in a multi-GPU config is ideal, but not really critical. Particularly in tri-fire with a 295X2, the 290/X BIOS is going to be totally different.


----------



## xer0h0ur

Right but he also mentioned before about making sure you have the proper BIOS installed on 290X's. Something about making sure the "Subvendor" shows as being the manufacturer/retailer of the card. So in other words if I did get a 290X to tri-fire I would want to make sure it has its proper, most up to date BIOS to avoid any potential issues.


----------



## tsm106

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Dang, I never thought of a BIOS difference causing problems in quadfire. I will keep that in mind if I ever end up tri-firing another 290X with the 295X2.
> 
> 
> 
> HoneyBadger is saying that when you have a card installed and you switch it out for another card without reinstalling the drivers, the cards might be identical but they might have different BIOS versions, and *the drivers installation can become damaged somehow.* I find this to be spot on. Having the same BIOS on all your cards in a multi-GPU config is ideal, but not really critical. Particularly in Tri-Fire with a 295x2 the 290/x BIOS is going to be totally different.
Click to expand...

That's misinformation!!

Stick in another card, new BIOS, new card, whatever: the drivers will configure that GPU on an individual basis. Different BIOSes also don't affect anything in and of themselves.


----------



## ViRuS2k

Well, I just bought these for one of my radiators:

Noctua NF-F12 industrialPPC 3000RPM PWM 120mm High Performance Fan, £17.49 x3 = £52.47
XSPC 360mm Radiator Gasket

I am hoping these fans will reduce full-load temps further, and the gasket should add more static pressure.
If this does not work I will be using Liquid Pro metal on the GPU dies.

Like I said, though, it's only when the card is overclocked that it throttles.
At the default 1030/1300 the card's temps are 56c/64c; those are fine temps, but I want to be able to OC without throttling...


----------



## Synthaxx

So I did what HoneyBadger said, and it seems my benchmark scores are still ridiculous.

I ran a Heaven 4.0 benchmark at 1080p, 4xAA and tessellation normal.
A 780 Ti SLI setup normally does 156.2 FPS on the exact same bench.

My results are:
score: 1649
min FPS: 9.6
max FPS: 133.6
average FPS: 65.4

This must be a bad card then? Or is there a hidden CrossFire checkbox somewhere that I didn't see?
My 3DMark scores are lower than a single Titan's.

Edit: my GPU usage on both core 1 & 2 looks like a bouncing betty, up and down.


----------



## xer0h0ur

That is odd. I would get it exchanged.


----------



## xer0h0ur

Quote:


> Originally Posted by *tsm106*
> 
> That's misinformation!!
> 
> Stick in another card, new bios, new card, whatever the drivers will configure that gpu on an individual basis. Different bios' also doesn't affect anything in and of itself.


No offense but I will take the word of someone who has experienced the issue over how things work in theory.


----------



## tsm106

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's misinformation!!
> 
> Stick in another card, new bios, new card, whatever the drivers will configure that gpu on an individual basis. Different bios' also doesn't affect anything in and of itself.
> 
> 
> 
> No offense but I will take the word of someone who has experienced the issue over how things work in theory.
Click to expand...

Was that intended comedy?









Quote:


> Originally Posted by *xer0h0ur*
> 
> That is odd. I would get it exchanged.


Maybe whoever's advice he followed didn't work out so well?


----------



## xer0h0ur

Quote:


> Originally Posted by *tsm106*
> 
> Was that intended comedy?
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe whomever's advice he followed didn't work out so well?


No I am not here to rustle jimmies or provide comedic relief. Just because HoneyBadger84's issue didn't turn out to be Synthaxx's issue, it doesn't suddenly make HoneyBadger84's issue any less real.


----------



## tsm106

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Was that intended comedy?
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe whomever's advice he followed didn't work out so well?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No I am not here to rustle jimmies or provide comedic relief. Just because electro2u's issue didn't turn out to be Synthaxx's issue, it doesn't suddenly make electro2u's issue any less real.
Click to expand...

Quote:


> Originally Posted by *Synthaxx*
> 
> So I did what honeybadger said and it seems my benchmarking scores are still rediculous.


It reads as comedy to me.


----------



## xer0h0ur

Just like theory, which rarely works out the same in practice. Problems rarely manifest identically across different systems. I never shoot down people's experiences just because I haven't experienced something myself. That would be plain ridiculous.


----------



## tsm106

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's misinformation!!
> 
> Stick in another card, new bios, new card, whatever the drivers will configure that gpu on an individual basis. Different bios' also doesn't affect anything in and of itself.
> 
> 
> 
> *No offense but I will take the word of someone who has experienced the issue over how things work in theory.*
Click to expand...

^^This is comedy!


----------



## xer0h0ur

Alright, I will leave you and your ego alone now. Carry on.


----------



## Synthaxx

The advice was simply to wipe everything off the system, using DDU for the leftover files, and reinstall everything.

Judging by the benchmark results after doing that, not much has really changed.
Any other suggestions as to why my card(s) are not performing well?


----------



## electro2u

Misinformation! Possibly. I was just clarifying someone's statement and saying it sounded plausible to me.

Reinstalling drivers isn't going to hurt anything so you must not have much to do if you are compelled to argue about it.


----------



## tsm106

Quote:


> Originally Posted by *electro2u*
> 
> Misinformation! Possibly. I was just clarifying someone's statement and saying it sounded plausible to me.
> 
> Reinstalling drivers isn't going to hurt anything so you must not have much to do if you are compelled to argue about it.


Man, you guys are stuck in your own lil world. The statement from whomever is misinformation. AMD GPU drivers have been unified for years. That means each driver install sets up the libraries for every GPU in that driver suite. If you change GPUs within the same family, you don't have to reinstall drivers, because they're already installed. Whoever said otherwise obviously does not know this fundamental fact.


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> Misinformation! Possibly. I was just clarifying someone's statement and saying it sounded plausible to me.
> 
> Reinstalling drivers isn't going to hurt anything so you must not have much to do if you are compelled to argue about it.


Yeah, I am an idiot. Since tsm106 quoted you, I used your name instead of HoneyBadger84's. He was the person originally describing his experience with this issue.


----------



## electro2u

Quote:


> Originally Posted by *tsm106*
> 
> Man you guys are stuck in your own lil world.


Lil world vs. Big Giant Fat head--I'll take my small lil world.


----------



## Synthaxx

According to Afterburner, the card stays at 1030MHz core and 1300MHz memory.
The GPU usage, on the other hand, bounces between 10% and 80% all the time, on both the first and second GPU.

Maybe tsm106 can shed some light on my situation? Maybe he can tell us how he would solve the problem.......


----------



## electro2u

Quote:


> Originally Posted by *Synthaxx*
> 
> The card stays according to afterburner at 1030Mhz core and 1300Mhz memory.
> The GPU usage on the other hand goes from 10%-80% all the time, on both the first and the second.


Wait but, you're running 2 295x2s with a 1600 watt psu? I must have that wrong.


----------



## xer0h0ur

Quote:


> Originally Posted by *Synthaxx*
> 
> The card stays according to afterburner at 1030Mhz core and 1300Mhz memory.
> The GPU usage on the other hand goes from 10%-80% all the time, on both the first and the second.
> 
> Maybe TSM106 can shed some light in my situation? Maybe he can tell us how he would solve the problem .......


You know, your issue with this card sounds eerily similar to pompss' problem with his card. Was his also a Sapphire OC edition?


----------



## Synthaxx

Quote:


> Originally Posted by *electro2u*
> 
> Wait but, you're running 2 295x2s with a 1600 watt psu? I must have that wrong.


I'm testing them one by one at the moment. Yes, I will be running two 295X2s with a 1600 watt PSU.


----------



## xer0h0ur

He should be able to run both cards with that PSU as long as he's not getting overzealous with the overclocks system-wide.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> He should be able to run both cards with that PSU as long as he's not getting overzealous with the overclocks system-wide.


Yeah, seems OK.

pompss' issue does sound similar, but from the posts a few pages back it's a little unclear what exactly he means by throttling. Not sure if the GPU utilization is changing for him, or if his card is dropping its clocks. He says % a lot, so I'm thinking he means utilization. He confirmed the same issue I have on FFXIV with the card. I notice both these guys are on Asus Rampage IV boards: Synthaxx is on a R4 Black Edition and pompss is on a R4 Gene.

We know it's not the cards, because both are doing it. We know he has enough power, or it would just shut down.

I'm sure you guys already have the latest motherboard BIOS installed... you guys are not messing around with your gear.


----------



## electro2u

Is there any way to force the card to run in 8x instead of 16x mode on your boards? I mean... it's a dumb idea, but I'm just wondering if it has something to do with PCIe lanes and the PLX chip on the 295x2. You would think it would just bypass the PLX chip if it were running in 16x mode.


----------



## ViRuS2k

Quote:


> Originally Posted by *Synthaxx*
> 
> So I did what honeybadger said and it seems my benchmarking scores are still rediculous.
> 
> I did a heaven 4.0 benchmark in 1080P, 4xaa and tesselation normal.
> Normally a 780ti sli does 156.2 FPS on the exact same bench.
> 
> My results are:
> score: 1649
> min fps: 9.6
> Max fps: 133.6
> average fps: 65.4
> 
> This must be a bad card then? Or is there somewhere a hidden crossfire checkbox I didn't see?
> My 3d mark scores are lower than a single titan.
> 
> edit: my GPU usage on both core 1 & 2 looks like a bouncing betty. up and down












That's running the default 1030/1300....... something is definitely wrong with your system.
And there is no switch on these cards: CrossFire just works if the profile is already in the drivers, or you can force it with RadeonPro, or if you have Windows 8 you can force it in CCC.


----------



## electro2u

Quote:


> Originally Posted by *Synthaxx*
> 
> I'm testing them at this time one by one. Yes I will be running 2 295x2s with a 1600 watt PSU


Try this, too. This fixed some problems up for me.


----------



## xer0h0ur

Oh I almost forgot, did you disable ULPS?

http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html
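
For anyone doing it by hand in regedit instead of through an uninstaller or Afterburner: the tweak boils down to setting `EnableUlps` to 0 under the display adapter keys. A sketch of what the value looks like as a .reg fragment; the `0000`/`0001` adapter subkey varies per system, so search for `EnableUlps` rather than trusting this exact path:

```
Windows Registry Editor Version 5.00

; Example only - the adapter subkey (0000, 0001, ...) differs per machine.
; {4d36e968-...} is the standard display adapter class key.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot afterward, and note driver reinstalls tend to flip it back on.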


----------



## Synthaxx

Quote:


> Originally Posted by *electro2u*
> 
> Try this, too. This fixed some problems up for me.


I have put it at high performance in the past, but it seems it switched to 'power saving'!! My FPS is now at 120+ in Heaven 4.0 ...








Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh I almost forgot, did you disable ULPS?
> 
> http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html


I disabled ULPS now, even though I had already disabled it with the other card; it had changed back to active. Thanks!

I'm getting much higher GPU usage, but still not 100% on both. Could there be anything else?


----------



## xer0h0ur

To be perfectly honest with you, I have never run that benchmark on this particular card. I had only used it once, for shiz and giggles, on my 270, then uninstalled it. I have only benchmarked with 3DMark. I just play games that I know will tax the GPUs and monitor them with Afterburner. If you have Hitman: Absolution, it's a good test at 1600x1200 or higher resolutions with maxed-out ultra settings.


----------



## electro2u

Interesting. So that PCIe link power saving option does have some serious functionality on our boards. It didn't hurt my benches, but it did hurt basic system activities like scrolling around the web. It does save power... but Windows shouldn't make power saving the default!

There are further options for PCIe power saving in the BIOS advanced features.
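
For reference, the same Windows-side settings can be flipped from an elevated command prompt instead of the Control Panel. This is a sketch using powercfg's built-in aliases; run `powercfg /aliases` on your own system first to confirm `SUB_PCIEXPRESS` and `ASPM` exist there:

```
rem Switch to the High performance plan
powercfg /setactive SCHEME_MIN

rem Turn off PCI Express Link State Power Management (AC) on the active plan
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
powercfg /setactive SCHEME_CURRENT
```

The second `setactive` just re-applies the current scheme so the changed index takes effect.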


----------



## xer0h0ur

All of you guys are running the 14.7 Betas, I assume? I'll be sure to check my power settings in BIOS and Windows when I get home. I am also going to give that benchmark another crack tonight. 10-hour work days suck; another 4.5 hours to go







.


----------



## Synthaxx

So, after switching the 'bad' card back in... it averages about 20 FPS less than the 'good' card.
The good one gets about 135 FPS.

Another thing I just noticed is that the bad card makes popping sounds through my surround sound, and the good one does not....


----------



## xer0h0ur

Yeah it sounds like you went 50/50 on the PCB lottery.


----------



## electro2u

Popping sounds... Not cool. What the heck could that even be... The d/a converter? Same slot right? I would probably RMA the bad card if that popping noise over the speakers is easy to reproduce. It's not artifacting?


----------



## xer0h0ur

electro2u brings up a good point. Have you tried the lesser performing card in another PCI-E slot? Or both cards for that matter.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh I almost forgot, did you disable ULPS?
> 
> http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html


I advised him to disable ULPS in the information I gave him, so hopefully he did that









Don't mind TSM he's a hardcore AMD fanboy according to several folks I've talked to on this forum, if you say anything wrong about AMD he'll attack you like a lil' Chihuahua.

Their drivers can, will, and have, for several people, become messed up when swapping cards. Even the exact same type of card with different BIOSes can cause the drivers to get irritated. Uninstalling & using a driver sweeping program to clean everything out ensures this does not happen. Is it COMPLETELY necessary every time? Probably not. Is it better safe than sorry? Yes.

There are people that believe you don't have to reinstall drivers at all, like him, there are people that believe reinstalling drivers repeatedly can actually break your OS (I think those people are nuts), and there are people that only reinstall drivers when upgrading to a newer set, then there's people like me that reinstall drivers whenever swapping out hardware, just to be on the safe side. *shrug* it's as much about being "better safe than sorry" than anything else.

Also, I don't think Synthaxx touched on it here, but he DID have Nvidia files left over on his system from before he installed the AMD hardware & drivers, and he did not originally use a driver cleaning program. Unfortunately cleaning them out didn't fix his issue completely, but they may have led to issues down the line, or given him other issues as well, so it's still good he did as was suggested









To Syn: I would suggest testing like I said in the PMs we've been having: put the no-issues card as your top/primary card & the one acting a bit weird as the lower card, and test in QuadFire to see if you get numbers in line with other QuadFire setups. Keep in mind that a lot of folks running QuadFire have OCs applied when submitting Heaven/Valley/etc FPS/scores, so yours will be lower than theirs if you're at stock. You shouldn't be trying to OC yet anyway, until you figure this issue out and see whether the one card needs RMAing after all.


----------



## xer0h0ur

I am a huge AMD fan to be certain. I just don't let it cloud my judgement with relation to everything else out there. In fact I am pretty pissed at AMD for falling so far behind Intel in the processor game. However they really hit it out of the park with the Hawaii series video cards so I jumped all over this card.

As for tsm, its fine I really don't care. The world would be a very boring place if everyone was like-minded copies of each other.

As for that ULPS setting I really don't know why it turns itself back on sometimes. IMO it should remain disabled unless you do a driver re-install.


----------



## Synthaxx

Quote:


> Originally Posted by *electro2u*
> 
> Popping sounds... Not cool. What the heck could that even be... The d/a converter? Same slot right? I would probably RMA the bad card if that popping noise over the speakers is easy to reproduce. It's not artifacting?


Yeah, it's the same PCIe slot, and the bad one makes cracking/popping noises. I did a quick OC and got it to 1120MHz without throttling or artifacting (the bad one).
I just plugged the card in and it immediately started making these noises.... very weird


----------



## Jpmboy

Quote:


> Originally Posted by *Synthaxx*
> 
> I have put it at high performance in the past, but it seems it switched to 'power saving'!! My FPS is now at 120+ in Heaven 4.0 ...
> 
> 
> 
> 
> 
> 
> 
> 
> I disabled ULPS now, eventhough I disabled it with the other card but it changed back to active. Thanks!
> 
> I got much higher GPU usage, but still not 100% on both. Would there be anything else?


Open CCC and check your frame pacing setting, and disable vsync (I'm sure you knew that







)

NVM... "popping sounds". could be RMA time, but it OCs pretty well... spin it up and see how far it'll go...


----------



## Synthaxx

Quote:


> Originally Posted by *Jpmboy*
> 
> open CCC and check your frame pacing setting, disable vcynch (i'm sure you knew that
> 
> 
> 
> 
> 
> 
> 
> )
> 
> NVM... "popping sounds". could be RMA time, but it OCs pretty well... spin it up and see how far it'll go...


At 1140 it was immediately artifacting like crazy, so I backed it down to 1120 and it was fine.
Even with the 1120MHz OC, it still performed worse than the good card at stock....


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Yeah, It's the same PCIe slot and the bad one makes cracking/popping noises. I did a fast OC and I got to to 1120 MHZ without throttling nor artifacting (the bad one).
> I just plugged the card in and it immediately started making these noises .... very weird


Was this before or after reinstalling drivers with it in the computer? Mayhaps a driver redo would fix it, who knows. What are you running audio out of: your motherboard's ports, a sound card, or the GPU itself?

Edit: Saw your audio in your sig, is that from the motherboard's plugs or a separate sound card?


----------



## Synthaxx

I did the same benchmark as this review here at 1080P:

http://www.kitguru.net/components/graphic-cards/zardon/amd-radeon-r9-295x2-review/9/

The bad card is more than 50 FPS *BELOW* that.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Was t his before or after reinstalling drivers with it in the computer? Mayhaps a driver redo would fix it, who knows. What are you running audio out of, your motherboard's ports, a sound card, or the GPU itself?
> 
> Edit: Saw your audio in your sig, is that from the motherboard's plugs or a separate sound card?


Since I haven't had time yet to take the Z906 out of its box, I used my brother's surround sound.
It's attached to the monitor, so I guess it's indirectly attached to my GPU, no?


----------



## Jpmboy

Quote:


> Originally Posted by *electro2u*
> 
> Is there any way to force the card to run in 8x instead of 16x mode on your boards? I mean... it's a dumb idea but I'm just wondering if it has something to do with PCIE lanes and the PLEX chip in the 295x2. You would think it would just bypass the PLEX chip if it was running in 16x mode.


You cannot bypass the onboard PLX chip. The PCIe bus is not used for CFX on single-PCB dual-GPU cards. Two 295x2s is a different story... in that quad-CFX config, and especially at 4K, you have a large PCIe bus load. Use the concurrent bandwidth test (Google it) to really verify the PCIe bus speed.

Quote:


> Originally Posted by *Synthaxx*
> 
> At 1140 it was immediately artifacting like crazy, so I put it at 1120 and it was fine.
> Even with the 1120Mhz OC, it still performed less than the good card stock ....


wow - can you see crazy throttling in MSI afterburner?


----------



## Synthaxx

Quote:


> Originally Posted by *Jpmboy*
> 
> cannot bypass the onboard plx chip. the PCIE bus is not used for CFX on single pcb-2 gpu cards. Two 295x2's is a different story... in this quad cfx config, and especially at 4K you have a large PCIE bus load. use the concurrent bandwidth test to really verify the pcie bus speed. (google it)
> wow - can you see crazy throttling in MSI afterburner?


There is no throttling at all; it stays at max the whole time.
The difference in performance is simply that the GPU isn't being utilized enough. The good card stayed at 100% utilization in Firestrike; the bad card bounces between 50-90% all the time (on both cores)...


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> I did the same benchmark as this review here at 1080P:
> 
> http://www.kitguru.net/components/graphic-cards/zardon/amd-radeon-r9-295x2-review/9/
> 
> The bad card is more than 50 FPS *BELOW* that.


Let me check that out...

Okay. The settings they use on that site are kinda stupid. They disable tessellation & some other things, which results in Heaven 4.0 being very CPU-biased, meaning their scores are going to be higher or lower than yours based more on the system's raw throughput than on the actual card's performance. That's just silly... also, KitGuru doesn't even clearly list the exact specs of their test system... that's freakin' stupid.

Having said that, just looking at your scores, I can tell you they're definitely off on the "bad" card. I would definitely seek to RMA it at this point, unless it performs the same as the other card in non-Unigine benchmarks like 3DMark11 or 3DMark Firestrike, because it does sound like you got a card with some issues.

Did you ever compare BIOSes on the two cards to see if they match or are different?
Quote:


> Originally Posted by *Synthaxx*
> 
> Since I didn't have time yet to remove the Z906 from the box, I used my brother's surround sound.
> It's attached to the monitor, so I guess it's indirectly attached to my GPU, no?


Sounds like it. If you're intending on using a sound system that you'll hook up to your computer's on board sound, I wouldn't worry as much about the sound aspect, and focus instead on the card's performance issues as a means of determining whether or not it should be RMAed... but right now it's sounding like it should be.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> There is no throttling at all, it stays at max all the time.
> The difference in permormance is simply because the GPU is not utilizing enough. The good card stayed at 100% utilization in firestrike, the bad card goes from 50-90% all the time (on both cores)...


Yeah, I'd say just see if you can RMA the card to where you bought it from for a replacement, it sounds like it's got issues, and you should NOT have to deal with that if it's a brand new card.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Did you ever compare BIOSes ono the two cards to see if they match or are different?


They are identical, I checked from the first switch.


----------



## pompss

Quote:


> Originally Posted by *Synthaxx*
> 
> There is no throttling at all, it stays at max all the time.
> The difference in permormance is simply because the GPU is not utilizing enough. The good card stayed at 100% utilization in firestrike, the bad card goes from 50-90% all the time (on both cores)...


Maybe I have the same problem as you.

Can you post a picture of Afterburner core utilization for the good card and the bad one?

This is mine after Firestrike


----------



## Synthaxx

Quote:


> Originally Posted by *pompss*
> 
> maybe i have your same problem
> 
> Can you post a picture of afterburn core utilization for the good card and the bad one
> 
> This is mine after firestrike


The good one stays above 90% all the time on both cores.
The bad one is just like yours, but yours is even worse than mine...


----------



## xer0h0ur

The thing about looking at core utilization during 3DMark testing is that there are gaps between tests that unload the GPUs. I am not sure whether looping the combined test can keep the GPUs loaded. That is why I use something that keeps a constant load, like a game, instead.
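
A rough illustration with made-up numbers (not from any real log): if you average a utilization trace naively, the idle gaps between sub-tests drag the mean down, so filtering out near-idle samples first gives a better read on the actual in-test load.

```python
# Hypothetical Afterburner-style GPU utilization samples (%), including
# the idle gaps 3DMark leaves between sub-tests.
samples = [98, 99, 97, 5, 2, 96, 99, 3, 98, 97]

# Naive average: the between-test gaps pull it down.
naive_avg = sum(samples) / len(samples)

# Keep only samples where the GPU was actually under load (threshold is arbitrary).
loaded = [s for s in samples if s >= 50]
loaded_avg = sum(loaded) / len(loaded)

print(f"naive average:  {naive_avg:.1f}%")
print(f"loaded average: {loaded_avg:.1f}%")
```

Comparing those two numbers against a known-good card's trace is a fairer way to spot the kind of 50-90% bouncing described above.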


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> The thing about looking at core utilization during 3dmark testing is that there are gaps between tests that unload the GPUs. I am not sure if looping the combined test can maintain the GPUs loaded. That is why I just use something instead that keeps constant load like a game.


If you have the full version of 3DMark, I believe the Demo is actually a great GPU stresser, as it combines tests 1 & 2 as well as some elements in between with very little "down time" for the GPUs. It's about as stressful temperature wise as a loop of Valley, and also neat to watch







I used that to check out QuadFire load temperatures on my system, in addition to 2 loops of Valley. It had fairly high GPU load even across 4 GPUs, so it should keep a R9 295x2 loaded up no problem.

One thing y'all might wanna try, if you haven't, on the cards having issues is the 14.6 RC2 or 14.7 Beta drivers, just to see if they fix the issue. On the off chance they do, you'll know that at least future full-release WHQL drivers won't have the issue anymore, which would mean it's not hardware after all... but, given that in Syn's case one card works better than the other, I'd assume a driver change isn't going to solve much.


----------



## xer0h0ur

Yeah I bought it on Steam a while back. Figures I wouldn't know that since I never run the demo lol. I always bench without the demo.


----------



## pompss

Quote:


> Originally Posted by *Synthaxx*
> 
> The good one stays above 90% all the time on both cores.
> The bad one is just like yours, but yours is even worse than mine...


Can you run Valley or Heaven and send me a screenshot, like mine earlier, of the GPU utilization on the good one?
I think I have to RMA my card ASAP.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I bought it on Steam a while back. Figures I wouldn't know that since I never run the demo lol. I always bench without the demo.


Most people bench without the demo, but running the demo itself as a stress test is pretty nice. It keeps a fairly constant, fairly high GPU load for roughly twice as long as the two graphics tests & the combined test put together, with no loading screens or pauses in between







I didn't realize it either until someone wanted to see what my new setup could do, and I ran the Demo just to see how it looked, since I knew it was pretty awesome. It has sound too, so it's actually more impressive to watch (as a display/showcase piece) than the standard soundless benchmark tests.


----------



## xer0h0ur

You know what, thermal throttling is both a curse and a blessing now that I think about it. It's a curse for those keeping the stock cooling system, but it's a blessing for those slapping a full-coverage waterblock on, since we know our cores are staying loaded enough to reach those throttling temps.


----------



## pompss

Guys, is anyone playing Wolfenstein???

I get this up-and-down on both cores and I'm just trying to figure out whether it's normal or not.
I get 60 FPS with both GPUs working.


----------



## ViRuS2k

My Firestrike, i7 Devil's Canyon @ 4.8



http://i.imgur.com/ltccXXd.jpg


----------



## axiumone

Why would you use precision for the 295x2? Why would you use precision at all?


----------



## ViRuS2k

Quote:


> Originally Posted by *axiumone*
> 
> Why would you use precision for the 295x2? Why would you use precision at all?


It's just a skin, man. Chill.


----------



## Jpmboy

That's 3DMark 11... I can't find a FireStrike run with the performance monitor in the picture.



FS


----------



## xer0h0ur

I have no idea how to use the Heaven benchmark. I also don't like that there isn't a default setup people use. If people change the settings or bench at a different resolution, then it's useless to compare. The entire idea of benchmarking is keeping the test the same, with only people's systems being different.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I advised him to disable ULPS in the information I gave him, so hopefully he did that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't mind TSM he's a hardcore AMD fanboy according to several folks I've talked to on this forum, if you say anything wrong about AMD he'll attack you like a lil' Chihuahua.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Their drivers can, will, and have, for several people, become messed up when swapping cards. Even the exact same type of card with different BIOSes can cause the drivers to get irritated. Uninstalling & using a driver sweeping program to clean everything out ensures this does not happen. Is it COMPLETELY necessary every time? Probably not. Is it better safe than sorry? Yes.
> 
> There are people that believe you don't have to reinstall drivers at all, like him, there are people that believe reinstalling drivers repeatedly can actually break your OS (I think those people are nuts), and there are people that only reinstall drivers when upgrading to a newer set, then there's people like me that reinstall drivers whenever swapping out hardware, just to be on the safe side. *shrug* it's as much about being "better safe than sorry" than anything else.
> 
> Also, I don't think Syntax touched on it here, but he DID have NVidia files left over in his system from before he installed the AMD hardware & drivers, and did not originally use a driver cleaning program, unfortunately that didn't fix his issue completely, but it may have led to issues down the line, or given him other issues as well, so it's still good he did as was suggested
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To Syn: I would suggest testing like I said in the PMs we've been having, put the no-issues card as your top/primary card & the one acting a bit weird as the lower card, and test in QuadFire, see if you get numbers that are in line with other QuadFire setups. Keep in mind that a lot of folks running QuadFire have OCs running when submitting Heaven/Valley/etc FPS/scores, so yours will be lower than theirs if you're stock, which you shouldn't be trying to OC yet til you figure this issue out and see if the one card needs RMAing afterall.


I would call you an idiot but I think you're doing a pretty good job of it all on your own.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/28040_40#post_22672453
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/28080_40#post_22672559

Quote:


> Originally Posted by *xer0h0ur*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I am a huge AMD fan to be certain. I just don't let it cloud my judgement with relation to everything else out there. In fact I am pretty pissed at AMD for falling so far behind Intel in the processor game. However they really hit it out of the park with the Hawaii series video cards so I jumped all over this card.
> 
> 
> As for tsm, its fine I really don't care. The world would be a very boring place if everyone was like-minded copies of each other.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> As for that ULPS setting I really don't know why it turns itself back on sometimes. IMO it should remain disabled unless you do a driver re-install.


That's fair enough and commendable tbh. But you know, I wasn't laughing at you, but the person you thought was more experienced. You guys in this thread need to venture out more. I don't think you've been here long enough to know who's who on OCN.


----------



## tsm106

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *axiumone*
> 
> Why would you use precision for the 295x2? Why would you use precision at all?
> 
> 
> 
> *Its just a skin man*, chill

You have not been following along with all the news and info regarding Precision, huh? As recently as a few weeks ago, EVGA basically ripped Afterburner line for line, because EVGA was either too cheap or barred from receiving all of Unwinder's new code, the code that brought 64-bit OSD support and all his recent developments. EVGA was also terribad at reverse-engineering it, so PrecisionX v25 or whatever it was ended up being a big pile of turd. It's been pulled, afaik. I don't really keep up with Precision news.

Thus, in short it is not just a skin.


----------



## HoneyBadger84

Lol love that he linked to edited versions of his posts where he's slightly less of a jerk to me. Like I said, if you say anything bad about AMD, he'll attack you, I have him on block for a reason. Commenting on "who's who" just goes to show how self absorbed and full of himself he is.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Lol love that he linked to edited versions of his posts where he's slightly less of a jerk to me. Like I said, if you say anything bad about AMD, he'll attack you, I have him on block for a reason. Commenting on "who's who" just goes to show how self absorbed and full of himself he is.


Lmao, if calling you out for posting stuff you made up is attacking then so be it. And I didn't edit anything. Read the damn fine print.
Quote:


> Edited by Arizonian - Yesterday at 10:27 pm View History


You must have written something stupid or wanted to hide something so the mod changed your posts, not mine.









See that "view history" link; I can see what he edited and it's not what I typed. Hmm, or maybe he changed the large bold font I used in quoting you lol.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have no idea how to use the Heaven benchmark. I also don't like that there isn't a default setup people use. If people change the settings or use a different resolution benching then its useless to compare. The entire idea of benchmarking is keeping the test the same with only people's systems being different.


Most people use Extreme HD preset as the benchmark setting of choice. I have no idea why kitguru didn't.


----------



## xer0h0ur

Quote:


> Originally Posted by *tsm106*
> 
> That's fair enough and commendable tbh. But you know, I wasn't laughing at you, but the person you thought was more experienced. You guys in this thread need to venture out more. I don't think you've been here long enough to know who's who on OCN.


It's a shame I have no interest in who's who. The last place I look for validation from large egos is the internet. *shrug*


----------



## tsm106

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's fair enough and commendable tbh. But you know, I wasn't laughing at you, but the person you thought was more experienced. You guys in this thread need to venture out more. I don't think you've been here long enough to know who's who on OCN.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its a shame I have no interest in who's who. The last place I look for validation from large egos is on the internet. *shrug*

Make up your mind then. You started it by saying I wasn't experienced and that you would not trust what I write over someone you thought was greatly experienced. Maybe you shouldn't judge others so hastily? You brought up the question of integrity and now you say it's a beef over egos. Shakes head...


----------



## xer0h0ur

You're borderline psychotic. I said I trust an experience someone has actually had over theoreticals and/or how things are supposed to work. I never said you or he or anyone else, for that matter, is any more or less experienced. I don't know if you just like to manipulate words or are just an evolved troll. Either way, have fun with that.


----------



## Synthaxx

I did some firestrike demo runs and it seems the GPU utilization is getting better.

The score with a 4.6 GHz processor and a stock card:
http://www.3dmark.com/3dm/3742983?

The score with a 4.6 GHz processor and an OC'd card:
http://www.3dmark.com/3dm/3743057?

The utilization now mostly stays above 75% during FireStrike and Valley.

The Valley score is 5488 with an average fps of 131.2.
High graphics and 4xAA.


----------



## electro2u

Quote:


> Originally Posted by *Synthaxx*
> 
> I did some firestrike demo runs and it seems the GPU utilization is getting better.
> 
> The score with a 4.6 Ghz processor with a stock card is:
> http://www.3dmark.com/3dm/3742983?
> 
> The score with a 4.6 Ghz processor with a OC card is:
> http://www.3dmark.com/3dm/3743057?
> 
> The utilization now mostly stays above 75% during firestrike and valley.
> 
> The valley score is 5488 with an average fps of 131.2.
> High graphics and 4x aa


Those are good/normal scores for a single 295x2.


----------



## Synthaxx

Quote:


> Originally Posted by *electro2u*
> 
> Those are good/normal scores for a single 295x2.


Well, I was really amazed... The first time I benched the card I got 7400 as a score...


----------



## Synthaxx

Just did a Heaven 4.0 bench with the card OC'd. Getting 55 fps for some reason...


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Just did a Heaven 4.0 bench with the card OC'd. Getting 55 fps for some reason...


You running it on the Extreme HD preset? Or am I thinking of Valley having that? That's ridiculously low fps. Like I said previously, I get about 61 FPS average on Heaven with a single card at bone stock:
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yep, 1080p, 4xAA is what he ran it at so that's what I ran it at, with Extreme Tess & the highest quality setting.
> 
> Here's 14.6 RC2 Drivers:
> 
> 
> 
> FPS: 61.7
> Score: 1554


----------



## ViRuS2k

Quote:


> Originally Posted by *tsm106*
> 
> You have not been following along with all the news and info regarding Precision, huh? As recently as a few weeks ago, EVGA basically ripped Afterburner line for line, because EVGA was either too cheap or barred from receiving all of Unwinder's new code, the code that brought 64-bit OSD support and all his recent developments. EVGA was also terribad at reverse-engineering it, so PrecisionX v25 or whatever it was ended up being a big pile of turd. It's been pulled, afaik. I don't really keep up with Precision news.
> 
> Thus, in short it is not just a skin.


No comment, as I am under an NDA as an MSI beta tester.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> You running it on the extreme HD preset? Or am I thinking of Valley that has that? That's ridiculously low fps. Like I said previously, I get about 61FPS Average single card with the card at bone stock on Heaven, single card:


It was custom.

1920x1080
normal tess
4xaa
high graphics (not ultra!)

The 55 fps is indeed ridiculous. The GPU barely gets utilized at all in Heaven...


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> It was custom.
> 
> 1920x1080
> normal tess
> 4xaa
> high graphics (not ultra!)
> 
> The 55 fps is indeed ridiculous. The GPU barely gets utilized at all in Heaven...


Still on 14.4s? Have you tried the 14.6RC2s or 14.7 Betas?

I'd be getting an RMA started either way; the bad card sounds like more trouble than it's worth. How does the 'good one' do?


----------



## ViRuS2k

better view:

http://i.imgur.com/wXqNEGE.jpg

Heaven 4.0, ultra preset @ 1170/1690. I forgot to up the power target, so GPU usage is almost maxed out; definitely headroom for more performance.
4xAA, ultra preset, full screen @ 1080p.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Still on 14.4s? Have you tried the 14.6RC2s or 14.7 Betas?
> 
> I'd be getting an RMA started either way, this card sounds like more trouble than it's worth as far as the bad one goes. How's the 'good one' do?


I'm using 14.7.

It's getting worse again. I also noticed that Windows changed itself to power saving...
I switched it back (again) but it remains bad.
FireStrike doesn't run smoothly anymore (~8000 score), same with Valley.


----------



## electro2u

Quote:


> Originally Posted by *Synthaxx*
> 
> I'm using 14.7.
> 
> It's getting worse again. I also noticed that Windows changed itself to power saving...
> I switched it back (again) but it remains bad.
> FireStrike doesn't run smoothly anymore (~8000 score), same with Valley.


Because you keep changing bios/cards... Afterburner is getting confused.


----------



## Synthaxx

Quote:


> Originally Posted by *electro2u*
> 
> Because you keep changing bios/cards... Afterburner is getting confused.


I haven't changed a thing and my scores suddenly dropped again. Really strange.


----------



## Synthaxx

I just put them in a QuadFire configuration and oh boy...

The GPU utilization on both cards is going crazy. I'm getting lower FPS than I would get from one card. There is a huge amount of stuttering and lagging.
The noises out of my surround got much worse.

What should I do now?


----------



## HoneyBadger84

RMA the card that was performing worse to begin with, clean the drivers, and reinstall them with just the better of the two cards installed. See if the issues go away or not.


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> I just put them in a QuadFire configuration and oh boy...
> 
> The GPU utilization on both cards is going crazy. I'm getting lower FPS than I would get from one card. There is a huge amount of stuttering and lagging.
> The noises out of my surround got much worse.
> 
> What should I do now?


You're running in surround? Portrait or landscape?

Edit - You should have read the whole thread before picking up two of these cards for surround. There's nothing wrong with your cards. Surround with two of these cards just doesn't work. The drivers are broken. I tried to warn people.

Look at this post from earlier - http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/1200_100#post_22523222


----------



## HoneyBadger84

I seem to recall him running 4K, but I could be wrong; I don't think he's running Eyefinity at the moment. If he is, QuadFire is known to have issues with Eyefinity currently, and AMD is "working on it" so to speak. That wouldn't affect his 1080p scores on Heaven & Valley, or FireStrike though, so there's definitely something up with at least one of the cards for sure.

Edit: yep, according to his signature he's running a single 4K monitor, so by surround he's referring to speakers.


----------



## axiumone

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Seem to recall him being running 4K but I could be wrong, don't think he's running Eyefinity at the moment. If he is, QuadFire is known to have issues with Eyefinity currently, and AMD is "working on it" so to speak. That wouldn't effect his 1080p scores on Heaven & Valley, or FireStrike though, so there's definitely something up with at least one of the cards for sure.
> 
> Edit: yep according to signature he's running 4K single monitor, so by surround he's refering to speakers.


I realized that after as well. I have sigs disabled, can you tell me which monitor it is? It's not listed in his rig profile.


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> I realized that after as well. I have sigs disabled, can you tell me which monitor it is? It's not listed in his rig profile.


I don't have the PB287Q yet; they'll send it to me on the 22nd of August.
Currently I've hijacked my brother's room and took his 1920x1080 monitor and surround sound.

After rechecking, it seemed my motherboard BIOS was not up to date.
I did a new FireStrike run, and I'm surprised...

http://www.3dmark.com/3dm/3744661?


----------



## HoneyBadger84

Quote:


> Originally Posted by *axiumone*
> 
> I realized that after as well. I have sigs disabled, can you tell me which monitor it is? It's not listed in his rig profile.


It's the Asus 4K monitor that ends in a Q.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> I dont have the PB287Q yet, the 22nd of august they'll send it to me.
> Currently I hijacked my brother's room, took his 1920x1080 monitor and surround sound.
> 
> After rechecking, it seemed my motherboard bios was not up to date.
> I did a new FireStrike run, and I'm surprised...
> 
> http://www.3dmark.com/3dm/3744661?


I probably should've asked earlier if you'd updated your motherboard BIOS, but I'm glad I asked. I know updating mine helped a bit with TriFire & QuadFire.

Hopefully that fixes ALL of your issues. Test some stuff and hit us back when you know for sure.

You should also look into whether your onboard audio has a driver update; if it does, that may help with the audio issues. It would be on the motherboard's support website as well, same place you got the BIOS update.


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> I dont have the PB287Q yet, the 22nd of august they'll send it to me.
> Currently I hijacked my brother's room, took his 1920x1080 monitor and surround sound.
> 
> After rechecking, it seemed my motherboard bios was not up to date.
> I did a new FireStrike run, and I'm surprised...
> 
> http://www.3dmark.com/3dm/3744661?


Yeah, that's in line with what I was getting. - http://www.3dmark.com/fs/2229674


----------



## ViRuS2k

Quote:


> Originally Posted by *Synthaxx*
> 
> I dont have the PB287Q yet, the 22nd of august they'll send it to me.
> Currently I hijacked my brother's room, took his 1920x1080 monitor and surround sound.
> 
> After rechecking, it seemed my motherboard bios was not up to date.
> I did a new FireStrike run, and I'm surprised...
> 
> http://www.3dmark.com/3dm/3744661?


Not very good price-to-performance, is it? With one card I can get 19k,
and with two you get 25k.

That's a 6% increase for
loads more power on the electricity bill.

For that trade-off in price and performance you were royally screwed lol.
You would have been better off TriFiring instead of QuadFiring, and the performance will be even worse for you when gaming rather than running synthetics.


----------



## axiumone

Quote:


> Originally Posted by *ViRuS2k*
> 
> Not very good price-to-performance, is it? With one card I can get 19k,
> and with two you get 25k.
> 
> That's a 6% increase for
> loads more power on the electricity bill.
> 
> For that trade-off in price and performance you were royally screwed lol.
> You would have been better off TriFiring instead of QuadFiring, and the performance will be even worse for you when gaming rather than running synthetics.


Dual gpu cards were never really a value proposition. It's more about convenience.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ViRuS2k*
> 
> Not very good price-to-performance, is it? With one card I can get 19k,
> and with two you get 25k.
> 
> That's a 6% increase for
> loads more power on the electricity bill.
> 
> For that trade-off in price and performance you were royally screwed lol.
> You would have been better off TriFiring instead of QuadFiring, and the performance will be even worse for you when gaming rather than running synthetics.


3DMark FireStrike, like all 3DMark tests, doesn't scale straight up with GPU performance alone, because the overall score includes the CPU benchmark results, which do not increase. What you should be comparing is the Graphics Score, which is where the big difference shows up. Also, your math is off: 25K is about a 32% increase in performance over 19K.









For instance:

His QuadFire Graphics Score: 39362 ( http://www.3dmark.com/3dm/3744661 )
My Single 290X Graphics Score: 11276 ( http://www.3dmark.com/fs/2524238 )

So in that area, he's very much getting excellent scaling.
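To make the comparison concrete, here's a quick sketch of the math (Python; `percent_increase` is just an illustrative helper, and the numbers are the scores quoted above):

```python
# Percentage gain between two benchmark scores.
def percent_increase(old, new):
    return (new - old) / old * 100

# Overall FireStrike score: the CPU portion doesn't scale, so the gain looks modest.
overall = percent_increase(19_000, 25_000)

# Graphics Score: QuadFire 295X2s vs a single 290X, i.e. pure GPU scaling.
graphics = percent_increase(11_276, 39_362)

print(f"overall score gain:  {overall:.0f}%")   # ~32%, not 6%
print(f"graphics score gain: {graphics:.0f}%")  # ~249%, i.e. ~3.5x one card
```

Same raw numbers, very different picture depending on which score you compare.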


----------



## Synthaxx

Quote:


> Originally Posted by *ViRuS2k*
> 
> Not very good price-to-performance, is it? With one card I can get 19k,
> and with two you get 25k.
> 
> That's a 6% increase for
> loads more power on the electricity bill.
> 
> For that trade-off in price and performance you were royally screwed lol.
> You would have been better off TriFiring instead of QuadFiring, and the performance will be even worse for you when gaming rather than running synthetics.


Yes, but the Graphics Score really increased a lot! It's almost 40K.

Of course, when buying a system like this, price-to-performance hardly matters.

My sound issues have also miraculously disappeared... so far so good.


----------



## HoneyBadger84

And just in case you're wondering:

QuadFire 290Xs OCed to 1150/1450 with my CPU @ 4.6GHz = Graphics Score: 40966 ( http://www.3dmark.com/fs/2513592 ) so yeah, his performance is now right where it should be, as far as FireStrike is concerned at least.


----------



## ViRuS2k

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 3DMark FireStrike, as with all 3DMark scores, doesn't scale straight up with just GPU performance increasing because it does include CPU benchmark stats, which do not increase. What you should be comparing is the Graphics Score, which is where the big difference will be seen. Also your math is off, 25K is about a 32% increase in performance from 19K
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For instance:
> 
> His QuadFire Graphics Score: 39362 ( http://www.3dmark.com/3dm/3744661 )
> My Single 290X Graphics Score: 11276 ( http://www.3dmark.com/fs/2524238 )
> 
> So in that area, he's very much getting excellent scaling.


Yeah, my math was off, but think about it: let's say for instance I get 25k in graphics score and he gets 39k.

Try applying that methodology to games and not benchmarks.
The scaling will not be great at all; everyone knows Quad is not optimized for **** whereas TriFire is, outside synthetics that is.

Though I know one thing for sure: I won't be getting another R9 295X2 even though I could if I wanted. I'll be getting a single R9 290X instead and TriFiring. The issues with Quad far outweigh the issues with Tri.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> And just in case you're wondering:
> 
> QuadFire 290Xs OCed to 1150/1450 with my CPU @ 4.6GHz = Graphics Score: 40966 ( http://www.3dmark.com/fs/2513592 ) so yeah, his performance is now right where it should be, as far as FireStrike is concerned at least.


Heaven 4.0
1920x1080, tess extreme, ultra graphics and 4x aa = 190 fps

Valley
1920x1080, ultra graphics and 4x aa = 128.1 fps


----------



## Synthaxx

After doing some more Valley runs, I always seem to get the same score.

0xAA: 5580
4xAA: 5586
8xAA: 5579

Is that a CPU limitation?


----------



## HoneyBadger84

At 4K, QuadFire actually scales very well across a lot of titles. You should check out Newegg's review of 290Xs in QuadFire; they did a bit of 4K & 3x 2560x1600 testing, and they also did some game testing with 295X2 QuadFire at 4K:

(benchmarks are towards the end of the video, about 8:30 in)

Edit: have to say I disagree with his opinion of 4-Way vs QuadFire. I think it's QuadFire regardless. Nvidia made the distinction of Quad SLI vs 4-Way SLI; AMD never really said anything one way or the other, and people are just assuming they'd use the same terminology. I prefer Crossfire, TriFire & QuadFire.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> After doing some more Valley runs, I always seem to get the same score.
> 
> 0xa: 5580
> 4xaa: 5586
> 8xaa: 5579
> 
> Is that a CPU limitation?


CPU limitation, plus your GPUs laughing at how easy 1080p is. Wait til you run 4K; you'll notice the difference then when applying AA. Right now your system is CPU-bottlenecked at 1080p, so applying AA isn't going to do jack score-wise. It'll just raise your GPU load, since your GPUs have power to spare at that resolution.
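That "same score at every AA level" pattern is actually a handy bottleneck test. A minimal sketch (Python, using the Valley scores posted above):

```python
# If the score spread across AA settings is tiny, the GPUs aren't the
# limiting factor -- something upstream (the CPU here) is.
scores = {"0xAA": 5580, "4xAA": 5586, "8xAA": 5579}

spread = (max(scores.values()) - min(scores.values())) / min(scores.values())
cpu_bound = spread < 0.02  # under ~2% movement despite 8x the AA work

print(f"spread: {spread:.2%}, CPU-bound: {cpu_bound}")  # ~0.13%, True
```

The 2% threshold is just a rule of thumb; the point is that cranking AA should cost fps when the GPUs are the limiter, and here it clearly doesn't.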


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> CPU Limitation & your GPUs laughing at how easy 1080p is. Wait til you run 4K, you'll notice the difference then when applying AA. Right now your system is CPU-bottlenecked at 1080p, so applying AA isn't going to do jack score wise, it'll just raise your GPU load, as your GPUs have power to spare at that resolution.


Ohh, so that's where the yawning was coming from...


----------



## HoneyBadger84

I have to say, the wires on the LEPA G 1600W are some of the worst PSU wires I've ever dealt with. I know thicker wire is a good thing, but these things are borderline un-arrangeable in many cases... Right now I just have all the wires routed up to the cards in a straight line (all together). It looks pretty crappy, but at least the PSU itself seems perfectly capable of handling this system, both in the rail arrangement I picked and in actually powering it. I just ran a full pass of 3DMark FireStrike Extreme and got up to 1360W at the wall with the GPUs at stock, so not too bad. Still got PSU headroom if I ever wanna do some OC runs again.

I really don't like this power supply though, and I'm honestly debating sending it back. There's the wire issue, plus the fact that it doesn't have an off switch (which I noticed no one really mentioned in any of the reviews, though I should've just looked at the darn pictures TBH), so the only way to power off the computer completely is to literally unplug it, which I HATE. Then there's how the 8-pin PCI-E wires work, and how silly the CPU wires are if you need both the 8 & 4 pin, which I do... I didn't even try to plug in the EZ Plug 4-pin FDD plug. I ran the wire, saw how much of a pain in my arse it was gonna be to route correctly, and pulled it back out of the system immediately. I honestly don't think I need it for regular operation unless the cards are running full-throttle high draw, which they're not, and won't be until I get 4K, by which point I'll probably be on a 295X2 + 2 290Xs instead, which will slightly lessen the strain on the motherboard/PCI-E slots since one would no longer be occupied.

Kinda wishing I'd just stuck with the dual-PSU setup, but I THOUGHT the LEPA G would be better quality than this. Don't get me wrong, so far the PSU itself performs well... but the cables and other non-power-related issues are just stupid for a ~$300 unit to have.
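For what it's worth, that wall reading can be turned into a rough headroom estimate. A sketch assuming ~88% efficiency at that load (the efficiency figure is an assumption; the real number depends on the unit's actual efficiency curve):

```python
# Wall draw includes conversion losses; the PSU's 1600W rating is DC output,
# so convert before comparing the two.
wall_draw_w = 1360      # measured at the wall, GPUs at stock
efficiency = 0.88       # assumed; check a real efficiency curve for this unit
psu_rating_w = 1600

dc_load_w = wall_draw_w * efficiency        # power actually delivered to the system
headroom_w = psu_rating_w - dc_load_w       # margin left for OC runs

print(f"DC load ~{dc_load_w:.0f}W, headroom ~{headroom_w:.0f}W")
```

Under that assumption the PSU is delivering roughly 1200W with about 400W to spare, which matches the "still got headroom" read above.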


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> You have not been following along with all the news and info regarding Precision, huh? As recently as a few weeks ago, EVGA basically ripped Afterburner line for line, because EVGA was either too cheap or barred from receiving all of Unwinder's new code, the code that brought 64-bit OSD support and all his recent developments. EVGA was also terribad at reverse-engineering it, so PrecisionX v25 or whatever it was ended up being *a big pile of turd*. It's been pulled, afaik. I don't really keep up with Precision news.
> *Thus, in short it is not just a skin*.


that's for sure !!

Quote:


> Originally Posted by *ViRuS2k*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> better view:
> http://i.imgur.com/wXqNEGE.jpg
> 
> Heaven 4.0, ultra preset @ 1170/1690. I forgot to up the power target, so GPU usage is almost maxed out; definitely headroom for more performance.
> 4xAA, ultra preset, full screen @ 1080p.


You are using 4xAA; for comparative scores, use 8xAA. It will drop your 130+ fps by a significant amount.
Spin 'er up and post your score to that thread per the entry requirements in the OP.


----------



## xer0h0ur

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I have to say, the wires on the LEPA G 1600W are some of the worst PSU wires I've ever dealt with. I know that thicker wire is a good thing, but these things are borderline un-arrangeable in many cases... Right now I just have all the wires routed up to the cards in a straight line (all together), looks pretty crappy, but at least the PSU itself seems to be perfectly capable of handling this system both in terms of the rail arrangement & picked, and actually powering it. Just ran a full run of 3DMark FireStrike Extreme & got up to 1360W at the wall, so not too bad, with the GPUs at all stock. Still got PSU headroom if I ever wanna do some OC runs again.
> 
> I really don't like this power supply though. Honestly debating sending it back. The wire issue, on top of the fact that it doesn't have an off switch (which I noticed no one really mentioned in any of the reviews, but I should've just looked at the darn pictures TBH), so the only way to "power off" the computer completely is to literally unplug it, which I HATE, and the fact of how the 8-pin PCI-E wires on it work, as well as how silly the CPU wires are if you need both the 8 & 4 pin, which I do... I didn't even try to plug in the EZ Plug 4-pin FDD plug... ran the wire, saw how much of a pain my arse it was gonna be to run it correctly, and pulled it back out of the system immediately, as I honestly don't think I need it for regular operation unless the cards are being run full throttle high draw, which they're not, and won't be until I get 4K, by which point I'l probably be on a 295x2 + 2 290Xs instead, which will lessen the strain on the motherboard/PCI-E slots, slightly, since one would no longer be occupied.
> 
> Kinda wishing I'd just stock with the dual PSU setup but I THOUGHT the LEPA G would be better quality than this. Don't get me wrong, thus far the PSU itself has good performance... but the cord & other non-power related issues it has are just stupid for a ~$300 unit to have.


I hear ya. I had the same issue with the wiring on my PSU; some of the wires are brutally stiff, like the 24-pin motherboard cable. I also had a problem with my 8-pin CPU cable being too short, so I had to buy an extension to keep it out of the way of the 295X2's radiator. Now I can run, and am running, push/pull on the rad. It makes enough of a difference that I'm not thermal throttling anymore like I was when playing Hitman Absolution at 1600x1200 with everything maxed out on Ultra settings. Dropped the temps like 3-5C.


----------



## Rohandy

Quote:


> Originally Posted by *ViRuS2k*
> 
> I don't get any throttling @ stock 1030/1250, but @ 1150/1690 @ 1.3v I get throttling on the 2nd GPU core.
> 
> 68c/74c...
> 
> It's a shame, cause my EK water block and backplate do a great job of killing the heat; the backplate is egg-cooking hot lol, so I guess the heat transferring to the backplate is doing its job.
> I might need to invest in some fan gaskets and some Liquid Metal Pro on the GPU cores, or invest in better rads. I thought 2x 360 rads would kick it, but not with this card lol.


Hey ViRuS2k, I was wondering if the Noctua fans and spacer worked out better for you? It seems something might have been wrong with the block/backplate application. I know you're overclocking your card, but even at stock your temps seem pretty high.

I had a few suggestions, as I'm running one R9 295X2 on an XSPC Raystorm 750 EX240 setup using only one 240 EX rad with four cheap but very good Purex fans in push/pull, plus the EK copper water block and backplate. I used IC Diamond 7 Carat compound since I ran out of the paste provided by EK. Upon reapplying the paste I noticed one of the thermal pads still had the protective film on it. Luckily I had to go back under the hood (I forgot to apply paste on the bridge) or else I wouldn't have noticed it was still on there. I would check for that, in addition to trying a different thermal paste.

I'm currently getting temps between 35c and 55c under load at stock, but given your setup you should be getting even lower numbers. I would re-check contact between the block and the chips, since the rest of your setup seems just fine. I didn't use anything special outside of the block, but I made sure it was contacting properly in those critical areas.

BTW, this is in a Corsair Carbide Series Spec-02 case with fans mounted inside the front, then the rad, then two more fans for push/pull. I'm not using the CPU water block, but I have a top intake fan that conveniently hits the backplate and a back fan for exhaust. Specs are an AMD Phenom X4 940, 8 gigs of RAM, and a Corsair CX750M power supply. The card will be transferred to my higher-end rig later, in another location: that setup has a 3770K, 16 gigs, a 240 and a 360 rad, and a 1250W OCZ, so I'm hoping for even better numbers in that rig. This is just my test rig before transplanting the card. Please keep us posted on your results.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I have to say, the wires on the LEPA G 1600W are some of the worst PSU wires I've ever dealt with. I know that thicker wire is a good thing, but these things are borderline un-arrangeable in many cases... Right now I just have all the wires routed up to the cards in a straight line (all together). It looks pretty crappy, but at least the PSU itself seems perfectly capable of handling this system, both in terms of the rail arrangement I picked and in actually powering it. Just ran a full run of 3DMark FireStrike Extreme and got up to 1360W at the wall with the GPUs all at stock, so not too bad. Still got PSU headroom if I ever wanna do some OC runs again.
> 
> I really don't like this power supply though. Honestly debating sending it back. There's the wire issue, on top of the fact that it doesn't have an off switch (which I noticed no one really mentioned in any of the reviews, but I should've just looked at the darn pictures TBH), so the only way to "power off" the computer completely is to literally unplug it, which I HATE. Then there's how the 8-pin PCI-E wires on it work, and how silly the CPU wires are if you need both the 8 and 4 pin, which I do... I didn't even try to plug in the EZ Plug 4-pin FDD plug. I ran the wire, saw how much of a pain in my arse it was gonna be to run it correctly, and pulled it back out of the system immediately. I honestly don't think I need it for regular operation unless the cards are being run at full-throttle high draw, which they're not, and won't be until I get 4K, by which point I'll probably be on a 295x2 + 2 290Xs instead, which will lessen the strain on the motherboard/PCI-E slots slightly, since one would no longer be occupied.
> 
> Kinda wishing I'd just stuck with the dual PSU setup, but I THOUGHT the LEPA G would be better quality than this. Don't get me wrong, thus far the PSU itself has good performance... but the cord and other non-power-related issues it has are just stupid for a ~$300 unit to have.


Send it back and get the evga Supernova 1600









I really like the PSU: it's good looking, it has a 'watercooling trick' included (where you connect the green and black cable thingy), and the cables are just like Corsair's AX1200i cables.
I have been running all benches in quadfire and I haven't heard anything from the Supernova!

One bad thing is that the Supernova is quite big, which only matters because my Cosmos 2 doesn't have any more space.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Send it back and get the evga Supernova 1600
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I really like the PSU, good looking and it has a 'watercooling trick' included ( where you connect green and black cable thingy), and the cables are just like corsair's ax1200i.
> I have been running all benches in quadfire and I haven't heard anything from the supernova!
> 
> One bad thing is that the supernova is quite big. But that's bad because my cosmos 2 doesn't have more space


Eh, it all depends. I'll try again tomorrow with a fresh head to get the wiring neater; if I can't, I'll seriously consider my options from there. I really don't want to return it simply over wire and other minor issues that have nothing to do with its actual performance... that no-power-switch thing is so stupid though. I don't know why anyone would design a modern PSU without a safety off switch on the back of it.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have no idea how to use the Heaven benchmark. I also don't like that there isn't a default setup people use. If people change the settings or use a different resolution benching then its useless to compare. The entire idea of benchmarking is keeping the test the same with only people's systems being different.


but there is a "default" setting: http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/0_20

hope to see some quad fire goodness there


----------



## pompss

No custom bios for the r9 295x2 ???


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> No custom bios for the r9 295x2 ???


I've been using the sapphire OC bios - works fine. But no custom that I know of.


----------



## fishingfanatic

I can tell you that the Devil gets pretty warm. Too much heat being dumped into the case, imho.

http://www.3dmark.com/fs/2508976

Just a quick result @ 4.3


----------



## Jpmboy

Quote:


> Originally Posted by *fishingfanatic*
> 
> I can tell you that the Devil gets pretty warm. Too much heat being dumped into the case, imho.
> 
> http://www.3dmark.com/fs/2508976
> 
> Just a quick result @ 4.3
> 
> 
> Spoiler: Warning: Spoiler!


Nice card.. and score bro!









What'd you do with your 3 780 Tis?


----------



## fishingfanatic

Sold 1, traded 2 more. I still have just the 1, but I'm loath to part with it, as I had so much fun with these cards.

I bought 2 Titans with waterblocks for 1 card plus $200, which wasn't too bad. Got a lapped 4770K with an MSI Gaming board and Corsair Platinums, as well as a 4930K, for another.

Lucked out and traded 1 of those Titans for an i7 970, an R3E, a 560 Ti and a pair of 670s with waterblocks installed, bridge as well, ready to go.

Lots of systems to build and sell/trade now...









FF


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> I've been using the sapphire OC bios - works fine. But no custom that I know of.


Sounds good to me. Could you link where I can download it?








I have the XFX R9 295X2, and I think its BIOS sucks.


----------



## electro2u

The OC bios is available earlier in the thread. CoolMike posted it in text format.


----------



## xer0h0ur

Quote:


> Originally Posted by *fishingfanatic*
> 
> I can tell you that the Devil gets pretty warm. To much heat being dumped into the case imho.
> 
> http://www.3dmark.com/fs/2508976
> 
> Just a quick result @ 4.3


I don't know if you're aware of it, but Alphacool is 3D-scanning custom PCB designs for new waterblocks, and the first person to offer up their video card gets a waterblock for free. If you're willing to go without your video card for an undisclosed period of time, you can get the first custom waterblock of its kind, free, for that Devil 13.

http://www.bit-tech.net/news/hardware/2014/04/18/alphacool-to-3d-scan-gpus-to-make-waterbloc/1


----------



## Jpmboy

Quote:


> Originally Posted by *electro2u*
> 
> The OC bios is available earlier in the thread. CoolMike posted it in text format.


^^ this. There are two BIOSes, a master and a slave. Convert the, eh... I'll just post it.

atiflashfiles.zip 386k .zip file

It's the socx2m.rom and socx2s.rom files in the folder.


----------



## Synthaxx

Ahh, just installed the water cooling and it seems there is a leak around my 120mm radiator. Of course that's right above my 295x2s, so I'm currently using a big fan to let it dry.

Also, my D5 pump seemed insanely loud (seriously... really, really loud...) and wasn't moving much water. It wasn't screaming though, so it must not have been sucking too much air.

I'm not really sure how there could be a leak. It seemed to be coming from the radiator itself, but it didn't leak at all when I flushed it out. Maybe a fitting wasn't tight enough?


----------



## Jpmboy

Quote:


> Originally Posted by *Synthaxx*
> 
> Ahh, Just installed the water cooling and it seems there is a leak around my 120mm radiator. Ofcourse that's just above my 295x2s so i'm currently using a big fan to let it dry.
> 
> Also my D5 pump seemed insanely loud (seriously .. really really loud ...) and wasn't moving much water. It wasn't screaming though, so it must not have been sucking to much air.
> 
> I'm not really sure how there could be a leak, it seemed it was coming from the radiator itself, but it didn't leak at all when I flushed them out. Maybe it was a fitting not tight enough?


Probably air in the pump... be sure to follow the priming instructions.


----------



## xer0h0ur

Man if I sprung a leak on a $1500 video card with close to $500 worth of additional hardware and Fujipoly pads I would be







The pads and the misc last items I needed to put on my EK WB come in today.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Man if I sprung a leak on a $1500 video card with close to $500 worth of additional hardware and Fujipoly pads I would be
> 
> 
> 
> 
> 
> 
> 
> I get in today the pads and misc last items I needed to put on my EK WB.


I know, right! If it's anything but pure distilled water, you gotta wash off any residue with fresh DW (preferably megaohm water) and then let it dry for days...


----------



## fishingfanatic

Yeah, zerOhOur, if they haven't found one yet I'll be happy to send mine!!!

Sounds like it could be fun!

Could you give me more info plz?









FF


----------



## xer0h0ur

Quote:


> Originally Posted by *fishingfanatic*
> 
> Yeah, zerOhOur, if they haven't found one yet I'll be happy to send mine!!!
> 
> Sounds like it could be fun!
> 
> Could you give me more info plz?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FF


It says to go to their website and contact them. When I went to their site and clicked the "Send it and get one cooler for free" link on the right-hand side, it loaded a page with a lot of German on it. I don't know if you can use Google to translate it, or if the site itself has a language-change feature, but you're supposed to contact them through there. Good luck; hopefully you're the first to offer up a Devil 13, since it's in fact one of the series of cards they'll accept.


----------



## fishingfanatic

Yeah thanks, I went to their FB page, then sent a msg after you gave me the link.

I'll let you know if I'm lucky enough to be the 1st Devil... hehehe









FF


----------



## xer0h0ur

Nice. IMO that's practically your only way of getting a WB on that card. Either way, even if you're not the first Devil 13 offered up, at least they'll then make a block for you. It's just obviously cooler to get it for free for lending them your card.


----------



## pompss

Tried to update, but I receive the error "no valid adapter found".









atiwinflash -f -p 0 socx2m.rom


----------



## axiumone

Did you unlock the rom?

You have to run these commands before flashing:

atiflash -unlockrom 0
atiflash -unlockrom 1

Also remember, you will need to flash this card twice, once for each gpu 0 and 1.
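
Put together, the DOS session looks something like this. It's a sketch written as a dry-run, so the commands are only echoed, not executed; drop the `run` wrapper when you're actually booted from the DOS stick with atiflash and the ROMs on it (the socx2m/socx2s file names are from the zip posted earlier in the thread):

```shell
#!/bin/sh
# Dry-run sketch of the DOS atiflash sequence for a 295X2.
# The two GPUs on the card are separate adapters (0 and 1), so each
# gets unlocked and then flashed with its own ROM (master/slave).
run() { echo "$@"; }   # echo only; remove the wrapper for the real flash

run atiflash -unlockrom 0        # unlock GPU 0's ROM
run atiflash -unlockrom 1        # unlock GPU 1's ROM
run atiflash -f -p 0 socx2m.rom  # force-program master BIOS to adapter 0
run atiflash -f -p 1 socx2s.rom  # force-program slave BIOS to adapter 1
```

Reboot afterwards, and as noted above, do this from DOS, not atiwinflash.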


----------



## Synthaxx

Quote:


> Originally Posted by *Jpmboy*
> 
> Probably air in the pump... be sure to follow the priming instructions.


There are no priming instructions. It's a D5 pump with a Bitspower top. Is it OK to have the intake on the bottom of the pump, fed directly from the reservoir?


----------



## pompss

Quote:


> Originally Posted by *axiumone*
> 
> Did you unlock the rom?
> 
> You have to run these commands before flashing:
> 
> atiflash -unlockrom 0
> atiflash -unlockrom 1
> 
> Also remember, you will need to flash this card twice, once for each gpu 0 and 1.


ATIflash is not working on my computer.

If I use atiwinflash I get "no adapter found", even with the command atiwinflash --unlockrom 0


----------



## xer0h0ur

Quote:


> Originally Posted by *pompss*
> 
> Ati flash its not working on my computer
> 
> if i use atiwinflash i get "no adapater found" even with the command atiwinflash --unlockrom 0


You have some serious bad juju working against you. Time to see a witch doctor or something


----------



## axiumone

Oh, I see what you're doing wrong: you're using atiwinflash. Use atiflash from DOS and everything will be fine. Winflash doesn't support these cards.


----------



## doctakedooty

Just ordered me one of these bad boys from Newegg. Can't wait to receive it, but I won't be able to play with it for a while. I'm using it in a mITX build inside a 250D case. I know the specs say 307mm for the length of the card, and the case can hold 290mm with the bracket in place and 305mm without it, but I will be able to fit it in there.

Going full loop, but how much rad space do you guys think the 295X2 needs? I've not been on the AMD card thing for a while, and my big desktop has 3 780 Tis in it at the moment. In the case I'm planning on doing 2 240mm rads and possibly modding the bottom to house a third 240mm rad. The CPU will be an i7 4770K on the Maximus VI Impact.


----------



## Synthaxx

Quote:


> Originally Posted by *doctakedooty*
> 
> Going to full loop but how much rad space you guys think the 295x2 needs?


I think you will have more than enough with 3 240s
I'll be running quadfire with a 360, 2 240s and a 120.

Edit: Not 3 very thin 240s, of course


----------



## xer0h0ur

Quote:


> Originally Posted by *doctakedooty*
> 
> Just ordered me on of these bad boys from newegg. Can't wait to receive it but won't be able to play with it for awhile. I am using it in a MITX build inside a 250D case. I know the specs are 307mm for the length of the card and the case can hold 290 mm with the bracket in place and 305 mm without the bracket in. I will be able to fit it in there though. Going to full loop but how much rad space you guys think the 295x2 needs? Not been on the amd card thing for awhile and my big desktop has 3 780ti at the moment in it. In the case I am planning on doing 2 240mm rads and possibly modding the bottom to house a third 240mm rad. The cpu will be a i7 4770K on the maximus vi impact.


Dual 240s should be enough to keep this card and a CPU cool. I'm going to see if I can get by with a 144mm rad plus an Alphacool NexXxoS Monsta 120, which is basically an 80mm-thick, double-stacked 120.
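
As a rough sanity check, here's the usual back-of-the-envelope: the old community rule of thumb of very roughly one 120mm of radiator per ~100W of heat at moderate fan speeds (an assumption, not a spec), against ballpark figures of ~500W for the card and ~100W for the CPU (also assumptions):

```shell
#!/bin/sh
# Back-of-the-envelope radiator sizing. All numbers are rough assumptions:
# ~500W for a stock 295X2, ~100W for the CPU, and ~100W dissipated per
# 120mm radiator section at moderate fan speeds.
gpu_watts=500
cpu_watts=100
watts_per_section=100

total=$((gpu_watts + cpu_watts))
# round up to whole 120mm sections
sections=$(( (total + watts_per_section - 1) / watts_per_section ))

echo "~${total}W of heat -> about ${sections} x 120mm of radiator"
```

By that conservative yardstick three 240s (six sections) is comfortable, and dual 240s (four sections) still works if you run the fans a bit harder, which lines up with the replies here.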


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dual 240's should be enough to keep this card and a CPU cool. I am going to see if I can get by with a 144mm rad + an Alphacool NexXxos Monsta 120 which is basically an 80mm thick double stacked 120.


Yea, in my main rig I've got 2 Alphacool 480 Monstas and 2 240s. Once I get everything in, I'm going to try to squeeze a 240 Monsta into the front or bottom. The only way I figure I can fit the 295X2 in the 250D, which I think would be a beast of a LAN PC, is to disassemble the case and then pretty much rebuild it around the card and motherboard. I plan on doing a build log on it, and I'm already working on the sheet metal for some of the case mods. Haven't seen a 250D with a 295X2 in it, so it should be interesting.


----------



## xer0h0ur

Nice. I'm not very experienced in case modding, but it's always nice to see what people come up with.


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> Try to update but i receive the error no valid adapter found
> 
> 
> 
> 
> 
> 
> 
> 
> 
> atiwinflash -f -p 0 socx2m.rom


Ugh, I thought you knew... (sorry). You need to make a bootable USB stick with atiflash on it; winflash will not work! Put the BIOSes on the key you boot from.


----------



## Jpmboy

Quote:


> Originally Posted by *axiumone*
> 
> Oh, I see what you're doing wrong. You're using atiwinflash. Use atiflash from dos and everything will be fine. Winflash doesn't support these cards.


^^ this!!

this is what you need on the bootable usb stick:

usbcontent.zip 386k .zip file


@doctakedooty - great!! post up when you get it. It's a great card with a waterblock.


----------



## xer0h0ur

To anyone who has installed the EK WB: where am I supposed to put TIM other than on the GPUs and the PLX chip? The manual says "on the phase regulator that is being covered with thermal pad" but doesn't say which one is the phase regulator. I also don't know if they mean to put TIM directly on the phase regulator and then the pad on top of that, or the pad directly on the phase regulator and then TIM on top of the pad.


----------



## Skinnered

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So basically you want 2 where it's showing you, then the other 2 in the middle top and right top, via that diagram:
> 
> 
> 
> That gives you one connector in each of the 30A rails.


Following this connection solved all my power issues. Thanks for the help!!
(I was so stupid to use the combined cables on only two rails for the two R9 295X2s total)


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> To anyone that has installed the EK WB. Where am I supposed to put TIM other than the GPUs and the PLX chip? The manual says "on the phase regulator that is being covered with thermal pad" but doesn't say which is the phase regulator. I also don't know if they mean to put TIM directly on the phase regulator then the pad on top of that or if its the pad directly on top of the phase regulator then TIM on top of the pad.


EK generally advises using a dab of TIM with their T-pads. You don't need it with Fuji pads. Just the GPUs, IMO. Does the EK block ACTUALLY make contact with the PLX chip? i.e., is there a milled and polished surface where the PLX is? Impressive if there is. No such thing on the Koolance block.

erm... the milling looks the same, except for 2 additional contact millings on the Koolance.







I think my koolance has a pad on the plx.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jpmboy*
> 
> EK generally advises using a dab of tim with their T-pads. You don't need it with Fuji pads. Just the GPUs IMO. Does the EK block ACTUALLY make contact with the PLX chip? ie, is there a milled and polished surface where the PLX is? Impressive if there is. No such thing on the koolance block.


Yeah the picture for the mounted side of the waterblock shows that the PLX chip is making contact with it.


----------



## Jpmboy

edited post above

If you are using Fuji, just the GPUs need TIM. I found the TIM on the Fuji pads "gums" things up... more than they already are with these T-pads (like putty).


----------



## xer0h0ur

Okay, cool. Thanks. I just got home from work; a bloody 11-hour work day cut me off at the knees. As excited as I am to crack open this package and install everything, I don't know if I have the energy for it. Might just wait till my next day off, Monday, which is coincidentally my birthday. Turning 30. It's all downhill from here, boys.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Okay cool. Thanks. I just got home from work. Bloody 11 hour work day cut me off at the knees. As excited as I am to crack open this package and install everything, I don't know if I have the energy for it. Might just wait till my next day off, Monday. Coincidentally my birthday. Turning 30. Its all downhill from here boys.


ABSOLUTELY best to do it with a clear (.. focused) head. Mistakes can be costly.


----------



## xer0h0ur

Well I decided to wait to install everything but I was pre-fitting and making sure I had everything I needed when I realized....guess who bought the wrong size barbs? Yup. This guy.


----------



## electro2u

Quote:


> Originally Posted by *axiumone*
> 
> Oh, I see what you're doing wrong. You're using atiwinflash. Use atiflash from dos and everything will be fine. Winflash doesn't support these cards.


It works in Windows on my 295x2. "Adapter not found" means he has a bricked BIOS and is trying to rewrite it using the 295x2's own output. To rewrite a corrupt BIOS on these, you have to use another video source, like connecting your monitor to the onboard video or another video card.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I decided to wait to install everything but I was pre-fitting and making sure I had everything I needed when I realized....guess who bought the wrong size barbs? Yup. This guy.


I feel your pain. I'm still waiting on parts I didn't know I needed. About watercooling while tired: don't. I punctured a $300 radiator because I was slightly drunk while test fitting and forgot to watch what I was doing with the fan mounting screws, which were slightly too long.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Skinnered*
> 
> Following this connection solves all my power issue's. Thanx for the help!!
> (I was so stupid to use the combined cables on only two rails for the two R295x2's total)


I'm using the same PSU now; it's even more complicated getting 4 individual cards wired, lol. Are the plugs on your PSU as stiff as mine? Very hard to plug in all the way.


----------



## Synthaxx

I finally got the loop running without leaks. Now I'm trying to get all the air out, and in a few hours I'll probably run the PC for the first time under a full water loop.
I hope I installed everything (waterblocks) properly so I don't have to drain the whole system again.

What would be a realistic temperature for a quadfire config under an open loop?


----------



## axiumone

Quote:


> Originally Posted by *electro2u*
> 
> Works in windows on my 295x2. Adaptor not found means he has a bricked bios and he's trying to write over it using the 295x2. To rewrite a corrupt bios on these you have to use another video source like connecting your monitor to the onboard video or another video card.


It could depend on the Windows version. In 8.1, winflash wouldn't even start properly for me, while in DOS I was able to flash these cards with no issues. In any case, it's always a terrible idea to flash any BIOS from Windows, because the chances of a bad flash are much greater.


----------



## Skinnered

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm using the same PSU now, it's even more complicated getting 4 individual cards wired. Lol are the plugs on your PSU as stiff as mine? Very hard to plug in all the way.


Yeah, and you get 4 loose-ended PCIe connectors now too, which I had to tape up to prevent electrical interference.
They are a bit stiff, but I managed to connect them fairly easily, although I had to be sure I had the right ones unplugged.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> ugh - I thought you knew ... (sorry) you need to make a bootable usb stick with atiflash on it. winflash will no9t work! put the bios' on the key you boot from.


Quote:


> Originally Posted by *axiumone*
> 
> Oh, I see what you're doing wrong. You're using atiwinflash. Use atiflash from dos and everything will be fine. Winflash doesn't support these cards.


Yes, I didn't know that.
I was able to flash the new BIOS, and I get better performance in Valley, around 5-6 fps more. So thanks for the help, guys.

Thing is, if I try to increase the power limit and the voltage, my card gets worse, like 15-20 fps less in Valley. With 3DMark I get really bad artifacts at the same core speed I use with Valley.
Also, if I increase the power to 30+ and the voltage to 30+, the artifacts stop when benching with 3DMark, but I get worse performance, and at the beginning of the last test my PC crashes, turning off.
This happened with my original BIOS and also with the Sapphire BIOS.
My PSU is a Seasonic X 1250, so I don't understand why increasing the power and voltage gives worse performance and 3DMark crashes.


----------



## Synthaxx

Quote:


> Originally Posted by *pompss*
> 
> Yes i didn't know that.
> 
> I was able to flash the new bios and i get better performance with valley around 5-6 fps more.So thank for the help
> Think is that if i try to increase the power limit and the voltage my vga get worse like 15-20 fps less in valley.With 3d mark i get really bad artifact with the same core speed i use with valley.
> Also if i increase the power to 30+ and voltage 30+ when benching with 3d mark the artifact stops but i get worse performance and at the beginning of the last test my pc crash turning off.
> This was happen with my original bios and also with sapphire bios.
> My psu its seasonic x 1250 so i don't understand why by increasing the power and voltage i get worse performance and 3d mark crashes .


Check whether Windows is on High Performance.
Are you really sure you have the latest motherboard BIOS? Flashing my motherboard BIOS solved my problems. I have an Asus Rampage IV Black Edition.


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> Yes i didn't know that.
> I was able to flash the new bios and i get better performance with valley around 5-6 fps more.So thank for the help guyz
> 
> Think is that if i try to increase the power limit and the voltage my vga get worse like 15-20 fps less in valley.With 3d mark i get really bad artifact with the same core speed i use with valley.
> *Also if i increase the power to 30+ and voltage 30+ when benching with 3d mark the artifact stops but i get worse performance and at the beginning of the last test my pc crash turning off.
> This was happen with my original bios and also with sapphire bios.*
> My psu its seasonic x 1250 so i don't understand why by increasing the power and voltage i get worse performance and 3d mark crashes .


Two things to check for:

1) A system shutdown (e.g. no BSOD, it just turns off) is OCP (over-current protection) tripping, either in your power supply or on your MB. That Seasonic 850W may not be able to cope with the power draw. Verify that your CPU OC is stable: with your card(s) at default clocks, run p95 with the following settings: Custom, 12288MB of RAM (out of 16GB) and 5 min per FFT. Run it for 20 min or so. If it doesn't crash and no workers stop, your MB OCP is not it. Or run 5 loops of IBT with 65% of RAM committed (this is more a test of your cooling solution than CPU stability).
2) Beg, steal, or borrow a 1000-1200W PSU and see if it still shuts off. I'm betting the 850W PSU just is not enough when overclocking the 295x2.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> two things to check for:
> 
> 1) a system shut down (eg, no BSOD, just turns off) is an OCP (over current protection) either on your power supply, or in your MB. That seasonic 850W may not be able to cope with the power draw. Verify that your CPU OC is stable. Run p95 with the following settings set your card(s) to default clocks: Custom, 12288 ram (out of 16GB) and 5 min per FFT. Run it for 20 min or so. If it doesn't crash or workers stop, your MB OCP is not it. Or.. 5 loops of IBT with 65% of ram committed. (this is more a test of your cooling solution than cpu stability)
> 2) beg seal or borrow a 1000-1200W PSU and see if it shuts off. I'm betting the 850W PSU just is not enough with overclocking the 295x2.


With the Intel CPU stress test at the standard level, my system is stable.

As I said, I have a Seasonic X 1250 watt.

I get really good results with Valley, but with 3DMark Firestrike I get a 13500 score and a lot of artifacts.


----------



## xer0h0ur

I think he was looking at your signature. Your sig rig specs are outdated.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> I think he was looking at your signature. Your sig rig specs are outdated.










Thanks. Nothing more frustrating than playing "guess my kit" when trying to help folks out.


----------



## doctakedooty

Quote:


> Originally Posted by *pompss*
> 
> Yes i didn't know that.
> I was able to flash the new bios and i get better performance with valley around 5-6 fps more.So thank for the help guyz
> 
> Think is that if i try to increase the power limit and the voltage my vga get worse like 15-20 fps less in valley.With 3d mark i get really bad artifact with the same core speed i use with valley.
> Also if i increase the power to 30+ and voltage 30+ when benching with 3d mark the artifact stops but i get worse performance and at the beginning of the last test my pc crash turning off.
> This was happen with my original bios and also with sapphire bios.
> My psu its seasonic x 1250 so i don't understand why by increasing the power and voltage i get worse performance and 3d mark crashes .


The PC crashing during the last test of 3DMark usually means the PSU has hit OCP, like Jpm said. More than likely, since it's the last test, which puts both the GPU and CPU under full load, it's probably going over the 1250W; the Seasonic will actually handle about 1300W before it trips OCP. When you increase voltage on cards, you'd be surprised how much more wattage they draw.

I would try running your board, HDD, CPU, etc. off a spare PSU if you have one, and then use your Seasonic on the GPU alone. You can also try putting the CPU back at stock settings and trying again, as it shouldn't draw that much at stock. Any OC on the CPU you need to account for, on top of the increase in voltage for the GPU and how many watts it's pulling. I have fried a few PSUs, and a 1250W Seasonic was one of them, by drawing too much on it with an OC'd 3930K and 3 780s at 1.25V each.

Anyway, sorry for the long response. My money is just on you needing a bigger PSU to handle the load you're asking for when benching. Gaming is slightly different, as you probably won't see your CPU and GPUs at max load at the same time, so you may never run into the issue there.


----------



## xer0h0ur

How many things are running off that PSU, and how far is it overclocked? I find it hard to believe that a PSU which needs to draw over 1300 watts to exceed its limits is failing you. I was under the impression a 1250W PSU should be enough to handle a 295X2 plus a 290X.


----------



## HoneyBadger84

Quote:


> Originally Posted by *xer0h0ur*
> 
> How many things are running off that PSU and how far overclocked? I find it hard to believe that a PSU that needs to draw over 1300 Watts to exceed its limits is failing you. I was under the impression a 1250W PSU should be enough to handle a 295X2 plus a 290X.


More like it will. QuadFire 290Xs in my testing at stock only pull 1300-1400W at the wall, and that's with a 4.6GHz 3930K feeding them. If a 1250W PSU can't handle TriFire, the PSU is either a POS, going bad, or you have the rails arranged wrong.


----------



## electro2u

Pompss, did you take your CPU's overclock off? Try running with a stock 100 BCLK if you aren't.


----------



## Synthaxx

Tested my quadfire config under XSPC waterblocks, and during 3DMark Firestrike I don't go above 50°C.
My CPU, on the other hand, goes up to 74°C under load in Prime95, overclocked to 4.7GHz.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> Tested my quadfire config under xspc waterblocks and during 3Dmark firestrike I don't go above 50°c.
> My CPU on the other hand goes under load in prime95 to 74° C. Overclocked to 4.7Ghz.


Sounds like my Vapor-X R9 280Xs, 54C max load at 1111MHz #rekt

I wonder why Sapphire hasn't put out a Vapor-X air-cooled 295x2; I think the cooler could handle that card pretty well. I mean, stock 280X cards start under 1000MHz, so the fact that it can keep the card cool at 1111MHz at 60% fan speed is freakin' nuts.

Some evil genius should saw up two Vapor-X coolers and stick them on a 295x2, see how it does. Lol

Your CPU getting that warm isn't abnormal, but you may wanna check the seating of the CPU block just in case.

I broke a retention clip on the stock heatsink for the i5 2320 I put in the P67 resurrection build without realizing it and it was idling at 60C. I was like "I knew the stock heatsink sucked but Jesus!" Turned out it was only half making paste contact. Slapped the H100 on it and, with no fans on the radiator, it idles at 36-38C lol


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Sounds like my Vapor-X R9 280Xs, 54C max load at 1111MHz #rekt
> 
> I wonder why Sapphire hasn't put out a Vapor-X aircooled 295x2, I think it could handle that card pretty well. I mean stock 280Card start under 1000MHz, the fact that it can keep the card cool at 1111MHz at 60% fan speed is freakin' nuts.
> 
> Some evil genius should saw up two Vapor-X coolers and stick them on a 295x2, see how it does. Lol
> 
> Your CPU getting that warm isn't that abnormal, but you may wanna check the seating on the CPU block just in case.
> 
> I broke a retention clip on the stock heatsink for the i5 2320 I put in the P67 resurrection build without realizing it and it was idling at 60C. I was like "I knew the stock heatsink sucked but Jesus!" Turned out it was only half making paste contact. Slapped the H100 on it and, with no fans on the radiator, it idles at 36-38C lol


I did a +40mV on my cards with a 4.6GHz processor (it seems it can't handle 4.7GHz under Prime anymore :S ). With all cards at 1100/1690, the temps got up to 62°. I do have to admit that the 240 and the 120 rads have no fans installed yet...


----------



## doctakedooty

Quote:


> Originally Posted by *HoneyBadger84*
> 
> More like will. QuadFire 290Xs in my testing at stock only pull 1300-1400W at the wall and that's with a 4.6GHz 3930K feeding them. If a 1250W PSU can't handle TriFire the PSU is either a POS, going bad, or you have the rails arranged wrong.


The Seasonic X-1250, which is what he said he has, is a single-rail PSU. I'm just going off reviews since I don't have either card, so I can't use a Kill-A-Watt to test the draw. According to tomshardware.com and guru3d.com, both on the stock BIOS, the 290X with maxed voltage can pull up to around 335W, and the 295x2 can pull 507W at full load with no voltage increase. A custom BIOS with increased voltages and power targets can push those numbers higher. Also, depending on how old a PSU is and how hard it has been pushed in its life, it loses wattage over time. A Kill-A-Watt on his rig while he is benching should give a good idea of how much his PSU is pulling from the wall.

He wasn't running stock; he was using a custom BIOS with a lot of extra voltage. That's where things get fishy: when you increase voltages, the wattage starts to climb substantially on any card. Also, 4.6GHz states a clock, not the voltage used on the 3930K. When I fried my 1250W PSU, I had my 780s running at 1.3V in tri-SLI and the 3930K at 5GHz at roughly 1.44V, and it popped during the last test of Firestrike, the combined test, which pushes the GPU and CPU to max.
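Numbers like these can be sanity-checked with some back-of-the-envelope math. A minimal sketch, using the per-card figures from the reviews quoted above; the CPU and "other" budgets are illustrative guesses, not measurements:

```python
# Back-of-the-envelope PSU load estimate. The GPU wattages come from
# the review figures quoted above; CPU and "other" budgets are guesses.

def estimated_draw(gpu_watts, gpu_count, cpu_watts, other_watts=100):
    """Sum component draw; other_watts covers board, drives, fans."""
    return gpu_watts * gpu_count + cpu_watts + other_watts

def psu_headroom(draw_watts, psu_rating):
    """Remaining capacity in watts; negative means over the rating."""
    return psu_rating - draw_watts

# TriFire of overvolted 290Xs (~335 W each) plus a heavily
# overclocked 3930K (assume ~250 W):
draw = estimated_draw(gpu_watts=335, gpu_count=3, cpu_watts=250)
print(draw)                      # 1355
print(psu_headroom(draw, 1250))  # -105: past a 1250 W unit's rating
```

Even as a rough estimate, it shows how an overvolted TriFire setup can pass a 1250W label before aging or rail distribution even enter the picture.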


----------



## HoneyBadger84

Quote:


> Originally Posted by *Synthaxx*
> 
> I did a +40mV on my cards with a 4.6GHz processor (it seems it can't handle 4.7GHz under Prime anymore :S ). With all cards at 1100/1690, the temps got up to 62°. I do have to admit that the 240 and the 120 rads have no fans installed yet...


Fans will definitely help a lot, you're talkin' about a lot of heat them radiators are dissipating between the GPUs & that CPU. 62C is high for liquid cooling on GPUs, if I understand correctly, but not ridiculous, especially since you still have fans to install. I seem to remember someone saying anything above mid 50s load is a bit high under liquid, generally speaking, but, that may be different from the R9 290X/295x2 cores.


----------



## Synthaxx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Fans will definitely help a lot, you're talkin' about a lot of heat them radiators are dissipating between the GPUs & that CPU. 62C is high for liquid cooling on GPUs, if I understand correctly, but not ridiculous, especially since you still have fans to install. I seem to remember someone saying anything above mid 50s load is a bit high under liquid, generally speaking, but, that may be different from the R9 290X/295x2 cores.


Yeah, 62° is indeed quite high, but of course with the stock water cooler the card can go up into the 70s. Another factor is that my fans didn't want to spin up properly because they're connected to the CPU 4-pin PWM header, and it was during a graphics card stress test. Maybe I should install SpeedFan or some other program which can adapt my fans not only to my CPU but also to my graphics cards.


----------



## ImperialOne

Would Speedfan be better than letting a good bios like Asus' UEFI run the fans w/ FanXpert 2?


----------



## doctakedooty

Quote:


> Originally Posted by *Synthaxx*
> 
> Yeah, 62° is indeed quite high, but of course with the stock water cooler the card can go up into the 70s. Another factor is that my fans didn't want to spin up properly because they're connected to the CPU 4-pin PWM header, and it was during a graphics card stress test. Maybe I should install SpeedFan or some other program which can adapt my fans not only to my CPU but also to my graphics cards.


Another thing you can do (I did it for about a month before I got my Lamptron CW611 controller) is pick up a water temp sensor and plug it into a temp sensor header, if your board has one, then set your fan profile based on the water temps. It will also let you see how warm your water is getting, which I find very useful for controlling the temps of my components.

I use a temp sensor coming out of each of my radiators in my 900D, and my Lamptron fan controller adjusts each radiator's fans according to their temps to stay around my desired temps.
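The water-temp-to-fan-speed mapping a controller like that applies can be sketched as a simple linear curve. The temperature breakpoints and duty limits below are made-up example values, not the CW611's actual behavior:

```python
# Sketch of a water-temp-driven fan curve: linearly interpolate fan
# duty between a quiet floor and full speed. Breakpoints are
# illustrative only, not taken from any real controller.

def fan_duty(water_temp_c, idle_temp=28.0, max_temp=38.0,
             min_duty=30, max_duty=100):
    """Return fan duty cycle (%) for a given coolant temperature."""
    if water_temp_c <= idle_temp:
        return min_duty
    if water_temp_c >= max_temp:
        return max_duty
    frac = (water_temp_c - idle_temp) / (max_temp - idle_temp)
    return round(min_duty + frac * (max_duty - min_duty))

print(fan_duty(25.0))  # 30  (coolant cool, fans at the quiet floor)
print(fan_duty(33.0))  # 65  (halfway between the breakpoints)
print(fan_duty(40.0))  # 100 (coolant hot, fans flat out)
```

Keying the curve to water temperature rather than CPU temperature is the whole point: the coolant reacts slowly, so the fans don't surge with every CPU spike.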
Quote:


> Originally Posted by *ImperialOne*
> 
> Would Speedfan be better than letting a good bios like Asus' UEFI run the fans w/ FanXpert 2?


The fan control in ASUS's AI Suite is pretty good, so if you use the software for more than just fan profiles, e.g. the voltage adjustments, I would keep it, since it essentially does the same thing. But if you don't use anything in AI Suite besides fan profiles, I would remove the suite and just run Fan Xpert on its own, since that uses fewer resources and less HDD space.


----------



## Synthaxx

Quote:


> Originally Posted by *doctakedooty*
> 
> Another thing you can do (I did it for about a month before I got my Lamptron CW611 controller) is pick up a water temp sensor and plug it into a temp sensor header, if your board has one, then set your fan profile based on the water temps. It will also let you see how warm your water is getting, which I find very useful for controlling the temps of my components.
> 
> I use a temp sensor coming out of each of my radiators in my 900D, and my Lamptron fan controller adjusts each radiator's fans according to their temps to stay around my desired temps.


That is an option, but can't I do it software-only?

I use Fan Xpert 2 and the only thing I can control against is CPU temp. Of course, benching a quadfire config three fans short (they didn't have them in stock), with fans not spinning up, is quite dangerous for overheating.

I read SpeedFan is one of the best software fan control tools around. Does anyone use it?


----------



## doctakedooty

Quote:


> Originally Posted by *Synthaxx*
> 
> That is an option, but can't I do it software-only?
> 
> I use Fan Xpert 2 and the only thing I can control against is CPU temp. Of course, benching a quadfire config three fans short (they didn't have them in stock), with fans not spinning up, is quite dangerous for overheating.
> 
> I read SpeedFan is one of the best software fan control tools around. Does anyone use it?


I use SpeedFan and like it for fan control. If you have an ASUS board (sorry, I can't see rig specs since I am on my phone using the mobile site) you should have 2-pin temp sensor headers on the board, which all the ROG boards I know of have. Then use SpeedFan to adjust the rad fans based on that temp. Roughly, all my rads have a temp sensor where the water exits: my 480 Monsta usually shows about a 1.2C drop from inlet to outlet, my 240 Alphacool slim about a 0.4C drop, the 240 EK about 0.6C, and the other 480 about 0.8C. All those are with Corsair SP120 fans on them at about 50%. During gaming, with tri-SLI 780 Tis at a max voltage of 1.21V and my 4930K at 4.7 @ 1.38V, my water temp is about 27C max with ambient at 21-23C.
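Those inlet/outlet drops translate directly into watts shed per radiator via Q = ṁ·c·ΔT. A quick sketch, assuming a typical loop flow rate of 1 GPM (the flow rate is a guessed round number, not measured on this rig):

```python
# Convert a radiator's coolant delta-T into heat dissipated (watts):
# Q = mass_flow * specific_heat * delta_T. The 1.0 GPM flow rate is
# an assumed typical loop value, not a measurement.

LITRES_PER_GALLON = 3.785      # US gallons -> litres
WATER_GRAMS_PER_LITRE = 1000.0
WATER_SPECIFIC_HEAT = 4.186    # joules per gram-kelvin

def radiator_watts(delta_t_c, flow_gpm=1.0):
    """Heat shed by one radiator given its inlet/outlet temp drop."""
    grams_per_sec = flow_gpm * LITRES_PER_GALLON * WATER_GRAMS_PER_LITRE / 60.0
    return grams_per_sec * WATER_SPECIFIC_HEAT * delta_t_c

# The per-rad drops quoted above (480 Monsta, 240 slim, 240 EK, 480):
for drop in (1.2, 0.4, 0.6, 0.8):
    print(f"{drop:.1f} C drop -> ~{radiator_watts(drop):.0f} W")
```

At 1 GPM those four drops work out to roughly 790 W total, which is in the right ballpark for tri-SLI 780 Tis plus an overclocked 4930K under load.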


----------



## remnant

wait there are enough people who actually own these for there to be a club?!?!


----------



## HoneyBadger84

Quote:


> Originally Posted by *remnant*
> 
> wait there are enough people who actually own these for there to be a club?!?!


Of course.


----------



## doctakedooty

Quote:


> Originally Posted by *remnant*
> 
> wait there are enough people who actually own these for there to be a club?!?!


I know this will sound fanboy, but I love my 780 Ti and would love SLI Tis in the mITX build I am doing; the Titan Z is a joke at $3000, though. I don't hate the red team either, and honestly, for the performance per dollar of a single card with 2 GPU cores, this is a no-brainer. Even an X79 chipset would be nice, to run 2 of these and take full advantage of the x16 PCIe slots.


----------



## Earth Dog

Anything can be a club here... this place is so big... and the information so hard to find... 1900+ posts for a $1500 card, with zero useful information in the first post... 'dem britches... 'dey big.

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Fans will definitely help a lot, you're talkin' about a lot of heat them radiators are dissipating between the GPUs & that CPU. 62C is high for liquid cooling on GPUs, if I understand correctly, but not ridiculous, especially since you still have fans to install. I seem to remember someone saying anything above mid 50s load is a bit high under liquid, generally speaking, but, that may be different from the R9 290X/295x2 cores.


I have it on a MCR320 and an old PA120.2 and with overclocking (CPU at 4.8GHz) the GPU at 1110/1600 +30mv, I was seeing temps hitting 63C after the loop saturates (about 30 minutes in). This is using 3 yate loon's at 1K RPM (other rad fans were not turned on).

I'm guessing his loop wasn't close to being saturated. That is over 600W of heat to dissipate...with no fans...

That rule cannot apply to this card...


----------



## remnant

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Of course.


Wish I was able to be a part of such a club. :'(

Quote:


> Originally Posted by *doctakedooty*
> 
> I know this will sound fanboy, but I love my 780 Ti and would love SLI Tis in the mITX build I am doing; the Titan Z is a joke at $3000, though. I don't hate the red team either, and honestly, for the performance per dollar of a single card with 2 GPU cores, this is a no-brainer. Even an X79 chipset would be nice, to run 2 of these and take full advantage of the x16 PCIe slots.


I get that.


----------



## fishingfanatic

Well, xer0h0ur, it looks like if I can wait for a bit I could be getting a block.

Hi Dave,
thank you for your email.
At the moment we are in the holiday season.
If there is still interest in a month, send me an email.
I have a question. Which country are you from?
Best Regards, Janusch Kawetzki

From Alphacool.

FF


----------



## Synthaxx

Quote:


> Originally Posted by *doctakedooty*
> 
> I know this will sound fanboy, but I love my 780 Ti and would love SLI Tis in the mITX build I am doing; the Titan Z is a joke at $3000, though. I don't hate the red team either, and honestly, for the performance per dollar of a single card with 2 GPU cores, this is a no-brainer. Even an X79 chipset would be nice, to run 2 of these and take full advantage of the x16 PCIe slots.


I first had a 780 Ti Classified. Then I sold it and got an extremely good deal on a Titan Z, so I bought that. A week later I sold the Titan Z at a $500 profit and bought my quadfire 295x2s.
I have no hate for either graphics card maker, but the Titan Z's price is hard *NOT* to hate

Quote:


> Originally Posted by *Earth Dog*
> 
> Anything can be a club here... this place is so big... and the information so hard to find... 1900+ posts for a $1500 card, with zero useful information in the first post... 'dem britches... 'dey big.
> 
> 
> 
> 
> 
> 
> 
> 
> I have it on a MCR320 and an old PA120.2 and with overclocking (CPU at 4.8GHz) the GPU at 1110/1600 +30mv, I was seeing temps hitting 63C after the loop saturates (about 30 minutes in). This is using 3 yate loon's at 1K RPM (other rad fans were not turned on).
> 
> I'm guessing his loop wasn't close to being saturated. That is over 600W of heat to dissipate...with no fans...
> 
> That rule cannot apply to this card...


You have 1 or 2 of these cards? If you have 2, cooling it with that rad space is quite an achievement...


----------



## Earth Dog

One card...


----------



## xer0h0ur

Quote:


> Originally Posted by *fishingfanatic*
> 
> Well xer0h0ur it looks like if I can wait for a bit I could be getting a block.
> 
> Hi Dave,
> thank you for your email.
> At the moment we are in the holiday season.
> If there is still interest in a month, send me an email.
> I have a question. Which country are you from?
> Best Regards, Janusch Kawetzki
> 
> From Alphacool.
> 
> FF


Sweet. I figured you had pretty good odds of being the first Devil 13 since there are only 250 of them including all the cards they gave out to reviewers.


----------



## Jpmboy

Quote:


> Originally Posted by *remnant*
> 
> wait there are enough people who actually own these for there to be a club?!?!


yep, just think of all the $$ !!
Quote:


> Originally Posted by *doctakedooty*
> 
> I know this will sound fanboy, but I love my 780 Ti and would love SLI Tis in the mITX build I am doing; the Titan Z is a joke at $3000, though. I don't hate the red team either, and honestly, for the performance per dollar of a single card with 2 GPU cores, this is a no-brainer. Even an X79 chipset would be nice, to run 2 of these and take full advantage of the x16 PCIe slots.


Hey Doc - I'm running green and red. I really like the 295x2 - it does everything very well... well, except benchmarking, since we only have OEM BIOSes at this point.

Anyone know of an AMD equivalent to Kepler BIOS Tweaker?


----------



## papi4baby

Hey guys, if you would like AMD to make the crossfire on/off switch in CCC available for this card, please post in this thread and ask for it.

Thanks.

http://www.overclock.net/t/1501216/amd-catalyst-14-7-rc-driver-for-windows-out-now/130#post_22696236


----------



## axiumone

Quote:


> Originally Posted by *papi4baby*
> 
> Hey guys, if you would like AMD to make the crossfire on/off switch in CCC available for this card, please post in this thread and ask for it.
> 
> Thanks.
> 
> http://www.overclock.net/t/1501216/amd-catalyst-14-7-rc-driver-for-windows-out-now/130#post_22696236


You can already disable crossfire with the 295x2. In CCC, create a profile for the game you'd like to run on a single GPU and set its crossfire profile to "DISABLED". That will run the game on a single GPU.


----------



## papi4baby

I want to turn it completely off via CCC, not a game profile.

It still does not run right that way, at least not for me.


----------



## axiumone

Quote:


> Originally Posted by *papi4baby*
> 
> I want to turn it completely off via CCC not game profile.
> 
> It still does not run right that way. At least not for me.


Not sure what you're trying to run, but this method disables EVERY GPU except the primary for whatever application. Why would they implement another way to achieve the exact same result?


----------



## fishingfanatic

Yeah that's what I got my hands on, so it wasn't full price, but a sweet card when running well.

I imagine it'll surprise me under watercooling conditions, though the unit I have now won't be enough, so it's time to build a real watercooling system. Maybe go to the wreckers and get a rad with fans...

A chiller... I can see that the watercooling adjustments will likely never end, hehehe. Perhaps an aquarium with ice water to run the lines through... hhmmmm!!!

Oh darn it, there goes the budget...

FF


----------



## Synthaxx

Just got a question about the crossfire disable thing.

So if I run quadfire and some games crash all the time with this setup, and I only want the first 295x2 to work, what do I do then?
Just disable crossfire? Or will that only use 1 GPU of the first 295x2?


----------



## axiumone

Quote:


> Originally Posted by *Synthaxx*
> 
> Just got a question about the crossfire disable thing.
> 
> So if I run quadfire and some games crash all the time with this setup, and I only want the first 295x2 to work, what do I do then?
> Just disable crossfire? Or will that only use 1 GPU of the first 295x2?


If you disable crossfire through CCC, it will run games on ONE card; disabling crossfire with a game profile will run that game on ONE GPU.


----------



## Synthaxx

Quote:


> Originally Posted by *axiumone*
> 
> If you disable crossfire through CCC, it will run games on ONE card; disabling crossfire with a game profile will run that game on ONE GPU.


Ahh, that explains a lot. Thanks!


----------



## fishingfanatic

You don't need both for gaming, way overkill, but I like overkill. Pull 1 card imho.

FF


----------



## xer0h0ur

IT'S ALIVE! IT'S ALIIIIIIIIIIIIIIIIIIIIIIIIIIIVE! Okay, it's only partially alive. I mounted the block and backplate yesterday, but I am still waiting on the right size barbs and the extra quick disconnects.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> IT'S ALIVE! IT'S ALIIIIIIIIIIIIIIIIIIIIIIIIIIIVE! Okay, it's only partially alive. I mounted the block and backplate yesterday, but I am still waiting on the right size barbs and the extra quick disconnects.


Nice! Too bad you bought some wrong-sized barbs.
Also don't forget to keep all the stock thermal pads (in good condition).


----------



## HoneyBadger84

The way I'm figuring the math, I should be able to get one of these sometime before my birthday (in November) easily enough. Really looking forward to it. Resold 3 of my Core Editions cuz they irritated me too much yesterday. lol


----------



## remnant

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The way I'm figuring the math, I should be able to get one of these sometime before my birthday (in November) easily enough. Really looking forward to it. Resold 3 of my Core Editions cuz they irritated me too much yesterday. lol


The way I figure the math, I should be able to get one of these sometime before my grave.


----------



## xer0h0ur

Quote:


> Originally Posted by *Synthaxx*
> 
> Nice! Too bad you bought some wrong-sized barbs.
> Also don't forget to keep all the stock thermal pads (in good condition).


Well, that ship has sailed lol. In the past it was a good idea to run the card and warm it up before removing the stock cooler, and I did that, so the pads were a bit gummy and tore up when taking off the Asetek cooler. In other words, if anyone is going to remove the Asetek cooler to mount a waterblock, do it while the card is cold so the pads come off clean.


----------



## papi4baby

Quote:


> Originally Posted by *axiumone*
> 
> Not sure what you're trying to run, but this method disables EVERY gpu except for the primary to run whatever application. Why would they implement another way to achieve the exact same result?


Sorry for the late reply.

World of Warplanes is another title that I would love to play but cannot, due to xfire making it look like crap. I tried your profile idea (it worked on Ghosts, thanks by the way), but it would not turn off xfire that way.

I would just like to have an on/off switch in CCC, like if I was running a two-card xfire setup.

Thanks for anyone's support on this matter.


----------



## axiumone

Quote:


> Originally Posted by *papi4baby*
> 
> Sorry for the late reply.
> 
> World of Warplanes is another title that I would love to play but cannot, due to xfire making it look like crap. I tried your profile idea (it worked on Ghosts, thanks by the way), but it would not turn off xfire that way.
> 
> I would just like to have an on/off switch in CCC, like if I was running a two-card xfire setup.
> 
> Thanks for anyone's support on this matter.


I'll check out World of Warplanes. In the meantime, make sure you're pointing the game profile at the actual game .exe and not just the game launcher. I'm almost willing to bet that's the issue.


----------



## Jpmboy

Quote:


> Originally Posted by *papi4baby*
> 
> Sorry for the late reply.
> 
> World of Warplanes is also another tittle that i would love to play but cannot due to xfire making it look like crap. And i tried your profile idea( worked on Ghost, thanks by the way) and i would not turn off xfire that way.
> 
> I would just like to have an on/off switch on CCC like if i was running two card xfire setup.
> 
> Thanks for anyone's support on this matter.


have you tried changing the Frame Pacing setting?


----------



## xer0h0ur

Question for those who waterblocked their 295X2: did anyone have no-video-output issues on initial startup? I need to remove the block and check that everything is making contact correctly, but there is no video output on startup and the system won't give the single POST beep. In other words, there are no audible codes from the motherboard. I hope I didn't brick the damn thing. I dropped the 270 back in to see if the motherboard was damaged or anything else, and it took a few restarts for the system to get past POST and load Windows.

Am I missing something here? Did Diamond put some sort of fail-safe in their BIOS to detect a missing VRM fan or something? I think I am going to re-install the original Asetek cooler and try again.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, that ship has sailed lol. In the past it was a good idea to run the card and warm it up before removing the stock cooler, and I did that, so the pads were a bit gummy and tore up when taking off the Asetek cooler. In other words, if anyone is going to remove the Asetek cooler to mount a waterblock, do it while the card is cold so the pads come off clean.


Quote:


> Originally Posted by *xer0h0ur*
> 
> Question for those who waterblocked their 295X2: did anyone have no-video-output issues on initial startup? I need to remove the block and check that everything is making contact correctly, but there is no video output on startup and the system won't give the single POST beep. In other words, there are no audible codes from the motherboard. I hope I didn't brick the damn thing. I dropped the 270 back in to see if the motherboard was damaged or anything else, and it took a few restarts for the system to get past POST and load Windows.
> 
> Am I missing something here? Did Diamond put some sort of fail-safe in their BIOS to detect a missing VRM fan or something? I think I am going to re-install the original Asetek cooler and try again.


No, I had no issues. I did have a slightly longer black screen, but once I installed CCC 14.7 that was gone too.


----------



## Earth Dog

No issues here either... thankfully.


----------



## xer0h0ur

Well ****. Not looking good for me then.


----------



## crazygamer123

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well ****. Not looking good for me then.


Hopefully you find a solution because Diamond's warranty is voided once you install the block.


----------



## xer0h0ur

Yup, I am aware.


----------



## CrisInuyasha

I just received the uefi capable bios for the Sapphire OC version of the card. If anyone is interested:

https://dl.dropboxusercontent.com/u/20405147/sapphire_r9_295x2_oc_uefi.zip


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Question for those who waterblocked their 295X2: did anyone have no-video-output issues on initial startup? I need to remove the block and check that everything is making contact correctly, but there is no video output on startup and the system won't give the single POST beep. In other words, there are no audible codes from the motherboard. I hope I didn't brick the damn thing. I dropped the 270 back in to see if the motherboard was damaged or anything else, and it took a few restarts for the system to get past POST and load Windows.
> 
> Am I missing something here? Did Diamond put some sort of fail-safe in their BIOS to detect a missing VRM fan or something? I think I am going to re-install the original Asetek cooler and try again.


This is gonna sound stupid, but... make sure the BIOS switch is not in between the two positions. Happens sometimes when handling the card.

Quote:


> Originally Posted by *crazygamer123*
> 
> Hopefully you find a solution because Diamond's warranty is voided once you install the block.


unless they put stickers over the mount screws... undetectable.


----------



## xer0h0ur

No, they don't have any stickers anywhere that get broken or removed when taking off the Asetek cooler and backplate. My problem is that I took the pads off the original cooler and the backplate, since they got all torn up on removal. In other words, I would need to get the same type of pads back on it or they would notice.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> No they don't have any stickers anywhere that are broken or removed when taking off the Asetek cooler and backplate. My problem lies in that I took the pads off the original cooler and the backplate since they got all torn up on removal. In other words I would need to get the same type of pads back on it or else they would notice.


I'd bet someone in this forum knows which pads they are/were... open a thread asking. Is the BIOS switch fully set on a position? (Happens to 780 Classifieds on occasion.)
But unless you did something REALLY wrong when mounting the block, it's gotta work.


----------



## xer0h0ur

My card is still within the 30 days with Newegg, wouldn't it just be better to send it to them instead of sending it to Diamond?


----------



## Jazam

@xer0h0ur

Going through a similar issue, man. When my R9 is plugged in, my PC shuts down within 5 minutes; it runs fine on integrated graphics. I have the XSPC waterblock on it, and I'm a freaking dumb **** because the fan cable from the reference cooler got cut. So I'll have to find an exact cord to match, resolder it, and hook it back up, then hope they honor the warranty, unless I can find out I did something wrong on install. /:


----------



## xer0h0ur

I have been at work all damn day and the what ifs are just torturing me. I won't even have time to check the card tonight until way later if at all. Probably won't know for sure until I crack it open again tomorrow since its my day off. I felt a giant pit in my stomach last night though when the system wouldn't POST.


----------



## Jazam

I know this will be too much of a hassle for some if your card is mounted, but could someone measure the length of the 3-pin cable coming from the fan into the extension? Since my fan no longer has a cable, I'll have to cut and solder a new one to match the original.


----------



## crazygamer123

Quote:


> Originally Posted by *Jazam*
> 
> I know this will be too much of a hassle for some if your card is mounted, but could someone measure the length of the 3-pin cable coming from the fan into the extension? Since my fan no longer has a cable, I'll have to cut and solder a new one to match the original.


It's about 15 cm.


----------



## Traviisty

Not sure if this is the right place to ask, but I just got my new computer today with the R9 295x2, i7 4790K @ 4.0, MSI Z97 Gaming 7 MB, and a Corsair 850W PSU. I'm having an issue where, as soon as I install the R9's drivers and reboot, I get a black screen right after the Windows logo... when I disable or uninstall the drivers it works fine again. I've tried clean installs of the OS, clean installs of the drivers, switching from the mini DP to the DVI, using older drivers... lots of things.

Is this a common problem? Is it the PSU? Nothing is overclocked at the moment. It's frustrating because I have this awesome (expensive) new piece and I can't even get it to work...

Thanks for any help you can give me...


----------



## crazygamer123

Quote:


> Originally Posted by *Traviisty*
> 
> Not sure if this is the right place to ask, but I just got my new computer today with the R9 295x2, i7 4790K @ 4.0, MSI Z97 Gaming 7 MB, and a Corsair 850W PSU. I'm having an issue where, as soon as I install the R9's drivers and reboot, I get a black screen right after the Windows logo... when I disable or uninstall the drivers it works fine again. I've tried clean installs of the OS, clean installs of the drivers, switching from the mini DP to the DVI, using older drivers... lots of things.
> 
> Is this a common problem? Is it the PSU? Nothing is overclocked at the moment. It's frustrating because I have this awesome (expensive) new piece and I can't even get it to work...
> 
> Thanks for any help you can give me...


What is your driver version? 14.7 might fix your problem. And no, your PSU is fine with the R9 as long as you don't OC it too far.


----------



## xer0h0ur

It's not bricked! I just re-installed the original Asetek closed-loop cooler, put the card back in, and shazam, it worked just fine on the first boot. This has me supremely confused. Why in the hell does my card not boot with the EK waterblock? The only explanation I can arrive at is that Diamond's BIOS has a fail-safe that doesn't allow the card to power on if it doesn't detect those three plugs connected from the Asetek cooler.


----------



## xer0h0ur

Quote:


> Originally Posted by *Traviisty*
> 
> Not sure if this is the right place to ask, but I just got my new computer today with the R9 295x2, i7 4790K @ 4.0, MSI Z97 Gaming 7 MB, and a Corsair 850W PSU. I'm having an issue where, as soon as I install the R9's drivers and reboot, I get a black screen right after the Windows logo... when I disable or uninstall the drivers it works fine again. I've tried clean installs of the OS, clean installs of the drivers, switching from the mini DP to the DVI, using older drivers... lots of things.
> 
> Is this a common problem? Is it the PSU? Nothing is overclocked at the moment. It's frustrating because I have this awesome (expensive) new piece and I can't even get it to work...
> 
> Thanks for any help you can give me...


Well I ditched an 850W PSU because it was multi rail and couldn't dish out the amps needed per 8-pin. What model PSU are you using?

Edit: Brainfart, saw it in your sig


----------



## Pilotseye

Hello, I am having a problem with the pump on my Sapphire 295X2: it won't turn on together with my PC and the card. It worked at first boot, but after some time it stopped while the PC was running, which forced the fan on the card to spin up and the system to crash with a black screen. Even after several boots, the pump (and fan) won't turn on again.

Has anybody experienced this issue already, or knows a fix?

My specs: ASRock 970 Extreme4 with 4GB RAM and a be quiet! Power Zone 1000W PSU

Thanks


----------



## HoZy

Quote:


> Originally Posted by *Pilotseye*
> 
> Hello, I am having a problem with the pump on my Sapphire 295X2: it won't turn on together with my PC and the card. It worked at first boot, but after some time it stopped while the PC was running, which forced the fan on the card to spin up and the system to crash with a black screen. Even after several boots, the pump (and fan) won't turn on again.
> 
> Has anybody experienced this issue already, or knows a fix?
> 
> My specs: ASRock 970 Extreme4 with 4GB RAM and a be quiet! Power Zone 1000W PSU
> 
> Thanks


Warranty it.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Pilotseye*
> 
> Hello, I am having a problem with the pump on my Sapphire 295X2: it won't turn on together with my PC and the card. It worked at first boot, but after some time it stopped while the PC was running, which forced the fan on the card to spin up and the system to crash with a black screen. Even after several boots, the pump (and fan) won't turn on again.
> 
> Has anybody experienced this issue already, or knows a fix?
> 
> My specs: ASRock 970 Extreme4 with 4GB RAM and a be quiet! Power Zone 1000W PSU
> 
> Thanks


Definitely RMA the card with the place of purchase, or if it's too late for that, the manufacturer.


----------



## fishingfanatic

Yeah, Jpmboy, been there, done that... took a few minutes to realize it. It actually sounds similar to what I had, but I don't know enough about this stuff to say anything for sure.

Good luck, X...

FF


----------



## Traviisty

Quote:


> Originally Posted by *crazygamer123*
> 
> What is your driver version? 14.7 might fix your problem. And "no" your PSU is fine with the R9 as long as you dont OC it too far.


I tried 14.4, 14.7, and some of the older drivers as well.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> It's not bricked! I just re-installed the original Asetek closed-loop cooler, put the card back in, and shazam, it worked just fine on the first boot. This has me supremely confused. Why in the hell does my card not boot with the EKWB? The only explanation I can arrive at is that Diamond's BIOS has a fail-safe that doesn't allow the card to power on if it doesn't detect those three plugs from the Asetek cooler.












I'd remount the EK block - you put a pad on the PLX chip, right?
Save the two current BIOSes with GPU-Z. If you really think there is a fan error-trap in the Diamond BIOS, post the Diamond master and slave BIOSes and let's have a look.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd remount the EK block - you put a pad on the PLX chip, right?
> Save the two current BIOSes with GPU-Z. If you really think there is a fan error-trap in the Diamond BIOS, post the Diamond master and slave BIOSes and let's have a look.


Man, it felt like I took 1,000 lbs off my shoulders when I realized it wasn't bricked. When I mounted the EKWB I noticed that a couple of the RAM pads weren't making good contact with the block, and one of the backplate's pads didn't make any contact at all with the VRMs. I highly doubt any of that was the reason behind the system not POSTing, though. When I originally mounted the EKWB I used PK-3 on the PLX chip, as the mounting instructions say to do, but when I put the original cooler back on I instead stuck a 1.5mm Fujipoly pad on it.

With respect to the BIOS, I am going to need a little hand-holding on this one since I have never done anything with a video card's BIOS. I saved a BIOS with GPU-Z, but it's a single file. Do I need to turn off my system and flip the BIOS switch on the card to get the other BIOS? Or is it just selecting the other GPU from the bottom drop-down box and saving that BIOS?


----------



## Jpmboy

Select the other BIOS in GPU-Z and save it the same way.
*gd, I hate typing on this S4!


----------



## electro2u

Man, I'm glad it wasn't bricked. I was thinking of how that could even happen... you would have to seriously twist the PCB or something. You would definitely notice if you made a mistake big enough to brick the card. I'd probably take the OC master/slave BIOS CoolMike posted and try that out, with your regular BIOS on the other switch. Each switch position has two BIOS files associated with it, one for each GPU: master & slave.


----------



## HoneyBadger84

Hope your card fixing works out; glad to hear it's not bricked, Xero.









I want one of these so bad. I'm having to heavily resist pulling the trigger til I get my debt paid off. I could buy it right now, but I'm about $4k in the hole because of truck repair bills and getting my dad a newer computer setup (Scarlett Destroyer Resurrection in my signature), so I can't do it right now...

I think right around my birthday in November I'll be able to get one, or who knows, maybe if I can get some people to just gift me money, I might get two


----------



## xer0h0ur

Okay so here are the BIOS files I got from GPU-Z: http://xer0hour.com/Vesuvius.zip


----------



## Jazam

I have the PowerColor R9 295X2. If GPU-Z shows ATI as the BIOS vendor, would that cause terrible performance and PC shutdowns?


----------



## electro2u

Quote:


> Originally Posted by *Jazam*
> 
> I have the PowerColor R9 295X2. If GPU-Z shows ATI as the BIOS vendor, would that cause terrible performance and PC shutdowns?


No, they all say ATI, AFAIK. All the 295X2s are reference except for the Devil 13. It sounds like a PSU issue, possibly. I was easily hitting around 800 watts with just one 295X2. Depending on the rest of your system, you may be seeing a higher power draw than your PSU is up for.
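For anyone unsure whether their PSU is up for one of these cards, a back-of-the-envelope sum of component draws is enough to check. A minimal sketch; every wattage below is an assumed placeholder for illustration, not a measured figure, so substitute your own components' numbers:

```python
# Rough PSU headroom check for an R9 295X2 build.
# All wattages are assumptions for illustration, not measurements.

def psu_headroom(component_watts, psu_watts, safety_margin=0.2):
    """Return (total_draw, spare_watts) after reserving a safety margin on the PSU."""
    total = sum(component_watts.values())
    usable = psu_watts * (1 - safety_margin)
    return total, usable - total

build = {
    "r9_295x2": 500,         # assumed board power under load
    "cpu_overclocked": 150,  # assumed
    "board_ram_drives": 75,  # assumed
    "fans_pump": 25,         # assumed
}

total, spare = psu_headroom(build, psu_watts=1000)
print(f"{total} W draw, {spare:.0f} W to spare")  # 750 W draw, 50 W to spare
```

Adding a second card or a tri-fire 290X to the dict makes it obvious when a 1000W unit stops being comfortable, which is roughly the reasoning behind buying a bigger PSU than strictly needed.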


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Okay so here are the BIOS files I got from GPU-Z: http://xer0hour.com/Vesuvius.zip










gimme some time...


----------



## xer0h0ur

Thanks bro, I appreciate the help.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Thanks bro, I appreciate the help.


It's identical to the stock PowerColor BIOS... so there's no fan-error trap specific to the Diamond card (I doubted there would be).

OEM reference BIOS (yours) vs. the Sapphire OC BIOS: lots of differences. (I'm running the Sapphire one - works great!)


----------



## xer0h0ur

So ultimately there wasn't anything in there that would have stopped the card from powering on? Should I still change my BIOS? If so, how do I do that? Remember, I have never flashed a video card's BIOS, so I have no idea what I am doing when it comes to that.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> So ultimately there wasn't anything in there that would have stopped the card from powering on? Should I still change my BIOS? If so, how do I do that? Remember, I have never flashed a video card's BIOS, so I have no idea what I am doing when it comes to that.


Get atiflash and make a bootable USB stick. Put the two Sapphire BIOS files on the stick, boot from the USB, and flash from DOS. The command sequence is straightforward (an easy Google for a step-by-step). Flash the slave first, then the master.
... or just stick with the Diamond (ATI reference) BIOS, and flash after you get some run time on the card!


----------



## electro2u

Honestly, flashing the BIOS on these is a bit of a pain and pretty much unnecessary. It must be something pretty simple that kept the card from POSTing. I liked Jpmboy's theory about the DIP switch being stuck, but there are a few other possibilities as well. I suspect if you try it again you'll have no trouble.


----------



## xer0h0ur

Well, that is the thing: the DIP switch was not stuck in the middle. I tried it flipped both ways too when the EKWB was mounted. There really shouldn't have been any reason why it didn't POST. Not going to lie, I feel anxiety just thinking about mounting the block again. I would also need to get some extra RAM pads, since these things stick far too well and some get torn up on removal.


----------



## electro2u

You did fine, obviously--the card is still kicking.

http://s647.photobucket.com/user/electrotoyou/media/DSC_0002_zpsbdd57c82.jpg.html

Here's where I'm at at the moment. I haven't had my 295X2 running in about a month. The only place I could find locally to install everything still hadn't really made any progress after three weeks, so I took it home last week and I've been working on it myself. That should be fairly clear from the amateurish vinyl work. I'm redoing the worst spots right now.

No idea if my loop will leak once I finish it, but I'm waiting on a few fittings and some wiring accessories I need. The drain port on the bottom was so tight a fit that I had to cheat a little, and I think the threading on the radiator might have been damaged. Bit worried about that. It's hidden under the front radiator.


----------



## spacefighter

Well, I've had my Diamond 295X2 for less than a week and it just died. I heard a loud popping sound and the system restarted. There was no burnt smell after checking things out. Next I booted the system; the video card had lights but was no longer outputting video. Video was now being sent over my mobo's HDMI. I swapped out my 8-pin PCIe connectors and plugged into different slots on my PSU. Same result. I also noticed there was no heat coming off the 295X2's radiator. I popped in a spare video card and the system booted with video just fine.

I'm guessing it's a blown capacitor somewhere on the card. Any thoughts?

Suppose I'll add a couple of pictures. Glad I hadn't zip tied anything yet.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> So ultimately there wasn't anything in there that would have stopped the card from powering on? Should I still change my BIOS? If so, how do I do that? Remember, I have never flashed a video card's BIOS, so I have no idea what I am doing when it comes to that.


Are you able to mod the BIOS that way? Specifically, adjust the VRM fan curve?


----------



## crazygamer123

Quote:


> Originally Posted by *spacefighter*
> 
> Well, I've had my Diamond 295X2 for less than a week and it just died. I heard a loud popping sound and the system restarted. There was no burnt smell after checking things out. Next I booted the system; the video card had lights but was no longer outputting video. Video was now being sent over my mobo's HDMI. I swapped out my 8-pin PCIe connectors and plugged into different slots on my PSU. Same result. I also noticed there was no heat coming off the 295X2's radiator. I popped in a spare video card and the system booted with video just fine.


Same problem with the first Sapphire I got. A defective card for sure! Time to RMA.


----------



## xer0h0ur

Dang, that really blows, spacefighter. At least since it's only a week old you can return it to the retailer instead of waiting weeks on an RMA to Diamond.


----------



## spacefighter

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dang, that really blows, spacefighter. At least since it's only a week old you can return it to the retailer instead of waiting weeks on an RMA to Diamond.


Yeah, I'm surprised Newegg is making me pay the shipping. In the past, they just emailed me a shipping label free of charge. I imagine I can get them to refund the cost if I call. In the meantime, I think I might order some sleeved Corsair cables. About a third of my case is being taken up by cables right now.


----------



## xer0h0ur

Before I even attempt to put the EKWB back on my card I need to shorten almost all of my tubing and delete a quick disconnect so that I can fit the drain port. My case is cramped as all hell right now.


----------



## Jeronbernal

Anyone know Diamond's warranty info? I want to waterblock the Diamond 295X2, but I'm hesitant in case removing the stock cooler voids its warranty. Any ideas?


----------



## xer0h0ur

From what I can tell, only MSI allows people to waterblock their card without voiding the warranty. I was told I had voided my warranty by putting the block on my Diamond card.


----------



## HoneyBadger84

Quote:


> Originally Posted by *electro2u*
> 
> You did fine, obviously--the card is still kicking.
> 
> http://s647.photobucket.com/user/electrotoyou/media/DSC_0002_zpsbdd57c82.jpg.html
> 
> Here's where I'm at atm. I haven't had my 295x2 running in like a month. The only place I could find local to install everything still hadn't really made any progress after 3 weeks so I took it home last week and I've been working on it myself. Should be fairly clear from the amateurish vinyl work. I'm redoing the worst spots right now.
> 
> No idea if my loop will leak once I finish it but I'm waiting on a few fittings and some wiring accessories I need. The drain port on the bottom was so tight to fit in I had to cheat a little and I think the threading on the radiator might have been damaged. Bit worried about that. It's hidden under the front radiator.


You should really run the R9 295X2 on top unless you have no plans for multiple screens. The whole point of the 295X2 is having the possibility to run up to six screens from the one card, y'know, unless you want the ability to run CrossFire off a single card. I'll probably do a setup like yours when I get one, but that's because I'll be running the 295X2 for Folding@Home more often than gaming. Probably...


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> From what I can tell only MSI allows people to WB their card without voiding the warranty. I was told I had voided my warranty when putting the block on my Diamond card.


When you get a card, check the backside for stickers over the screws... if there are none, well, then you know.


----------



## xer0h0ur

Oh I know. These aren't the droids you're looking for.


----------



## Juris

Guys, just looking for a bit of info if possible. I'm looking to join the 295X2 club in the next week or so as I'm starting my first build in 10 years, but I need a bit of info and your honest opinions.

I'm going to be running six monitors: 4x 27" 2560x1440 and 2x 24" 1920x1080 Dell U2414H panels. I know the 295X2 has only five outputs, but the Dells have DP 1.2 MST, so I was wondering whether I could daisy-chain the two Dell 24" panels from one of the DP outs on the 295, allowing me to run all six monitors.

Also, is now a bad time to buy the 295X2 with the 300 series coming out and Nvidia about to launch the 800 series? VRAM is very important to me as I'm going to be running triples on a lot of sims, including X-Plane, so the usual 3GB cards won't cut it. Cheers.


----------



## xer0h0ur

I suspect it's going to be a while before they release a 300-series dual-GPU card anyway. Ultimately I bought this card because it was the most powerful thing I could put into my system using a single PCIe slot. Once I get this damn EKWB to work right it will be a single-slot, dual-GPU beast.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Juris*
> 
> Guys, just looking for a bit of info if possible. I'm looking to join the 295X2 club in the next week or so as I'm starting my first build in 10 years, but I need a bit of info and your honest opinions.
> 
> I'm going to be running six monitors: 4x 27" 2560x1440 and 2x 24" 1920x1080 Dell U2414H panels. I know the 295X2 has only five outputs, but the Dells have DP 1.2 MST, so I was wondering whether I could daisy-chain the two Dell 24" panels from one of the DP outs on the 295, allowing me to run all six monitors.
> 
> Also, is now a bad time to buy the 295X2 with the 300 series coming out and Nvidia about to launch the 800 series? VRAM is very important to me as I'm going to be running triples on a lot of sims, including X-Plane, so the usual 3GB cards won't cut it. Cheers.


Pretty sure the 295X2 has native six-display support, with DisplayPort 1.2 MST handling the extra displays, yep.

Quite directly from a card maker's website:
Quote:


> Up to 6 displays supported with DisplayPort 1.2 Multi-Stream Transport


----------



## electro2u

Quote:


> Originally Posted by *HoneyBadger84*
> 
> You should really run the R9 295X2 on top unless you have no plans for multiple screens. The whole point of the 295X2 is having the possibility to run up to six screens from the one card, y'know, unless you want the ability to run CrossFire off a single card. I'll probably do a setup like yours when I get one, but that's because I'll be running the 295X2 for Folding@Home more often than gaming. Probably...


I guess I don't understand why the 295X2 needs to be on top even if I wanted to use multiple monitors. I would have thought you could run more monitors by adding another GPU regardless of which PCIe slot it's in. In practice, for now, I'm a single 1440p 120Hz monitor user; I wanted to be able to run 120 FPS vsynced, maxed out, and have my GPUs underutilized while doing it. Also, I just thought it was cool you could get good scaling from three GPUs and only use two PCIe slots.

Considering my system will be very underused (if it doesn't explode when I try to start it up the first time), maybe I should look into Folding@Home. I don't really play games much anymore; I'm a recovered former MMO addict. Is Folding@Home the cancer research thing? I'd definitely be interested in donating the system's computational power to that kind of cause.


----------



## HoneyBadger84

Quote:


> Originally Posted by *electro2u*
> 
> I guess I don't understand why the 295x2 needs to be on top even if I wanted to use multiple monitors. I would have thought you could run more monitors by adding another GPU regardless of which pcie slot it's in. In practice for now I'm a single 1440p 120Hz monitor user--I wanted to be able to run 120FPS vsynced maxed out and have my GPUs underutilized while doing it. Also I just thought it was cool you could get good scaling from 3 GPUs and only use 2 PCIE slots.
> 
> Considering my system will be very under used if it doesn't explode when I try to start it up the first time, maybe I should look into folding @ home. I don't really play games much anymore. I'm a recovered former MMO addict. Is [email protected] the cancer research thing? I'd definitely be interested in donating the system's computational power for that kind of cause.


Well, mostly I meant that in case you ever plan to do a large Eyefinity setup. The 295X2 can do 3x 4K if you ever go that route.

Folding@Home research includes cancer, Parkinson's, Alzheimer's, and a few others. On an aftermarket liquid-cooled 295X2 you'd likely get 400-500k PPD easily without OCing, with very little heat. That's a lot, and it's a big part of why I want one: quiet, cool, easy PPD, just like these Vapor-X 280Xs I've got. Except they take up two and a half slots each. Lol


----------



## doctakedooty

Hey, so I've been debating pulling the trigger on this card for a week now, to put into my mITX build. My question, guys: at 1440p, what is your average FPS in BF4? The videos I have seen have vsync on at 60 FPS, which is great, but I can tell the difference between 120 FPS on my 120Hz monitor and 60 FPS at 60Hz without even having the FPS counter up; the movement is just a lot more fluid. So with everything on Ultra, 2x AA, etc., what is your average FPS? I want to make sure it fits my needs before I order this beast from Newegg.


----------



## xer0h0ur

Trying to constantly get 120 FPS or more at Ultra settings in 1440p may be a bit overzealous. I don't know if you would be able to hold a non-stop 120+ FPS. A tri-fired setup would probably hold up to your standards, but two Hawaii cores, I don't know. Then again, I have yet to get my 4K monitor, so I can't say for certain; I am just basing it off benchmarking at 1440p.


----------



## Juris

Cheers for the info, guys. I'm looking at the Corsair RM1000 PSU to run only a single 295X2. Has anyone else used one of these, or can you think of any reason why it wouldn't suit?

I'm looking for a PSU that will remain fairly quiet when running six screens at the desktop for movies, browsing, research, data feeds, etc. The RM only turns its fan on at 40% load, so I reckoned it would allow the system to run quieter when not gaming in triples.


----------



## xer0h0ur

Well, I went with a stronger PSU purely to keep the option of tri-firing another 290X later on if I chose to. I would rather have the option in my back pocket than have to buy another PSU later to do it.


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Trying to constantly get 120 FPS or more at Ultra settings in 1440p may be a bit overzealous. I don't know if you would be able to hold a non-stop 120+ FPS. A tri-fired setup would probably hold up to your standards, but two Hawaii cores, I don't know. Then again, I have yet to get my 4K monitor, so I can't say for certain; I am just basing it off benchmarking at 1440p.


It's a mini-ITX build, so only one PCIe slot. I would hope the 295X2 can produce that framerate, as two 780 Tis can pretty much hold 120 FPS constant and my three 780 Tis are around 150 FPS in BF4.


----------



## xer0h0ur

Well, in that case you don't have much of a choice. That is the most powerful dual-GPU card you can get. Like I said, though, I can't give you actual in-game framerates since I don't currently have a monitor capable of 1440p. There are plenty of reviews out there, though.


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> It's a mini-ITX build, so only one PCIe slot. I would hope the 295X2 can produce that framerate, as two 780 Tis can pretty much hold 120 FPS constant and my three 780 Tis are around 150 FPS in BF4.


The 295X2 can't give the same frame rates as two OC'd Titans... never mind two OC'd 780 Tis, in my hands. But it games much smoother. Are you asking about BF4 MP or the campaign?


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well in that case you don't have much of a choice. That is the most powerful dual GPU card you can get. Like I said though I can't give you actual in game framerates since I don't have a monitor right now capable of 1440p. There are plenty of reviews out there though.


I mean, you're right, not much of a choice, but I will scrap the whole build if it won't perform. I have read a lot of the reviews, but most are skewed because most are in single-player or with vsync on at 60 FPS.


----------



## xer0h0ur

I believe the excuse for not testing in MP has something to do with not being able to re-create the exact same test conditions, due to different things going on at any given moment in MP action. Either way, I think targeting the same performance as a couple of OC'd 780 Tis is going to miss the mark.


----------



## doctakedooty

Quote:


> Originally Posted by *Jpmboy*
> 
> The 295X2 can't give the same frame rates as two OC'd Titans... never mind two OC'd 780 Tis, in my hands. But it games much smoother. Are you asking about BF4 MP or the campaign?


Multiplayer; I'm not much of a campaign player, plus this is being built for a LAN SFF rig.

I just figured Mantle support, the way everyone talks about it, should let it beat my Tis in BF4 since it was optimized for them.


----------



## xer0h0ur

I was under the impression that Mantle only really makes a large difference with lower-tier video cards, not high-end cards.


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> I was under the impression that Mantle only really makes large differences with lower tier video cards not high end cards.


It's supposed to take load off the CPU and shift it to the GPU. I was planning on a lower-end Intel CPU, so Mantle would help out. I know you can't re-create MP maps for BF4 exactly, but you can take a few maps to get an idea of your average FPS. I want to squeeze all this into a 250D so it matches my Corsair 900D, but if I have to, I will sell the Maximus VI Impact for a mATX board, get the Fractal Design Node 804 (it's just bigger than I wanted), and put Tis in it.


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> I mean, you're right, not much of a choice, but I will scrap the whole build if it won't perform. I have read a lot of the reviews, but most are skewed because most are in single-player or with vsync on at 60 FPS.


I'm getting ~90 FPS (80-130) at 1440p in large Conquest on my 2700K rig. *Measured with "perfoverlay.drawfps 1"
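For anyone logging the readings that "perfoverlay.drawfps 1" puts on screen, collapsing the samples into the "~avg (min-max)" shorthand used above is trivial. A small sketch; the sample list here is made up for illustration:

```python
# Summarize a list of FPS samples into the "~avg (min-max)" form.
# The sample values are invented for the example.

def fps_summary(samples):
    """Return (min, mean, max) for a list of FPS readings."""
    return min(samples), sum(samples) / len(samples), max(samples)

samples = [82, 95, 130, 88, 91, 80, 104]
lo, avg, hi = fps_summary(samples)
print(f"~{avg:.0f} ({lo}-{hi})")  # ~96 (80-130)
```

The min and max matter as much as the average here: a wide spread is exactly the kind of frame-time inconsistency people complain about in multiplayer.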


----------



## xer0h0ur

Yeah, it's been a couple of months since I read about Mantle, so I already forgot which setups get the most benefit from it.


----------



## doctakedooty

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm getting ~90 (80-130) at 1440p in large Conquest on my 2700K rig.


Thank you. So it might be better to pair two Tis with a 4770K instead, then.


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> Thank you. So it might be better to pair two Tis with a 4770K instead, then.


not sure i understand your question.


----------



## doctakedooty

Quote:


> Originally Posted by *Jpmboy*
> 
> not sure i understand your question.


Saying two 780 Tis might be the better route, then.


----------



## axiumone

Well, I asked this on guru3d and thought I'd share it with you. It's a major bummer.








Quote:


> Originally Posted by *axium*
> unwinder, I know this has been asked ad nauseam, but do you think it may be possible to add control over the VRM fan on a 295x2 card?
> 
> Thanks again.


Quote:


> Originally Posted by *Unwinder*
> I've already answered it: no, sorry. There won't be low-level fan control in AB.


I asked AMD as well, and they told me that the fan is only controlled through the BIOS on these cards. So it looks like those of us with the stock heatsink are stuck.


----------



## xer0h0ur

Quote:


> Originally Posted by *axiumone*
> 
> Well, I asked this on guru3d. Thought I'd share it with you. It's a major bummer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked AMD as well and they told me that the fan is only controlled through the bios on these cards. So it looks like those of us with the stock heatsink are stuck.


I remember several people saying that fan control on the 295X2 is unlocked when you tri-fire with a 290X.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> I remember several people saying that fan control on the 295X2 is unlocked when you tri-fire with a 290X.


I re-read the whole thread; it looks like it was only one person who had success with that. I've actually tried it in my own rig. I retraced his steps and I wasn't able to control the VRM fan.


----------



## xer0h0ur

So he was lying, then?


----------



## xer0h0ur

Now that I think of it, can't you just unplug the fan from the PCB, then power and control it manually yourself?


----------



## Synthaxx

So I have been playing BF4 lately... and since I'm waiting for my 4K monitor, I'm still playing on a 1080p monitor.

When I play quadfire, sometimes the frames are awesome, at a constant 200 FPS according to perfoverlay.drawfps 1. Other times quadfire is hell: dipping to 10 FPS, stuttering, lagging, doing annoying stuff that makes BF4 unplayable. I don't know what makes quadfire so unstable... Anyone have a suggestion?

I disabled frame pacing in CCC and at first it seemed to work, but after some time it started again.
Could they be getting too hot? The GPU cores are not going above 56 degrees.

PS: I can almost heat the whole house with what this PC puts out. If I start at 20 degrees ambient, it goes to 26 in no time...

Edit: using only one card at 1080p gives about 190 FPS in multiplayer BF4


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> Saying 2 780ti might be the better route then.


Sure would be...
For that matter, so would two 290Xs. As a single-slot solution, though, the 295X2 is currently the best (on price or performance).


----------



## xer0h0ur

Quote:


> Originally Posted by *Synthaxx*
> 
> So I have been playing BF4 lately... and since i"m waiting for my 4K hd monitor, I'm still playing on a 1080p monitor.
> 
> When I play quadfire sometimes the frames are awesome, at 200fps constant in perfoverlay.drawfps 1. Other times the quadfire is hell, dipping at 10 fps, stutterting lagging, doing annoying stuff which makes BF4 unplayable. I don't know what makes quadfire so unstable... Anyone has a suggestion?
> 
> I disabled frame pacing in CCC and at first it seemed to work, but after some time it started again.
> Would it be possible they are getting to hot? GPU cores are not going above 56 degrees.
> 
> PS: I can almost heat the whole house with what this pc gives. If I start at 20 degrees ambient, it goes to 26 in no time ...
> 
> Edit: using only 1 card at 1080p gives about 190fps in multiplayer BF4


Your cards are waterblocked, right? Did you ever check to make sure all of your thermal pads are fully contacting their surfaces? Ever since I re-mounted the Asetek cooler, I've noticed one of my cores runs hotter than the other by a 4-5 degree margin, which to me seems a bit much. I know I accidentally smeared the TIM on core 1, so I had to add some more, and I guess that turned out to be enough extra TIM to keep it that much cooler than core 2. I had also noticed that when I had the EKWB on, at least three pads were making little to no contact.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Your cards are waterblocked, right? Did you ever check to make sure all of your thermal pads are fully contacting their surfaces? Ever since I re-mounted the Asetek cooler, I've noticed one of my cores runs hotter than the other by a 4-5 degree margin, which to me seems a bit much. I know I accidentally smeared the TIM on core 1, so I had to add some more, and I guess that turned out to be enough extra TIM to keep it that much cooler than core 2. I had also noticed that when I had the EKWB on, *at least three pads were making little to no contact*.


Memory pads? Mounting screws not tight enough?


----------



## xer0h0ur

Yeah, two were memory pads and the other was a backplate pad that literally made no contact with the VRMs. All of the screws were as tight as they can be.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah two were memory pads and the other was a backplate pad that literally made no contact on the VRMs. All of the screws were as tight as they can be.


Whoa, something ain't right. (Be careful not to over-tighten any screws and "bow" the PCB.)


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> Now that I think of it, can't you just unplug the fan from the PCB, then power and control it manually yourself?


+1 internets for you!

It didn't even occur to me to do that. I ordered the cables; let's see how this works. It looks like AMD used a 2-pin fan connector for the VRM fan... so no actual RPM monitoring.


----------



## xer0h0ur

I meant as tight as they can be without forcing the screws. I never over-tighten anything.


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> I meant as tight as they can be without forcing the screws. I never over-tighten anything.


There are thermal pads of different thicknesses; did you put the right thickness on the memory modules and the thicker ones on the VRMs?


----------



## xer0h0ur

Yeah, between the waterblock and the backplate there are 0.5mm, 1mm, and 1.5mm pads used. On the backplate it's only 0.5mm and 1.5mm pads, at least according to the installation instructions.

For the sake of being specific about which pads had little to no contact: in the install PDF http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869093.pdf it's the right-side #3 pad. The RAM pads that didn't make good contact were, I believe, on the block's side. Pretty sure the backplate was making excellent contact on the RAM pads.

Round 2 of mounting the EKWB and backplate will be Tuesday or Wednesday, when I get the extra Fuji pads from FrozenCPU. I also ended up getting some right-angled rotating barbs, as I badly need those for the waterblock connections.


----------



## electro2u

Quote:


> Originally Posted by *Synthaxx*
> 
> So I have been playing BF4 lately... and since i"m waiting for my 4K hd monitor, I'm still playing on a 1080p monitor.
> 
> When I play quadfire sometimes the frames are awesome, at 200fps constant in perfoverlay.drawfps 1. Other times the quadfire is hell, dipping at 10 fps, stutterting lagging, doing annoying stuff which makes BF4 unplayable. I don't know what makes quadfire so unstable... Anyone has a suggestion?
> 
> I disabled frame pacing in CCC and at first it seemed to work, but after some time it started again.
> Would it be possible they are getting to hot? GPU cores are not going above 56 degrees.
> 
> PS: I can almost heat the whole house with what this pc gives. If I start at 20 degrees ambient, it goes to 26 in no time ...
> 
> Edit: using only 1 card at 1080p gives about 190fps in multiplayer BF4


From what I'm hearing, there is a memory leak in BF4 and performance degrades over time. It should be fixed soon.


----------



## electro2u

Quote:


> Originally Posted by *axiumone*
> 
> Well, I asked this on guru3d. Thought I'd share it with you. It's a major bummer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I asked AMD as well and they told me that the fan is only controlled through the bios on these cards. So it looks like those of us with the stock heatsink are stuck.


That's pretty lame that someone would come in here and lie like that about having control over the VRM fan.


----------



## axiumone

I don't think it was a lie. Maybe a software bug that only he was able to trigger.


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah, between the waterblock and the backplate there are 0.5mm, 1mm and 1.5mm pads used. On the backplate its only 0.5mm pads and 1.5mm pads. At least according to the installation instructions.
> 
> For the sake of being specific as to which pads had little to no contact, on the install PDF http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869093.pdf its the right side #3 pad. The RAM pads that didn't make good contact I believe were on the block's side. Pretty sure the backplate was making excellent contact on the RAM pads.
> 
> Round 2 with mounting the EKWB and backplate will be Tuesday or Wednesday when I get the extra Fuji pads from frozencpu. I also ended up getting some right angled rotating barbs as I need those badly for the waterblock connections.


I hope it goes well this time. Just take your time and all should end well, with no more issues.


----------



## 4K-HERO

Quote:


> Originally Posted by *axiumone*
> 
> +1 internets for you!
> 
> It didnt even occur to me to do that. Ordered the cables. Let see how this will work. It looks like AMD used a 2 pin fan connector for the VRM fan... so no actual rpm monitoring.


> I re-read the whole thread, and it looks like it was only one person who had success with that. I've actually tried it in my own rig. I retraced his steps and I wasn't able to control the VRM fan.
Quote:


> Originally Posted by *crazygamer123*
> 
> hi
> 
> What software you guys are using to monitor the GPUs in game? (full screen)


I updated GPU-Z and it shows VRM fan monitoring, and I have the radiator fan connected to the fan controller, so I'm pretty sure it's the VRM fan info showing up.


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> I hope it goes well this time. Just take your time and all should end well and no more issues.


Hey Doc - is the Valley thread hot or cold these days...?


----------



## doctakedooty

Quote:


> Originally Posted by *Jpmboy*
> 
> Hey Doc - is the Valley thread hot or cold these days...?


There are a lot of people posting. I've been updating it here and there; I just haven't posted anything when I do.


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> There is alot of people posting I been updating it here and there just havent posted anything when I do.


I was hoping a lot more higher-res data was coming in. Not yet, though, it seems.


----------



## axiumone

Quote:


> Originally Posted by *4K-HERO*
> 
> I updated GPU-Z and it shows VRM fan monitoring, and I have the radiator fan connected to the fan controller, so I'm pretty sure it's the VRM fan info showing up.


Wow, you are right! The new GPU-Z shows RPM for the VRM fan. Now if only there were software that could control it.


----------



## cennis

how about vrm temps?


----------



## axiumone

Quote:


> Originally Posted by *cennis*
> 
> how about vrm temps?


No temps, just fan speed.


----------



## arieldeboca

Hi guys.
I have some questions for owners of a single AMD 295X2.
Why an AMD 295X2 and not 780 Ti SLI? Many forums say 780 SLI has better performance at a lower price.

I'm thinking about buying an AMD 295X2 (I've always wanted a card with two chips), but I know 780 SLI is cheaper and perhaps faster.
I play at 1080p (maybe 4K in the future). I change my video card every 3 years, so I need a powerful card for optimum performance in games over the next 3 years.
1) Which would you pick: 295X2 or 780 SLI?
2) Does SLI have many problems?
3) Are dual-GPU cards currently working well?


----------



## Jpmboy

Quote:


> Originally Posted by *arieldeboca*
> 
> Hi guys.
> I have some questions for users of one amd 295x2.
> Because a amd 295x2 and why not a 780ti sli? Many forums say it has better performance and lower price 780 sli
> 
> I'm thinking about buying a amd 295x2 (always wanted a plate with 2 chip) but I know that one is cheaper than 780 sli and perhaps exceeds performance.
> I'm playing at 1080p (maybe in the future 4k) .I change the video card every 3 years, so I need a powerful vga to have optimum performance in games the next 3 years.
> 1) they think? 295x2 or 780 sli ?
> 2) sli has many problems?
> 3) Dual actalmente plates are working well?


I'm running both configs right now. You will get higher FPS with SLI 780 Tis (mine are Kingpins), but gameplay with the 295X2 is a very smooth experience at 4K. For 1080p, both are overkill. Unless you're getting 4K real soon, a single 290X or 780 Ti would be plenty for 1080p.


----------



## xer0h0ur

Overkill is indeed the name of the game


----------



## Juris

Ok, this might get me the dumb question of the day award, but I have to ask as I'm in a bind. I've just bought a 295X2 for the new build (awaiting delivery) to put into a Silverstone FT-05, which is being released next month (basically a Raven RV-05 in aluminium).

I was hoping to mount the fan/rad piping through a removed PCI backplate, or two, and mount it over the only 120mm fan spacing in the upper chamber. The entire roof of the FT-05 is perforated for cooling. Slight problem: the CPU will be just below it in the lower chamber, and as it's a 90-degree case with no other space for rad mounting, I will have to cool the CPU with air, and the fans will push air south-to-north up into that same 120mm space.

I know this is obviously not good, but will the air pushed from a Noctua NH-D15 (cooling a 4790K) going basically into the 295X2's fan and then onto the rad destroy the 295X2?

The thermals on the FT-05 (if the RV-05 is anything to go by) are fantastic but the above hot air problem scares the crap out of me. Help.


----------



## Rohandy

Does anyone here own an XSPC Razor block for their R9 295x2? Can't seem to find any info on the web about temps for that water block. Would like to compare numbers with the EK block. Or if anyone knows how hot they run. Specs on loop setup also if possible. Thanks!


----------



## Syceo

Hi everyone, I finally took the plunge and got myself an MSI 295X2. My two 780s are now sitting there staring at me like, "*** you've abandoned us"... Nevertheless, I'm going to run with this 295 until I come across any issues that change my mind. So far the one game giving me micro stutter is FIFA 14, so any ideas, guys? I'm new here, so sorry if this seems like a noob question.


----------



## xer0h0ur

Quote:


> Originally Posted by *Juris*
> 
> Ok this might get me the dumb question of the day award but I have to ask as I'm in a bind. I've just bought a 295x2 for the new build (awaiting delivery) to put into a Silverstone FT-05 which is being released next month (basically a Raven RV-05 in aluminium).
> 
> I was hoping to mount the fan/rad piping through a removed pci backplate, or 2, and mount it on top of the only 120mm fan spacing in the upper chamber. The entire roof of the FT-05 is perforated for cooling. Slight problem is the CPU will be just below it in the lower chamber and as its a 90 degree case with no other space for rad mounting I will have to cpu cool with air & the fans will be pushing air south to north up into that same 120mm space.
> 
> I know this is obviously not good but will the air being pushed from a Noctua NH-D15 cooling a 4790k basically into the 295x2 fan then onto the rad destroy the 295x2?
> 
> The thermals on the FT-05 (if the RV-05 is anything to go by) are fantastic but the above hot air problem scares the crap out of me. Help.


You're not going to destroy anything. Worst case, your rad will run hotter and your GPUs will throttle when they hit the 75°C mark. In reality it all depends on the load you're running on the GPUs: the higher the resolution you're gaming at, the hotter your cores will run.


----------



## xer0h0ur

Quote:


> Originally Posted by *Rohandy*
> 
> Does anyone here own an XSPC Razor block for their R9 295x2? Can't seem to find any info on the web about temps for that water block. Would like to compare numbers with the EK block. Or if anyone knows how hot they run. Specs on loop setup also if possible. Thanks!


Someone here did get that block but I don't know if he has already installed it on his cards.
Quote:


> Originally Posted by *Syceo*
> 
> Hi everyone, I finally took the plunge and got myself an MSI 295x2 , got 2x 780's now sitting staring at me like... "*** you'v abandoned us" .... nevertheless im going to run with this 295 until i come across any issues that will have me change my mind. So far the one game that is giving me micro stutter is Fifa 14, so any ideas guys, im new here so sorry if im asking what may seem like a noob question.


Nice choice. I believe you can waterblock that card without voiding your warranty.


----------



## Syceo

Yeah, so I've heard. I know nothing about custom loops, but I've been considering one for this particular card. However, in its current configuration I'm not seeing temps rise above 65°C, though that's at stock clocks. I'm only running one 1440p and one 1080p monitor right now, so I'll ponder it. Thinking about a single 4K paired up with the 1440. Any suggestions on a waterblock/pump setup just for the card? (The CPU is on an H100i at stock, so I'm not sure it's worth including in the loop.)


----------



## xer0h0ur

Most people seem to agree that a 240 is the bare minimum for this video card and a 360 is more ideal, yet some people go further with dual 360s or a 480 rad. Personally, I'm not trying to keep the cores ice cold or as cool as humanly possible; I just want an improvement over the stock 120mm radiator. That is enough for my purposes. If I could afford to isolate my video card's loop from my CPU's, I would, but in my case it's not feasible due to space, and I need the extra radiator helping to cool the video card too.
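For a rough sanity check on those rad sizes, here's a tiny Python sketch. The watts-per-120mm-section figure is a commonly repeated rule of thumb, not a measured spec, so treat the numbers as an assumption:

```python
# Rough radiator sizing sketch. The watts-per-120mm-section figure is a
# rule of thumb, not a spec; real capacity depends on fin density, fan
# speed, and the water/air delta-T you are willing to tolerate.
import math

def sections_needed(heat_watts: float, watts_per_section: float = 125.0) -> int:
    """Number of 120 mm radiator sections for a given heat load."""
    return math.ceil(heat_watts / watts_per_section)

card_heat = 500.0  # approximate full-load draw of an R9 295X2, in watts
print(sections_needed(card_heat))         # conservative: 4 sections (a 480, or 240+240)
print(sections_needed(card_heat, 250.0))  # optimistic: 2 sections (a single 240)
```

At a conservative ~125 W per section, a ~500 W card wants roughly a 480's worth of radiator; at a more optimistic figure a single 240 squeaks by, which matches the range of opinions above.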


----------



## CrisInuyasha

Replaced my case and finally got some pics of my card:


----------



## xer0h0ur

I'm guessing you have a power supply with a single-12V-rail design, since you're drawing all that power from a single 8-pin cable.


----------



## Syceo

Just joined the club  oh happy days .. money well spent. hummm now what to do about those 780's i have sitting around.


----------



## electro2u

Before I got mine in the mail I thought the shroud had the RADEON logo backlit with LEDs the way Nvidia's TITAN cooler does. It's an LED fan...


----------



## xer0h0ur

Dang, that case is quite wide. Well done, sir. Welcome to the fray. Another 6 hours before I get out of work and can give mounting the EKWB and backplate another crack. Praying to the hardware gods it works this time around.


----------



## CrisInuyasha

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am guessing you have a power supply with single 12V rail design since you're drawing all that power from a single 8-pin cable.


Yes, the AX1200i. I was kinda afraid when I decided to test it, but everything worked fine. I've already stress tested it and it's OK, so it's one less cable to deal with.


----------



## xer0h0ur

Yeah, in that case I would be more worried about drawing that much power through a single 8-pin cable. That could be roughly 42 amps and 500W going through there. I bet if you point an IR thermometer at that cable under load it's quite hot.

Edit: Actually, it's about 350W (roughly 29 amps) through the 8-pin; if I remember right, the PCI-E slot provides up to another 75W.
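For reference, the current on the 12 V rail is just I = P / V; a quick sketch (actual draw varies with load, and some power also comes through the PCI-E slot):

```python
# Current through a 12 V connector for a given power draw (I = P / V).
def amps(watts: float, volts: float = 12.0) -> float:
    return watts / volts

print(round(amps(500), 1))  # full ~500 W card load on one cable: 41.7 A
print(round(amps(350), 1))  # ~350 W share through a single 8-pin: 29.2 A
```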


----------



## ozlay

Found this amazing deal on Newegg: http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.1762079

Decided I would post it here. Basically it comes with a free 500GB SSD, a free 1000W Platinum Antec PSU, and 3 free games; basically $500 worth of free stuff.


----------



## CrisInuyasha

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah, in that case I would be more worried about drawing that much power through a single 8-pin cable. That could be roughly 42 amps and 500W going through there. I bet if you point an IR thermometer at that cable under load it's quite hot.
> 
> Edit: Actually, it's about 350W (roughly 29 amps) through the 8-pin; if I remember right, the PCI-E slot provides up to another 75W.


I guess I will revert to the dual cables for this, then. Thanks for the advice!


----------



## xer0h0ur

Don't get me wrong, I can't say with any certainty whether what you did is perfectly fine, as I never attempted it myself. I just wouldn't tempt fate by drawing that much power through a single cable. I know that in reviews where they pointed an IR thermometer or a FLIR cam at the cables and the 8-pin connectors, they were quite hot. That makes me believe combining that power draw on a single cable would only make it hotter.


----------



## CrisInuyasha

Better safe than sorry then. Just changed back to the safer way.


----------



## arieldeboca

I know a single 295X2 is overkill for 1080p. But within 3 years, a 295X2 will serve me better than a 780 Ti. Truthfully, though, I'm very hesitant between the 295X2 and 780 SLI.
Quote:


> Originally Posted by *Jpmboy*
> 
> I'm running both configs right now. you will get higher FPS with SLI 780ti's (mine are kingpins) - but game play with the 295x2 is a very smooth experience at 4K. For 1080P, both are overkill. Unless you are getting 4K real soon, one 290x or a 780Ti would be plenty for 1080P


----------



## doctakedooty

Got my 295X2 in today also. Will test it out tonight gaming and see if it will handle what I need for my 250D build.


----------



## CrisInuyasha

Quote:


> Originally Posted by *arieldeboca*
> 
> I know it's overkill for 1080p one 295x2. But within 3 years, a 295x2 amd will serve more than a 780ti. But the truth I am very hesitant between 295x2 and 780 sli


Personally I don't find it overkill. I like to play above 120 fps when possible at 1080p (120Hz screen), and some games are demanding to the point that they drop to around 80 fps or even less, like Metro: Last Light. But for a 60Hz screen, it's like the perfect card.


----------



## Juris

Ok, first off, thanks to all who have helped me out on the thread. You've led me to being the new owner of a 295X2 and being down a serious amount of cash. Second, I have a bit of a deal to share now that the courier has just arrived and I know it's for real.

Scan.co.uk are doing 2 bundle deals on the 295x2. Both strangely cost £1038 (€1298 in my money) so be careful if you're buying to pick the right one. The 1st is an MSI 295x2 with a Crucial MX100 512gb SSD.

The second is an XFX 295x2 with the same Crucial MX100 512gb SSD plus an XFX Pro 1250w 80+ Gold single 12v rail fully modular black edition PSU with hybrid mode which on their site is worth £191.

So total for £1038 (€1298) minus
XFX 1250w PSU = £191 (€239)
Crucial 512gb SSD = £149 (€186)

= XFX 295x2 for £698 (€873)

This is the correct deal
http://www.scan.co.uk/products/go-live-22-7-8gb-xfx-radeon-r9-295x2-vga-card-plus-512gb-crucial-mx100-ssd-bundle

It's the cheapest I've seen and in stock (minus one that's sitting beside me, pic below).


----------



## rdr09

Quote:


> Originally Posted by *Juris*
> 
> Ok 1st off thanks to all who have helped me out on the thread. You've led me to being the new owner of a 295x2 and being down a serious amount of cash
> 
> 
> 
> 
> 
> 
> 
> . 2nd I have a bit of a deal to share now that the courier has just arrived and I know its for real.
> 
> Scan.co.uk are doing 2 bundle deals on the 295x2. Both strangely cost £1038 (€1298 in my money) so be careful if you're buying to pick the right one. The 1st is an MSI 295x2 with a Crucial MX100 512gb SSD.
> 
> The second is an XFX 295x2 with the same Crucial MX100 512gb SSD plus an XFX Pro 1250w 80+ Gold single 12v rail fully modular black edition PSU with hybrid mode which on their site is worth £191.
> 
> So total for £1038 (€1298) minus
> XFX 1250w PSU = £191 (€239)
> Crucial 512gb SSD = £149 (€186)
> 
> = XFX 295x2 for £698 (€873)
> 
> This is the correct deal
> http://www.scan.co.uk/products/go-live-22-7-8gb-xfx-radeon-r9-295x2-vga-card-plus-512gb-crucial-mx100-ssd-bundle
> 
> Its the cheapest I've seen and in stock (minus 1 thats sitting beside me, pic below)


any games bundled?


----------



## Juris

Quote:


> Originally Posted by *rdr09*
> 
> any games bundled?


AMD has its Never Settle Gold bundle with all the higher-end cards sold, but frankly it's pretty weak compared to Nvidia's offering, or even Intel giving GRID Autosport and 2 other high-ranking titles for buying a new CPU.

http://sites.amd.com/us/promo/never-settle/Pages/nsreloadedforever.aspx

Then again, choosing a mid, high, or especially ultra-high-end GPU on the basis of bundled games is a bit like buying a Ferrari over a Lambo because it had nicer fluffy dice. (I know you're not doing that, of course.)


----------



## Synthaxx

Quote:


> Originally Posted by *Rohandy*
> 
> Does anyone here own an XSPC Razor block for their R9 295x2? Can't seem to find any info on the web about temps for that water block. Would like to compare numbers with the EK block. Or if anyone knows how hot they run. Specs on loop setup also if possible. Thanks!


I have the XSPC block on both my 295X2s. At startup they idle at about 35°C, and after many benches they idle around 40°C. Games put them in the high 50s and benches around 60°C.
They're solidly made and come with good instructions. Installation is easy.

I'm waiting for my red LEDs to light 'em up!









Edit: Mine are OCed at 1080/1600


----------



## Rohandy

Quote:


> Originally Posted by *Synthaxx*
> 
> I have the XSPC block for both my 295x2s. When idle at startup, they run at about 35° and idle after many benches at 40°s. Games are high 50s and benches are around 60°s.
> They are solid made and have good instructions. Installation is easy.
> 
> I'm waiting for my red leds to light em up!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Mine are OCed at 1080/1600


Sounds good, especially at those clock speeds. So it seems the blocks from each manufacturer perform about the same as the next : ) What kind of rad setup do you have in your system? And other specs?


----------



## Synthaxx

Quote:


> Originally Posted by *Rohandy*
> 
> Sounds good especially at those clock speeds. So it seems all the blocks from each manufacturer performs just around the same as the next : ) what kind of rad setup do you have in your system? And other specs?


Copied my sig:

Hardware
Cosmos II / Asus Rampage IV Black Edition / Intel I7 4930K @ 4.7Ghz / 32GB Dominator Platinum 2133Mhz / 2x 295x2 Sapphire OC/ EVGA 1600 G2 / Samsung 840 Pro 256GB / Hitachi 7k4000 2TB
Cooling
D5 /w Bitspower top & Mod / XSPC Raystorm / XSPC Razor x2 / AX360 / 2x UT60 240mm / AX120/ Bitspoweeeerrrr Fittings/ 3/8-5/8 Primochill LRT advanced


----------



## Rohandy

Quote:


> Originally Posted by *Synthaxx*
> 
> Copied my sig:
> 
> Hardware
> Cosmos II / Asus Rampage IV Black Edition / Intel I7 4930K @ 4.7Ghz / 32GB Dominator Platinum 2133Mhz / 2x 295x2 Sapphire OC/ EVGA 1600 G2 / Samsung 840 Pro 256GB / Hitachi 7k4000 2TB
> Cooling
> D5 /w Bitspower top & Mod / XSPC Raystorm / XSPC Razor x2 / AX360 / 2x UT60 240mm / AX120/ Bitspoweeeerrrr Fittings/ 3/8-5/8 Primochill LRT advanced


That's a mini refrigerator : ) Nice setup, but those babies run pretty warm for all those rads, don't you think? I guess overclocking these cards really pushes their thermal limits!


----------



## rdr09

Quote:


> Originally Posted by *Juris*
> 
> AMD have their Never Settle Gold with all the higher end cards sold but frankly its pretty crap compared to Nvidia's offering or even Intel giving Grid Autosport and 2 other high ranking titles for buying a new CPU.
> 
> http://sites.amd.com/us/promo/never-settle/Pages/nsreloadedforever.aspx
> 
> Then again if you choose a mid, high or especially an ultra high end GPU on the basis of bundled games its a bit like buying a Ferrari over a Lambo because it had nicer fluffy dice. (I know your not doing that of course)


I thought they didn't have any there, 'cause they have them here, but . . . Just Cause 2?


----------



## billyboy8888

Hi guys, I'm planning a faux-high-end rig: a 295X2 with the cheapest possible everything else, just enough not to bottleneck the GPU when gaming. I've noticed that DIYing a rig from scratch is like furnishing my college-era apartment: it can be either super cheap or super expensive. I just want the best possible gaming performance for the least money. I'm not tight on money per se, but I want to spend on the part that actually matters, the GPU, and cut everything else without bottlenecking it. When the graphics card alone already costs ~$1700+ after tax, every bit hurts after that.

I read an article on an OC-oriented website where they tested the 4670K and 4770K with high-end cards, and the FPS difference was less than 1 in most cases. But even so, my originally planned 4690K is still a mid-high-end CPU. So I want to hear from you guys, especially those with first-hand experience: what's the cheapest/lowest CPU possible?

On top of that, how about the case, PSU and everything else? Would it be possible to squeeze it all into less than $400? (I'm writing this from a Canadian igloo; everything here is about 10-15% higher than in the States.)


----------



## cennis

Quote:


> Originally Posted by *billyboy8888*
> 
> Hi guys, I'm planning on doing a faux-high end rig, with 295x2 and the cheapest possible for everything else, just enough to not bottle-neck the GPU when gaming. I noticed DIY a rig from scratch is same as furnishing my college era apartment, it could either be super cheap or super expensive. But I just want the best possible gaming performance for the least money. I'm not tight on money per-se, but I just want to money on the part that actually matters, the GPU, and cut on everything ese, while not bottlenecking the GPU. When the graphic card alone already costs ~1700+ after tax, every bit hurts after that.
> 
> I have read an article on an OC oriented website, they tested 4670k and 4770k on high end cards and the fps difference was less than 1 in most cases. But even so, my originally planned 4690k is still somewhat a mid-high end CPU. So I want to hear from you guys, especially those with first hand experience, what's the cheapest/lowest CPU possible.
> 
> On top of that, how about the cases and PSU and everything else. Would it be possible to squeeze it all to less than $400 (I wrote this from a Canadian Igloo, everything here is about 10-15% higher than in the states)?


buy 3 290s from miners and call it a day.


----------



## xer0h0ur

I didn't end up remounting the EKWB last night. Instead I just readjusted all my tubing lengths and connectors to clean everything up. It's now ready for me to just swap coolers on the card and slap together a couple of quick disconnects to rock and roll. It really is a pain in the butt to shorten tubing and swap connectors when the system is already primed and filled.


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> buy 3 290s from miners and call it a day.


You might want to let him know just how loud that setup will be.
Quote:


> Originally Posted by *billyboy8888*
> 
> Hi guys, I'm planning on doing a faux-high end rig, with 295x2 and the cheapest possible for everything else, just enough to not bottle-neck the GPU when gaming. I noticed DIY a rig from scratch is same as furnishing my college era apartment, it could either be super cheap or super expensive. But I just want the best possible gaming performance for the least money. I'm not tight on money per-se, but I just want to money on the part that actually matters, the GPU, and cut on everything ese, while not bottlenecking the GPU. When the graphic card alone already costs ~1700+ after tax, every bit hurts after that.
> 
> I have read an article on an OC oriented website, they tested 4670k and 4770k on high end cards and the fps difference was less than 1 in most cases. But even so, my originally planned 4690k is still somewhat a mid-high end CPU. So I want to hear from you guys, especially those with first hand experience, what's the cheapest/lowest CPU possible.
> 
> On top of that, how about the cases and PSU and everything else. Would it be possible to squeeze it all to less than $400 (I wrote this from a Canadian Igloo, everything here is about 10-15% higher than in the states)?


If sound is a factor in your build then going with three 290's may not be the way you want to go. Also there are bundles for this video card where you get a power supply, a SSD and the three games so that at least can mitigate the cost of the 295X2 a bit.


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> You might want to let him know just how loud that setup will be.
> If sound is a factor in your build then going with three 290's may not be the way you want to go. Also there are bundles for this video card where you get a power supply, a SSD and the three games so that at least can mitigate the cost of the 295X2 a bit.


He said he wants the best gaming performance for the least money.

3 x 290X + 3 waterblocks + pump + rad + PSU + SSD + 3 games will cost less than $1500.

At the end of the day, trifire 290Xs with waterblocks will be faster than a 295X2 while being quieter (if that's what he cares about), and cheaper at the same time.

Really, the 295X2 is about being able to use just one PCIe slot, for form factors such as mITX and mATX with a sound card or trifire/quadfire, or possibly needing the 4 mini-DP ports, but I doubt that's relevant.


----------



## xer0h0ur

Dang you're right. I didn't know used 290X's were going for that cheap. If you have the space for three cards and know how to mount waterblocks yourself then that will save you some money. Just make sure you have enough real estate within the case for the radiators necessary to cool three Hawaii cores.


----------



## ImperialOne

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dang you're right. I didn't know used 290X's were going for that cheap. If you have the space for three cards and know how to mount waterblocks yourself then that will save you some money. Just make sure you have enough real estate within the case for the radiators necessary to cool three Hawaii cores.


The question is, do you trust used 290/290x? Do you know where they have been?


----------



## xer0h0ur

I don't even buy used video games, much less used hardware. The only time I buy used hardware is from personal friends I know take care of their stuff. That doesn't mean everyone else shouldn't. It's just my personal preference to buy things new if I can afford it.


----------



## Synthaxx

Quote:


> Originally Posted by *Rohandy*
> 
> That's a mini refrigerator : ) nice setup but those baby's run pretty warm for all those rads don't you think? I guess overclocking these cards really pushes they're thermal limits!


Nahh, the rads have a cooling capacity of at least 1650 watts (at a 10°C delta).

The rig is very quiet; the fans run at 800-1000 RPM (7 Noctua NF-F12 industrialPPC-2000s and 3 Dead Silence fans).
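That capacity claim is easy to sanity-check against his loop: an AX360, two UT60 240s, and an AX120 add up to 8 × 120 mm of radiator, so 1650 W works out to roughly 200 W per section at a 10°C delta. A quick sketch (the per-section figure is derived from his numbers, not a spec):

```python
# Back-of-envelope check on the claimed 1650 W radiator capacity.
# 120 mm sections in the loop: AX360 = 3, two UT60 240s = 2 + 2, AX120 = 1.
sections = 3 + 2 + 2 + 1
claimed_watts = 1650.0
per_section = claimed_watts / sections
print(sections, round(per_section))  # 8 sections at roughly 206 W each
```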


----------



## Rohandy

Quote:


> Originally Posted by *Synthaxx*
> 
> nahh, the rads have a cooling capacity for at least 1650 watts (10 degrees delta).
> 
> The rig is very silent, fans are running at 800-1000 rpm (7 Noctua NF F12 industrialppc 2000 and 3 dead silence fans)


Really nice! At those RPMs you can hear a pin drop : )


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> Got my 295x2 in today also. Will test out tonight gaming see if it will handle what I need it to for my 250d build.










let us know what you think Bro!
Quote:


> Originally Posted by *cennis*
> 
> he said he wants the best gaming performance for least money,
> 3 290x + 3 waterblocks + pump + rad + PSU + SSD + 3 games will cost less than 1500$
> at the end of the day trifire 290x with waterblocks will be faster than 295x2, while being more silent (if thats what he cares about). At the same time cheaper
> really 295x2 is about being able to utilize just 1 pcie slot, with form factors such as MITX and MATX with sound card / trifire/quadfire
> or possibly need for the 4 mini DP ports but I doubt thats relevant


yeah - get those fine 290Xs from something like this:











Quote:


> Originally Posted by *ImperialOne*
> 
> The question is, do you trust used 290/290x? Do you know where they have been?


No.


----------



## doctakedooty

Quote:


> Originally Posted by *Jpmboy*
> 
> 
> 
> 
> 
> 
> 
> 
> let us know what you think bro


I am happy with it. Paired with my 4770K at stock clocks and the 295X2 at stock clocks, it handles BF4 at 1440p, everything ultra with 2x AA, and I can stay above 100 fps. I can squeeze out a few more after I delid and get the rads and pump hooked up, and see what I can overclock the CPU to while holding decent temps. One thing is for sure: fitting this GPU in the 250D is going to be my hardest task.


----------



## shadow85

Is it worth getting this now? I'm wondering how the Nvidia GTX 880 will perform, and 880 SLI; the 880 Ti shouldn't be long after that.


----------



## Earth Dog

Review of the EK waterblock for this card.









http://www.overclockers.com/EK-295x2-waterblock-review


----------



## Jpmboy

Quote:


> Originally Posted by *doctakedooty*
> 
> I am happy with it paired with my 4770k at stock clocks and the 295x2 at stock clocks it does bf4 1440p everything ultra 2x aa I can stay above 100 fps. I can squeeze a few more after I delid and get the rads hooked up and pump etc and see what I can overclock the cpu to while holding decent temps. One thing is for sure fitting this gpu in the 250D is going to be my hardest task.


It gets a lot smaller with a waterblock, and runs at least 20°C cooler.


----------



## Jpmboy

Quote:


> Originally Posted by *shadow85*
> 
> Is it worth getting this now? Wondering how the nvidia gtx 880 will perform and SLI 880, then the 880 ti's shouldnt be long after.


Nah... nor the 880s, 'cause the 980s will follow.


----------



## jerrolds

I might actually skip a generation this time. 6970 -> 6970 CF -> 7970 -> 290X should be pretty good for 1440p gaming for at least another year. That, and resale value for 290s isn't very good atm.


----------



## shadow85

Quote:


> Originally Posted by *Jpmboy*
> 
> nah.. nor the 880s, cause the 980s will follow.


Well, I was planning on waiting another 1-3 months max before I buy a GPU; I wasn't planning on waiting 1+ years.


----------



## Jpmboy

Quote:


> Originally Posted by *shadow85*
> 
> Well I was planning on waiting another 1-3 months max before i buy a GPU, wasnt planning on waiting 1+ years


In that case, wait and see what comes out this fall. Nvidia has to do something significant. I'll be sticking with these cards (the 295X2 and tri-SLI Kingpins) for at least a year... well, unless a real game changer shows up.


----------



## xer0h0ur

HOLY COW I GOT IT! I FREAKIN GOT IT RUNNING WITH THE BLOCK! YEAAAAAAAAAAAAAAAAAAAAAAAAH!!!!!!!!!!!!!!

I found the reason why the video card was refusing to start: solder was protruding from one of the components on the PCB and making direct contact with the block. I took a small file and painstakingly filed away at the solder for almost an hour. The pictures below are about halfway through the filing; it really was a big glob of it. I had to mount the block and backplate three times before I finally got it all to work. I literally ran out of PK-3 last night and overnighted two tubes, so as soon as my doorbell rang I was back to work.

I AM SOOOOOOOOOOO HAPPY!




And this is the final product after all the damn blood sweat and tears.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> HOLY COW I GOT IT! I FREAKIN GOT IT RUNNING WITH THE BLOCK! YEAAAAAAAAAAAAAAAAAAAAAAAAH!!!!!!!!!!!!!!
> 
> I found the reason why the video card was refusing to start. The solder was protruding from one of the components on the PCB and it was making direct contact with the block. I took a small file and painstakingly filed away at the solder for almost an hour. The pictures below are about halfway into filing away at the solder. It really was a big glob of it. I had to mount the block and backplate three times before I finally got it all to work. I literally ran out of PK-3 last night and I overnighted two tubes so as soon as my doorbell rang I was back to work.
> 
> I AM SOOOOOOOOOOO HAPPY!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> And this is the final product after all the damn blood sweat and tears.
> 
> 
> Spoiler: Warning: Spoiler!


NICE!!!
Craftsmanship on your part.










----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> HOLY COW I GOT IT! I FREAKIN GOT IT RUNNING WITH THE BLOCK! YEAAAAAAAAAAAAAAAAAAAAAAAAH!!!!!!!!!!!!!!


Did it bring temps down??


----------



## xer0h0ur

Still purging all the air bubbles from the system and testing. I thought I had gotten it out last night and with some tilting of the case I saw my reservoir drop to the fill line three times. Still working the air out and refilling as I go.


----------



## Juris

Just a quick heads-up on AMD today announcing the R9 285. Obviously it won't be anywhere near our 295x2s, but the important part for anyone who has just bought or is about to buy AMD is that they have just updated the Never Settle bundle to the Space Edition, which comes with Star Citizen, Alien Isolation, Habitat and Space Run.

If you haven't used your Never Settle coupon, even if you bought earlier in the year, you will be entitled to pick up the new games. I've been dying to get Alien Isolation and Star Citizen for ages, so for me this is great news.

http://www.eteknix.com/amd-reveals-never-settle-space-edition-adds-star-citizen/


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> Did it bring temps down??


Idle temps at startup were 30C GPU1 / 32C GPU2. Gaming temps were mid-50s on GPU1 and high 50s to 60C on GPU2. I have been getting roughly 1-3C differences at idle between cores and 4-5C differences at load. Once back at idle it was 32C GPU1 / 33C GPU2.

Firestrike Extreme max temp GPU1 51C / GPU2 56C

Absolutely no thermal throttling. Cores consistently running at full speed.

Not sure if I should remount the block with more PK-3 on the cores or swap the direction of the flow on the loop to even out those temps more. Although GPU2 has always been the hotter core no matter the cooling solution on my card. Anyone else have some insight here?

Still not entirely sure if all the air is out of the radiator. I need to dismount it from the case so I can flip it around with the pump running to be certain.


----------



## Jeronbernal

You guys think I'd have any problem powering this diamond 295x2 & 4790k oc'd on a ax860i psu?


----------



## HeXrAm

How does the xfx R9 295x2 compare with the other vendors?


----------



## ANGELPUNISH3R

Quote:


> Originally Posted by *Jeronbernal*
> 
> You guys think I'd have any problem powering this diamond 295x2 & 4790k oc'd on a ax860i psu?


I'm running a 4770K at 4.6GHz / 1.275V with this card on an AX860i; it's fine.


----------



## Syceo

Hi guys, I'm ordering water cooling parts for my rig (first time water-cooling). I would really appreciate some help choosing "gold" compression fittings to go with the gold theme I am running. I would also be grateful if anyone could have a look at my components list and give their opinion on the parts I have chosen.

In terms of the compression fittings, I need a solution that would allow me to bypass the hot-swap bay in the bottom of the case (so maybe a gold 45-degree compression fitting; "such a noob", sorry guys).

I would also like a suggestion on drain ports: should I get a compression fitting for the 360 rad that has a "y" split so I can run an additional length of tubing (with a stop at the end) for drainage?

Any help would be great. Thanks in advance.

My rig as it is now:



Components:

*Cpu block
Aqua Computer Cuplex Kryos PRO
*









*EK-D5 Vario X-RES 140 including Pump*









Alphacool 35273 NexXxoS XT45 Radiator 360










*Alphacool 35279 NexXxoS UT60 Radiator 240*



*EK-FC R9-295X2 - Acetal+Nickel Water Block*




*Tubing / Hose* (need help finding some quality black tubing)
Please let me know if you think I'm missing anything.

Thanks again


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Idle temps at startup were 30C GPU1 / 32C GPU2. Gaming temps were mid 50's on GPU1 and high 50's to 60C on GPU2. I have been getting roughly 1-3C differences at idle between cores and 4-5C differences at load. Once back at idle it was 32C GPU1 33C GPU2
> *Firestrike Extreme max temp GPU1 51C / GPU2 56C*
> Absolutely no thermal throttling. Cores consistently running at full speed.
> *Not sure if I should remount the block with more PK-3 on the cores* or swap the direction of the flow on the loop to even out those temps more. Although GPU2 has always been the hotter core no matter the cooling solution on my card. Anyone else have some insight here?
> Still not entirely sure if all the air is out of the radiator. I need to dismount it from the case so I can flip it around with the pump running to be certain.


that's about what the cores on mine will top out at too. I don't think you need to remount the block.








Now take that IB-E up to 4.5GHz at least!








Quote:


> Originally Posted by *Jeronbernal*
> 
> You guys think I'd have any problem powering this diamond 295x2 & 4790k oc'd on a ax860i psu?


should be fine with the stock bios and some OC.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> would really appreciate some help in choosing "gold" compression fittings to go with the gold theme i am running.


http://www.frozencpu.com/products/21757/ex-tub-2195/Bitspower_G14_Thread_12_ID_x_34_OD_Compression_Fitting_-_True_Brass_BP-TBCPF-CC5.html
OOS right now, but you can order them direct from Bitspower if you are savvy. There are also brass fittings available at any big hardware store.
Quote:


> Cpu block
> Aqua Computer Cuplex Kryos PRO


These CPU blocks are very expensive and not particularly well reviewed. I would suggest the new EK block that is coming out soon; their current Clean CSQ CPU blocks are end-of-life... which is pretty lame.
Quote:


> EK-D5 Vario X-RES 140 including Pump


Note the little circles all over the acetal on this unit. This is the mark of EK's original CSQ line, and the unit you've chosen is perfectly fine, I'm sure. But I thought I'd point out that EK changed it because people got tired of the little circles.
Quote:


> Alphacool 35273 NexXxoS XT45 Radiator 360


Be careful with the width on your top radiator. What case are you using? Might want to consider a lower profile version.
Quote:


> Alphacool 35279 NexXxoS UT60 Radiator 240


I like these rads - they are very sturdy and perform well in tests and reviews. I'm using an XT45 280mm and a 30x360mm in the top of my Corsair 760T.
You may find that with the tube reservoir, a 60mm-thick radiator + push/pull requires a lot of depth in your case.
Quote:


> Tubing / Hose ( need help with some quality black tubing


Primochill LRT Advanced
http://www.frozencpu.com/products/17890/ex-tub-1623/PrimoChill_PrimoFlex_Advanced_LRT_Tubing_38ID_x_12_OD_-_10ft_Retail_Pack_-_Onyx_Black_PFLEXA10-12-BK_w_Free_Sys_Prep.html


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> http://www.frozencpu.com/products/21757/ex-tub-2195/Bitspower_G14_Thread_12_ID_x_34_OD_Compression_Fitting_-_True_Brass_BP-TBCPF-CC5.html
> OOS right now but you can order them direct from bitspower if you are savvy. There are going to be brass fittings available at any big hardware store.
> These CPU blocks are very expensive and not particularly well reviewed. I would suggest the new EK block that is coming out soon. Their current Clean CSQ cpu blocks are End of Life... which is pretty lame.
> Note the little circles all over the acetyl on this unit. This is the mark of EK's orig. CSQ line and the unit you've chosen is perfectly fine, I'm sure. But I just thought I'd point out EK changed it because people got tired of the little circles.
> Be careful with the width on your top radiator. What case are you using? Might want to consider a lower profile version.
> I like these rads--they are very sturdy and perform well in tests and reviews, I'm using an XT45 280mm and an 30x360mm in the top of my Corsair 760T
> You may find with the tube reservoir that 60mm wide radiator + push/pull requires a lot of depth in your case.
> Primochill LRT Advanced
> http://www.frozencpu.com/products/17890/ex-tub-1623/PrimoChill_PrimoFlex_Advanced_LRT_Tubing_38ID_x_12_OD_-_10ft_Retail_Pack_-_Onyx_Black_PFLEXA10-12-BK_w_Free_Sys_Prep.html


Thanks for the above.

I'll be sticking with the case I have; it's a Corsair Air 540.

In terms of the CPU block, the reason I went with it is that it works with the theme I'm currently running.


----------



## electro2u

Started a build log for my current work in progress.


<<Tri-FireWater>> Corsair 760T, Z87 --> 97 Transition, AMD 295x2+290x, Aquacomputer,...


----------



## xer0h0ur

Nice, hopefully you're up and running soon, electro2u. I really do love that waterblock. If Aquacomputer had released the active backplate they promised, I would have gone with that block, just because it's that good-looking, and the idea of actively cooling the VRMs through the backplate is also badass.


----------



## electro2u

It's alive!!!


----------



## xer0h0ur

Lookin good


----------



## ViRuS2k

http://www.overclock.net/content/type/61/id/2146821/
Bigger image: http://i.imgur.com/PJ5UMne.jpg

My EK single-slot waterblock is working great now. I just added an Alphacool Monsta 420mm rad as external cooling with 3x Alaska Viper 140mm fans.

Added that to my existing loop of 2x 360mm rads, and I now have the cooling power to cool this card to perfection, with no throttling issues at all.
Max load: 62C on GPU core 1 and 65C max on GPU core 2.









Also delidded my i7-4790K, running @ 4.8GHz. Man, what a mission to delid these bloody things; I almost broke my hammer lol.


----------



## arieldeboca

You run two video cards with only a 1000W power supply?
Quote:


> Originally Posted by *electro2u*
> 
> 
> It's alive!!!


----------



## xer0h0ur

And I am running a single video card with a 1250W power supply. Your point?


----------



## ViRuS2k

Bigger image: http://i.imgur.com/NyXuM0z.png

Here is another image, guys, this time the temps at default card speeds (1030/1300) at default voltage.
I am only uploading these images for reference, so people know what to expect with high-end watercooling vs AIO watercooling.

----

Temps taken in extreme mode @ 3440x1440 resolution in Sleeping Dogs.
Very demanding once you crank up the res and extreme mode: 53C/57C.


----------



## ViRuS2k

Quote:


> Originally Posted by *arieldeboca*
> 
> you run only 2 vgacard with a 1000w power supply?


With a quality PSU, 1000W is fine for a 295x2 and a single 290x, though he would be pushing it to the brink if he overclocks both cards. I currently run a quality 1250W OCZ PSU and plan on getting another card very soon myself to add to my loop.
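For anyone sizing a PSU for these cards, a rough back-of-envelope check like the one below is worth doing. All wattages here are assumed ballpark TDP figures I've plugged in for illustration, not measurements of anyone's rig, and `psu_headroom` / `COMPONENT_WATTS` are hypothetical names, not from any tool:

```python
# Rough power-budget sanity check. Wattages are assumed ballpark TDPs.
COMPONENT_WATTS = {
    "r9_295x2": 500,   # AMD's quoted board power (assumed)
    "r9_290x": 290,    # reference 290X board power (assumed)
    "cpu_oc": 180,     # overclocked quad/hex-core under load (assumed)
    "misc": 75,        # pump, fans, drives, motherboard (assumed)
}

def psu_headroom(psu_watts, parts, margin=0.8):
    """Return (estimated draw, True if it fits within `margin` of capacity)."""
    total = sum(COMPONENT_WATTS[p] for p in parts)
    return total, total <= psu_watts * margin

# 295x2 + overclocked CPU on a quality 1000W unit: comfortable.
print(psu_headroom(1000, ["r9_295x2", "cpu_oc", "misc"]))        # (755, True)
# Add an overclocked 290X and you are past 80% of the label rating.
print(psu_headroom(1000, ["r9_295x2", "r9_290x", "cpu_oc", "misc"]))  # (1045, False)
```

The 80% margin is just a conservative habit (it keeps the PSU fan quieter and leaves room for overclocking transients), not a hard rule.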


----------



## electro2u

Actually I'm using a 1250w psu. Doesn't overkill just make you want to kick a puppy?


----------



## ViRuS2k

Quote:


> Originally Posted by *electro2u*
> 
> Actually I'm using a 1250w psu. Doesn't overkill just make you want to kick a puppy?


hahha probably not a puppy







I like puppies


----------



## electro2u

Yes... power supply noise just became a new thing for me. Pulling 1075W from the Seasonic X-1250 is enough to make the fan ramp up like a tornado.

VRMs on the 295x2 are... who knows? But I put Fujipoly Ultra on them. Used regular stuff on the 290x VRMs and they are 48/58 under extreme load (extended 1440p Ultra Valley). The three cores are all 58C.

Very, very impressed with the technology available to do this water cooling with. I went all out and it was worth it to me.


----------



## arieldeboca

Quote:


> Originally Posted by *xer0h0ur*
> 
> And I am running a single video card with a 1250W power supply, your point?


My question was worded wrong.
I meant to ask if you had two 295x2s on a 1000W supply. I thought you had two 295x2s, but now I see it's a 295x2 and a 290x.


----------



## Jpmboy

Quote:


> Originally Posted by *electro2u*
> 
> Yes... Power supply noise just became a new thing for me. Pulling 1075w from the Seasonic x-1250 is enough to make the fan ramp up like a tornado.
> 
> Vrms on the 295x2 are... Who knows? But I put Fujipoly ultra on them. Used regular stuff on the 290x vrms and they are 48/58 under extreme load (extended 1440p ultra valley). The three cores are all 58C.
> 
> *Very very impressed with the technology available to do this water cooling stuff with*. I went all out and it was worth it to me.


it's the only way to go with a high-end build.


----------



## Synthaxx

I did some more temperature testing with quadfire.

The cards currently run at 1100/1600, +40mV and +50 power target.

First I played BF4 for about 6 hours, then did two 3DMark Firestrike runs, and after that five Valley runs (extreme preset). At the end of the last Valley run, the cores were at 66/72/66/72 degrees. Ambient is about 25 degrees.

I did these tests because I wasn't sure why BF4 had insane fps drops after a while. For example, I start at a constant 200 fps; after 15 minutes it's at 150, after 30 minutes below 100 fps, and eventually I'm stuck at 40 fps. Yeah... 40 FPS!!!
When I quit the game, Origin always displays a fatal error...









First I thought of the VRMs, so I installed a leaf blower (not really, just a normal table fan), but that didn't help either.

So why the hell is BF4 running like crap after a while? If anyone has an idea, please share.


----------



## xer0h0ur

Didn't someone say there is a memory leak in the code? Something about an upcoming fix.


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> There is a memory leak with BF4 and performance degrades over time is what I'm hearing. Should be fixed soon.


----------



## xer0h0ur

So after fumbling around with my loop trying to get all the air bubbles out of the radiators, I noticed a few drops of coolant directly under the reservoir. My coolant level had dropped without any more air bubbles coming out of the radiators, tubing or the EKWB, so I figured I had a hole somewhere in the system. I cleaned up the coolant and checked every hose connection, only to find diddly poo. I was left scratching my head, so I figured I had spilled some coolant at some point while getting air pockets out and re-filling the reservoir over and over again.

Then yesterday, an hour after powering on the system, I saw coolant again in the same spot, so I pulled out the Thermaltake unit and there was the culprit: a hairline crack at the top of the reservoir, starting from the cap and extending forward an inch and a half. I guess at some point I overtightened the cap and caused the crack. I took a healthy amount of crazy glue to it, let it dry, then put a second coat on top of and around it. Once it was dry again I tested it for leaks, flipping the system upside down with a full reservoir. Finally: no leaks and no more air bubbles. Now I have to test all over again.

The moral of the story is: don't overtighten your reservoir, ya big dummy.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> So after fumbling around with my loop trying to get out all the air bubbles in the radiators I noticed a few drops of coolant directly under the reservoir. My coolant level had dropped without getting any more air bubbles coming out of the radiators, tubing or the EKWB so I figured I had a hole somewhere in the system. I cleaned the coolant leak and checked every hose connection only to find diddly poo. I was left scratching my head so I figured I had spilled some coolant at some point when I was getting air pockets out and re-filling the reservoir over and over again. Then yesterday an hour after powering on the system I saw coolant again in the same spot so I pulled out the Thermaltake unit and there was the culprit. A hairline crack at the top of the reservoir coming from the cap and extending forward an inch and a half. I guess at some point I overtightened the cap and caused the crack. I took a healthy amount of crazy glue to it, let it dry, then put a second coat on top and surrounding it. Once it was dry again I tested it for leaks, flipped the system upside down with a full reservoir. Finally no leaks and no more air bubbles. Now I have to test all over again. The moral of the story is don't overtighten your reservoir ya big dummy


AArghhh! You must be the unluckiest guy ever... first the fittings, then the waterblock, now the reservoir.
I thought I was unlucky having a leak twice, but you've been unlucky at every step.


----------



## electro2u

If it makes you feel any better, I've managed to somehow kill my Aquaero, which still powers on but is no longer outputting any power to anything. No idea how I did this.


----------



## xer0h0ur

They say bad luck happens in threes. For me it feels more like three times three


----------



## electro2u

Xero, you and me should go have lunch. I have luck similar to yours; it's how it took me so long to get up and running. Luckily, for the most part the manufacturers make products pretty stout. I figured out my fan controller was powering on but not distributing power because of a short in my power distribution. Smart little device. I also thought my res was defective and leaking from the faceplate; turned out to be the acrylic LED stop plugs I had in the top. They just weren't tight enough. I had about six different leaks in different spots from too-loose fittings. You did fine. My project has been slightly nightmarish.


----------



## Jpmboy

Takes a lot to kill an Aquaero - best controller made.


----------



## AussieCol

Hi guys,
Can anyone please help me with this issue: http://www.overclock.net/t/1509698/cant-see-both-gpus-with-radeon-r9-295x2-please-help

Anything would be appreciated!

Many thanks.


----------



## Mega Man

Been meaning to do this:

http://www.techpowerup.com/gpuz/details.php?id=hez4g

ATM I have one; thinking I might buy a second.

My A10 wants 2x 295x2.

I have to say I LOVE the stock shroud.
I would buy a shroud on its own if a manufacturer would just sell them :./


----------



## Synthaxx

Ahh, god damnit. The 4K monitor just arrived and now I noticed a leak.
I had already noticed the reservoir was dropping very slowly, so I started my search. One compression fitting was a little wet, so I tried tightening it. Suddenly a flood of water came through.
I'm unlucky and lucky at the same time, because I disconnected the power supply and made sure all residual electricity was gone by pressing the start button like 10 times.


----------



## Jpmboy

What's with all the leaks around here? Geez. Luckily, no "snap-crackle-pop" yet.


----------



## electro2u

Like 5 people in here are all doing their first water builds thanks to these cards lol. It's become the "leak test" club. I guarantee I had more leaks than any of you. I also apparently have a defective flow meter, and when I was rewiring the LEDs on my reservoir from Molex to a 3-pin fan connector, they seem to have died.


----------



## Jpmboy

Quote:


> Originally Posted by *electro2u*
> 
> Like 5 people in here all doing their first water builds thanks to these cards lol. It's become the "leak test" club. I guarantee I had more leaks than any of you. I also apparently have a defective flow meter and I was rewiring the LEDs on my reservoir from molex to 3 pin fan connector and they seem to have died.


it can be tragic to watch - but luckily no "kills" yet.


----------



## xer0h0ur

Yeah, it's been an adventure, no doubt. I am one of the guys doing their first loop because of this card. It's a learning experience if nothing else.

You live by Chicago, electro2u?

Sorry to hear about your leak, Synthaxx. At least you caught it before it turned tragic. Do let us know what your temps and framerates are like with that 4K monitor whenever you get around to testing.


----------



## Jazam

Did my first liquid cooling loop because of this card, and I had to build that little stand because it's scary how heavy the card is lol. Luckily, no leaks 1 week in. That's red PETG tubing and Revolver fittings.


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Sorry to hear about your leak Synthaxx. At least you caught it before it turned tragic. Do let us know what your temps and framerates are like with that 4K monitor whenever you get around to testing.


I just filled up the system and I'm almost through the bleeding process. I'm quite sure it was the Bitspower rotary 90° that was leaking through the rubber. I reapplied thermal paste, checked every fitting, and used Vaseline on the outside of the fittings. Since it repels water, I doubt there will be another leak...

Of course I will keep you up to date on the 4K monitor.









I can already say that overvolting this card causes massive temp increases: at +40mV the cores hit 72°, while at +25mV they average 64°.
No matter how fast I run the fans, I can't get the temp down...
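The disproportionate temperature jump from a small voltage bump is expected: to a first-order CMOS approximation, dynamic power scales roughly with f * V². A quick sketch (the 1.20V stock voltage is an assumed illustrative figure; the 1018MHz is the card's advertised boost clock; `relative_power` is a hypothetical helper):

```python
# Why a small voltage bump hurts: dynamic power scales roughly with f * V^2.
# First-order approximation only; real cards also have static/leakage power.
BASE_V = 1.20   # assumed stock core voltage, for illustration only
BASE_F = 1018   # advertised 295X2 boost clock, MHz

def relative_power(freq_mhz, vcore, base_f=BASE_F, base_v=BASE_V):
    """Estimated heat output relative to stock, using P ~ f * V^2."""
    return (freq_mhz / base_f) * (vcore / base_v) ** 2

# 1100 MHz at +40mV: about 15% more heat than stock per GPU.
print(round(relative_power(1100, BASE_V + 0.040), 3))  # 1.154
```

So roughly 15% more heat lands in the loop per GPU, which the fans can only shed by raising the water-to-air delta; that is why faster fans stop helping past a point.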


----------



## Synthaxx

OK... the beast is running again. Finally installed all the fans and LEDs.


----------



## xer0h0ur

Not gonna lie.

Little jelly.


----------



## electro2u

They really are scary heavy. At least with the stock setup the radiator was sort of holding mine up. With the giant hunk of copper I've slapped on it, it's perhaps even heavier, and I'm considering hanging it from a string to add support.


----------



## xer0h0ur

These full coverage waterblocks really are freakin heavy. Stiff as a board, but heavy. I don't know what I could do other than sticking a piece of acrylic under the card or perhaps rigging some fishing line.


----------



## Mega Man

Quote:


> Originally Posted by *Jazam*
> 
> 
> 
> 
> 
> 
> Did my first liquid cooling loop because of this card and I had to build that little stand because it's scary how heavy the card is lol. Luckily no leaks 1 weeks in. That's red Petg tubing and revolver fittings.


On mobile or I would hide the img.

I was gonna suggest fishing line
Quote:


> Originally Posted by *xer0h0ur*
> 
> These full coverage waterblocks really are freakin heavy. Stiff as a board, but heavy. I don't know what I could do other than sticking a piece of acrylic under the card or perhaps rigging some fishing line.


But he did

Or a reverse case labs case or one of the horizontal ones
Quote:


> Originally Posted by *Synthaxx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Sorry to hear about your leak Synthaxx. At least you caught it before it turned tragic. Do let us know what your temps and framerates are like with that 4K monitor whenever you get around to testing.
> 
> 
> 
> I just filled up the system and i'm almost through the bleeding process. I'm quite sure it was the bitspower rotary 90° was leaking trough the rubber. I reapplied thermal paste, checked every fitting and used vaseline on the outside of the fittings. Since it repels water I doubt there will be another leak...
> 
> Ofcourse I will keep you up to date with the 4k HD monitor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can already say that overvolting this card causes massive temp increases. At +40 mv, cores hit 72°, at +25 it averages 64°
> No matter how fast i put the fans, I can't get the temp down...
Click to expand...

You should consider a Case Labs build.

I have 5x 480s and just bought 100 4250 RPM GTs. Before install I will have them all modded to PWM.


----------



## Mega Man

ordered my second 295x2 tonight


----------



## Synthaxx

Quote:


> Originally Posted by *Mega Man*
> 
> ordered my second 295x2 tonight


hehe congratz!


----------



## Synthaxx

So I did some Test Range BF4 on the 4K monitor.

Settings: everything ultra, but 0xAA.

First I had flickering problems, but those magically disappeared. Then I noticed my mouse was completely off-center; I found out that was caused by Windows scaling, which was automatically set to 150%. I changed it back to 100% and the mouse is back on center in BF4... at the cost of a zoomed-out Windows.

The frames were quite amazing. I got in the helicopter and, while blowing things up, frames averaged around 160 fps. It got as low as 130 fps and as high as 200 fps.
Temps were still around 60 degrees, but there is still a lot of air in the system, so that might drop later on.


----------



## rdr09

Quote:


> Originally Posted by *Synthaxx*
> 
> So I did some test range BF4 on 4K hd.
> 
> Settings: everything ultra but 0xaa
> 
> First I had flickering problems but those magically dissapeared. Then I noticed my mouse was completely off center, I found out that was caused by windows scaling. It was automatically set to 150%. Changed it back to 100% and mouse is back on center in BF4 ... at the cost of a zoomed out windows.
> 
> The frames were quite amazing. I got in the helicopter and while blowing things up, frames were around 160 fps average. It got as low as 130 fps and higher as 200 fps.
> Temps were still around 60 degrees, but there is still a lot of air in the system, so that might drop later on.


Jelly - very. Would you get a similar experience or results with a 4K TV? I saw a 55-inch in a local Costco. Looks like the prices are starting to normalize.


----------



## Synthaxx

Quote:


> Originally Posted by *rdr09*
> 
> jelly - very. would you get a similar experience or results with a 4K TV? i saw a 55 inch in a local Costco. Looks like the prices are starting to normalize.


I guess so, but I'm not an expert when it comes to monitors or TVs.


----------



## cennis

Hey, can anyone comment on the VRM cooling capabilities of their waterblocks?

I am looking at the Aquacomputer one, but it is $60 more expensive than the XSPC Razor.

Based on 290/X experience, the Aquacomputer block was only good for VRMs IF the Aquacomputer backplate was used (to apply pressure); otherwise it ranks among the bottom few for VRM cooling, almost as bad as air cooling.

And currently there is no Aquacomputer backplate for the 295x2.

Source: http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


----------



## xer0h0ur

Well, I know my EK block is doing its job because overclocking the RAM does raise my gaming temperatures. I was messing around with 1700 MHz clocks, and while it's fully doable for me, I can't handle the heat since it jacks my temps back up into the 70s. 1600 MHz seems to be fine, but I just went back to 1500 MHz for the sake of being safe with my gaming temps. I just don't have the radiator real estate necessary to push my overclocks.


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I know my EK block is doing its job because overclocking the RAM does raise my gaming temperatures. I was messing around with 1700 MHz clocks and while its fully doable for me I can't handle the heat since it jacks up my temps back into the 70's. 1600 MHz seems to be fine but I just went back to 1500 MHz just for the sake of being safe with my gaming temps. I just don't have the radiator real estate necessary to push my overclocks.


I am more concerned about the VRM temps.

But just a side note: your core temps seem a little high.

A while back, I bought a spare set of 295x2 coolers from someone who went watercooling, and installed one cooler on each core, with the other blocks hanging.
With 1150/1700 on core/RAM at +100mV, I did not encounter any temps over 70C with a 26C ambient under Valley at 1440p.

This was using the 2x 120mm stock ALUMINUM rads/stock fans, one per core. Your result shouldn't be worse than mine.


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> I am more concerned about the VRM temps.
> 
> But just a side note, your core temps seems a little high.
> 
> awhile back, I bought a spare set of 295x2 cooler from some one who went watercooling, and installed one cooler to each core, with the other blocks hanging.
> with 1150/1700 on core/ram at +100mw I did not encounter any temps more than 70c with ambient 26c under valley 1440p.
> 
> This is using 2x 120mm stock ALUMINUM rads/stock fans , one for each core. your result shouldn't be worse than mine


My loop is also cooling a 4930K with a mild overclock as well as the two GPUs. I don't have the real estate in my case for another radiator; my only option would be an external radiator routed out through an empty PCI slot. We're not using identical setups, so you can't assume performance should be equal.
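The "radiator real estate" constraint can be roughed out with the common hobbyist rule of thumb of around 125W of heat dissipated per 120mm of radiator at moderate fan speeds. That figure is an assumption (thick rads with fast fans do considerably better), and `loop_fits` is a hypothetical helper, but it shows why a 295x2 plus an overclocked CPU overwhelms a small loop:

```python
# Rule-of-thumb radiator capacity: ~125 W per 120 mm of rad at moderate
# fan speeds (assumed figure; varies a lot with thickness and fan RPM).
WATTS_PER_120MM = 125

def loop_fits(heat_watts, rad_mm_total):
    """True if the loop's heat load fits the estimated radiator capacity."""
    return heat_watts <= (rad_mm_total / 120) * WATTS_PER_120MM

# 295X2 (~500 W) + mildly overclocked 4930K (~150 W) on 120 + 144 mm of rad:
print(loop_fits(650, 120 + 144))  # False - far over a ~275 W estimate
# The same load on a 360 + 240 setup is roughly at the limit:
print(loop_fits(650, 360 + 240))  # False - ~625 W estimate, still just short
```

Under these assumptions the loop still runs; it just settles at a higher water temperature, which matches the mild-overclocks-only behavior described above.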


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> My loop is also cooling a 4930K with a mild overclock as well as the two GPUs. I don't have the real estate in my case for another radiator. My only option would be an external radiator routed out through an empty PCI slot. Were not using identical setups so you can't assume performance should be equal.


Right, I did not take your CPU into consideration. It looks like you have 2x 120mm?


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> right, did not put your cpu into consideration.it looks like you have 2x120mm?


It's a 144mm aluminum radiator and an Alphacool NexXxoS Monsta 120 copper rad - one of those 80mm-thick rads they make. Basically, as long as I stick to mild overclocks my temps remain in the mid to high 50s, occasionally getting into the low 60s. Granted, I have yet to push 1440p or 4K resolutions since this old monitor only goes to 1600x1200, so I am sure that will change when I pony up the cash for the 4K screen.


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its a 144mm aluminum radiator and an Alphacool NexXxos Monsta 120 copper rad. One of those 80mm thick rads they make. Basically as long as I just use mild overclocks my temps remain in the mid to high 50's with and occasionally into the low 60's. Granted I have yet to push HD resolutions since this old monitor only goes to 1600x1200 so I am sure that will change when I pony up the cash for the 4K screen.


Nice. Did you have any problems with an aluminum rad and a copper rad in the same loop?


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> nice, did you have any problems with aluminum rad and cooper rad in the same loop?


Not thus far, but this loop is only days old in this setup. I am using EK's mixed-metal coolant and a silver kill coil. I will be flushing and replacing the coolant every three months for the sake of avoiding problems. If it turns out to be problematic, then I will say screw it, take off the Thermaltake unit, and install my own pump, reservoir and a second copper rad.

I am using a Noctua NF-F12 industrialPPC 2000 RPM PWM fan on the Monsta and the stock Thermaltake fan on the aluminum rad. Today I should get the second Noctua I ordered, so I am hoping the increased static pressure can drop the temps a degree or two.


----------



## electro2u

Xero, not trying to tell you what to do, but you should read about mixing aluminum with copper, bro. Then you should immediately remove the Alu from your loop. No one uses aluminum in water cooling loops anymore. It's a galvanic-corrosion ticking time bomb waiting to happen.


----------



## Jpmboy

Quote:


> Originally Posted by *Mega Man*
> 
> ordered my second 295x2 tonight


awesome!! amazing power with quad cfx!
Quote:


> Originally Posted by *Synthaxx*
> 
> So I did some test range BF4 on 4K hd.
> 
> Settings: everything ultra but 0xaa
> First I had flickering problems but those magically dissapeared. Then I noticed my mouse was completely off center, I found out that was caused by windows scaling. It was automatically set to 150%. Changed it back to 100% and mouse is back on center in BF4 ... at the cost of a zoomed out windows.
> The frames were quite amazing. I got in the helicopter and while blowing things up, frames were around 160 fps average. It got as low as 130 fps and higher as 200 fps.
> Temps were still around 60 degrees, but there is still a lot of air in the system, so that might drop later on.


nice - with one, I get 80-120 with the same settings in large conquest. 160fps @ 4K is sick.


----------



## Jpmboy

Quote:


> Originally Posted by *cennis*
> 
> nice, did you have any problems with an aluminum rad and a copper rad in the same loop?


Quote:


> Originally Posted by *xer0h0ur*
> 
> Not thus far but this loop is like days old at this setup. I am using EK's mixed metal coolant and a silver kill coil. I will be flushing and replacing the coolant every three months for the sake of avoiding problems. If it turns out to be problematic then I will say screw it and just take off the thermaltake unit and install my own pump, reservoir and 2nd copper rad.
> 
> I am using a Noctua NF-F12 Industrial PPC 2000 RPM PWM fan on the Monsta and the stock thermaltake fan on the aluminum rad. Today I should get in the 2nd Noctua I ordered so I am hoping the increased static pressure can drop the temps a degree or two.


Quote:


> Originally Posted by *electro2u*
> 
> Xero, not trying to tell you what to do, but you should read about mixing aluminum with copper, bro. Then you should immediately remove the Alu from your loop. No one uses aluminum in water cooling loops anymore. It's a galvanic-corrosion time bomb waiting to happen.


Just FYI (a bit of myth busting): you can only get galvanic electrolysis if the two metals (with appropriate electrode potentials, which Cu and Al have) are both in contact with the solution (the electrolyte) AND in direct electrical contact/connection (e.g., physically through direct contact or through a wire circuit of ANY type). Without the direct electrical contact there is no galvanic effect. What's possibly an issue is poor control of the pH of your coolant. Get some test strips from any pool store and stay above 6.8. Most corrosion is bad pH and dead spots in the flow. Stagnant coolant is very bad.

I ran Al + copper for ~2 years (all Aquacomputer stuff: solid copper blocks on 7970s and their 720 Mark III external tower) with zero corrosion on the blocks, and the 720XT is still running (24/7) cooling the 295x2 in this rig. Used AQ pre-mix in BigPicture, and DW + 3% Redline Water Wetter in "ParkBench".
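The two-condition rule above can be expressed as a toy check. This is a sketch only: the metal names, electrode potentials, and the 0.15 V threshold are illustrative assumptions, not a corrosion model.

```python
# Toy check of the two conditions described above: a galvanic cell needs
# BOTH a shared electrolyte AND direct electrical contact between the
# dissimilar metals. Potentials are approximate textbook values (volts);
# the 0.15 V threshold is an illustrative assumption.
POTENTIALS = {"aluminum": -1.66, "copper": 0.34, "nickel": -0.26}

def galvanic_risk(metal_a, metal_b, shared_electrolyte, electrical_contact):
    """Return True only when every condition for a galvanic cell is met."""
    if not (shared_electrolyte and electrical_contact):
        return False  # breaking either condition breaks the circuit
    # A larger potential difference means a stronger driving force
    return abs(POTENTIALS[metal_a] - POTENTIALS[metal_b]) > 0.15

# Al rad + Cu block sharing coolant, but electrically isolated:
print(galvanic_risk("aluminum", "copper", True, False))  # False
# Same pair, but both grounded through the chassis:
print(galvanic_risk("aluminum", "copper", True, True))   # True
```

In other words, the mixed-metal Thermaltake kit can get away with it only as long as neither the shared coolant nor the electrical path is present on its own.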


----------



## xer0h0ur

Thanks for the information. It really didn't make sense to me that Thermaltake would sell a water cooling system pairing an aluminum radiator with a nickel-plated copper waterblock if galvanic corrosion were such an issue.

The only thing I didn't really fully understand was "thru a wire-circuit of ANY type". What does that mean? I already know nothing is making direct contact between the copper and the aluminum in my loop. It's just that other part I don't know.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Thanks for the information, it really didn't make sense to me that Thermaltake was selling a water cooling system that used an aluminum radiator with a nickel plated copper waterblock if galvanic corrosion was such an issue.
> 
> The only thing I didn't really fully understand was "thru a wire-circuit of ANY type". What does that mean? I already know nothing is making direct contact between the copper and the aluminum in my loop. Its just that other part I don't know.


the two metals need to be in electrical contact - e.g., your continuity tester will beep







- in addition to being in the same electrolyte solution. One or the other is insufficient to create a galvanic loop; both conditions need to be met.


----------



## xer0h0ur

Oh okay, gotcha. Well, at least that puts my mind at ease. I will be sure to buy some pH testing strips and test my coolant. It's always cool to learn new things, and I sure have learned a lot since beginning this water cooling loop.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh okay, gotcha. Well at least that puts my mind at ease. I will be sure to buy some pH testing strips and test my coolant. Its always cool to learn new things and I sure have learned a lot since beginning this water cooling loop.


The silver coil is another gimmick... and not good to use with nickel. It's supposed to kill growth of anything "green"... well, any trace of copper in the loop has the same exact effect at the molecular/biochemical level. Both are toxic to anything with chlorophyll. If your coolant has ethylene glycol... toxic to anaerobes too. lol, any other life forms please call the CDC.









If you haven't already been there, this is a great resource and links to the OCN liquid coolers thread.


----------



## Mega Man

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Synthaxx*
> 
> So I did some test range BF4 on 4K hd.
> 
> Settings: everything ultra but 0xaa
> 
> First I had flickering problems but those magically dissapeared. Then I noticed my mouse was completely off center, I found out that was caused by windows scaling. It was automatically set to 150%. Changed it back to 100% and mouse is back on center in BF4 ... at the cost of a zoomed out windows.
> 
> The frames were quite amazing. I got in the helicopter and while blowing things up, frames were around 160 fps average. It got as low as 130 fps and higher as 200 fps.
> Temps were still around 60 degrees, but there is still a lot of air in the system, so that might drop later on.
> 
> 
> 
> jelly - very. would you get a similar experience or results with a 4K TV? i saw a 55 inch in a local Costco. Looks like the prices are starting to normalize.
Click to expand...

no, TVs are 30Hz only unless a new one has come out with DP (meaning 30 FPS ONLY)
Quote:


> Originally Posted by *cennis*
> 
> Hey, can anyone comment on the vrm cooling capabilities with their waterblocks?
> 
> I am looking at the aquacomputer one but it is 60$ more expense than the xspc razor.
> 
> based on 290/x experience, the aquacomputer block was only good for vrm IF the aquacomputer backplate was used (to apply pressure)? otherwise it ranks one of the bottom few for vrm cooling almost as a bad as air cooling
> 
> and currently there are no aquacomputer backplate for 295x2.
> 
> source. http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


it was the additional ACTIVE-cooling backplate, not the regular backplate, that made it the best, as it was semi-actively cooling the back as well

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Oh okay, gotcha. Well at least that puts my mind at ease. I will be sure to buy some pH testing strips and test my coolant. Its always cool to learn new things and I sure have learned a lot since beginning this water cooling loop.
> 
> 
> 
> Silver coil is another gimmick... and not good to use with nickel. It's supposed to kill growth of anything "green"... well, any trace of copper in the loop has the same exact effect at the molecular/biochemistry. Both are toxic to anything with chlorophyll. If your coolant has ethylene glycol.... toxic to anaerobes. lol, any other life forms please call the CDC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you haven't already been there, This is a great resource and links to the OCN liquidcoolers thread.
Click to expand...

agreed, specifically read this:
http://martinsliquidlab.org/2012/01/24/corrosion-explored/

also to note, your block is attached to ground, which can (very specifically said can, not will) be all the oomph needed for galvanic corrosion
Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> ordered my second 295x2 tonight
> 
> 
> 
> awesome!! amazing power with quad cfx!
Click to expand...

thanks! although i still have my 5 290Xs and 5 7970s. no, i never have mined, nor do i mine; i dont even know how >.>


----------



## Jpmboy

And... nobody makes PC cooling rads with Al tubing. Al fins, sure, but they use copper tubing. So any name-brand aluminum radiator is very unlikely to have Al as a coolant-contact surface. Fortunately, I have never experienced galvanic (or any other form of) corrosion over many years of water cooling.

@mega man - that's a h3ll of a collection of AMD you got there.


----------



## Mega Man

thanks, i went overboard. but i love single-slot vgas, so i had to get the 295x2, as they made the 290x i/o with so much fail

this is the i/o all VGAs in today's world *SHOULD* have


----------



## Synthaxx

While playing Crysis 3, both at 4xAA and 2xAA, 1 core seems to be throttling.
The cores are all high 60s to mid 70s. This is of course with the +25mV 1100/1600 OC.
The frames average above 60 fps though ...


----------



## rdr09

Quote:


> Originally Posted by *Mega Man*
> 
> thanks i went overboard, but i love single slot vgas, so i had to get the 295x2 as they made the 2900x i/o with so much fail
> 
> this is the IO all VGAs in todays world *SHOULD* have
> 
> 
> Spoiler: Warning: Spoiler!


thanks Mega for the info on the 4K TV. i use a 24in TV in one of my rigs and it switches from 1080 30 to 1080 60 when i play a game.

you do have quite a collection.


----------



## xer0h0ur

Quote:


> Originally Posted by *Synthaxx*
> 
> While playing crysis 3, both at 4xaa and 2xaa, 1 core seems to be throttling.
> The cores are all high 60s to medium 70s. This is ofcourse with the +25mv 1100/1600 OC.
> The frames average above 60fps though ...


It really is confusing to me that the air-cooled 290X has a much higher thermal throttling temperature than the 295X2. I mean, it's a 20C difference on the same cores.


----------



## Mega Man

well, they have to keep the power usage down on only 2 8-pins; i would have preferred 4-5 8-pins myself

i am sure the VRMs come into play as well, 2 times as much to cool and keep cool

gonna be ordering waterblocks probably tonight.

but for the thermal pad for the back of the socket for the backplate, i am deciding if i want to drop 300 on the ultra extreme or just the extreme,

there is a part of me that says yes and another that says no.... undecided

would make it easy for me to put a block on my CPU rear.... that is the #1 pro.... cost is the #1 con (1.5mm)


----------



## xer0h0ur

Well, I went with the Ultra Extreme pads and they are doing their job, because it sure as hell is transferring that heat into my loop.

I may end up taking a page out of your playbook and use that Koolance radiator mounting bracket with the quick release to stick a 360 rad back there externally. Perhaps even go back to isolating the loops so the GPU loop is completely separated from the CPU's. If I eliminate one of my HDD cages then it would make space for a 2nd pump and reservoir or hell if I removed both cages and just mounted my HDDs my own way then for sure I would have the space.


----------



## Mega Man

dont do that, go 480 in back !!!!!

or go caselabs and never have the need to mount one externally


----------



## Synthaxx

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I went with ultra extreme pads and they are doing their job because its sure as hell transferring that heat into my loop.
> 
> I may end up taking a page out of your playbook and use that Koolance radiator mounting bracket with the quick release to stick a 360 rad back there externally. Perhaps even go back to isolating the loops so the GPU loop is completely separated from the CPU's. If I eliminate one of my HDD cages then it would make space for a 2nd pump and reservoir or hell if I removed both cages and just mounted my HDDs my own way then for sure I would have the space.


Next time I drain the loop, I'll order some fujipoly pads too.
Either I'm limited by the heat transfer of the current pads or by the copper of the waterblock.

BF4 is giving me a hard time lately. I disabled crossfire, restarted the PC and went into BF4.
Suddenly I get errors that the GPU was lost ... I checked my PC ... nope, still there









frames with 1 card in fire range on 4k, 2xaa are about 70+ average


----------



## xer0h0ur

A 480 sounds like super overkill when I am already using an 80mm thick 120 and a 144 rad. I would only go with a 480 if I stopped using the Monsta.


----------



## Mega Man

you can never have too much overkill !


----------



## xer0h0ur

Well if I was trying to run a dead silent setup then sure I could do that and drop my RPMs to 1000 or less.


----------



## axiumone

Holy crap. I'm not sure if anyone has seen this, but Microcenter is offering a Diamond 295x2 for $1,099! You can order online too; it's not an in-store-only thing. Man, that's some savings over what most of us paid. I didn't expect it to drop so much, so quickly.









Diamond - $1,099
http://www.microcenter.com/product/433537/AMD_Radeon_R9_295X2_PCIE_GDDR5_8_GB_Dual_GPU_Video_Card

Asus - $1,149
http://www.microcenter.com/product/432984/AMD_Radeon_R9_295X2_8GB_PCI-Express_Video_Card

I wonder if this is a labor day sale special.


----------



## Mega Man

Quote:


> Originally Posted by *axiumone*
> 
> Holy crap. I'm not sure if anyone has seen this, but Microcenter is offering a Diamond 295x2 for $1,099! You can order online too; it's not an in-store-only thing. Man, that's some savings over what most of us paid. I didn't expect it to drop so much, so quickly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Diamond - $1,099
> http://www.microcenter.com/product/433537/AMD_Radeon_R9_295X2_PCIE_GDDR5_8_GB_Dual_GPU_Video_Card
> 
> Asus - $1,149
> http://www.microcenter.com/product/432984/AMD_Radeon_R9_295X2_8GB_PCI-Express_Video_Card
> 
> I wonder if this is a labor day sale special.


wow, nice catch


----------



## Rohandy

Quote:


> Originally Posted by *axiumone*
> 
> Holy crap. I'm not sure if anyone has seen this, but Microcenter is offering a Diamond 295x2 for $1,099! You can order online too; it's not an in-store-only thing. Man, that's some savings over what most of us paid. I didn't expect it to drop so much, so quickly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Diamond - $1,099
> http://www.microcenter.com/product/433537/AMD_Radeon_R9_295X2_PCIE_GDDR5_8_GB_Dual_GPU_Video_Card
> 
> Asus - $1,149
> http://www.microcenter.com/product/432984/AMD_Radeon_R9_295X2_8GB_PCI-Express_Video_Card
> 
> I wonder if this is a labor day sale special.


I second that, it's a really nice catch. I picked up my Gigabyte card from one of their stores, and now I have to contemplate whether it's worth removing my EK water block for the roughly $400 savings.


----------



## Rohandy

Quote:


> Originally Posted by *Rohandy*
> 
> I second that, it's a really nice catch. I picked up my Gigabyte card from one of their stores, and now I have to contemplate whether it's worth removing my EK water block for the roughly $400 savings.


It turns out my card is also on sale, for $1,149.99.







I hope these prices hold up long enough for me to get a price adjustment. Thanks, axiumone, I'll keep you posted; you might have just saved me $350.









http://www.microcenter.com/product/432783/AMD_Radeon_R9_295X2_8192MB_GDDR5_PCIe_30x16_Video_Card


----------



## axiumone

Actually guys, you may want to hold off for a few days. Techpowerup just posted an article about the 295x2 dropping to a $999 price point!

http://www.techpowerup.com/mobile/204802/amd-radeon-r9-295x2-to-get-promotional-price-open-to-all.html


----------



## xer0h0ur

Methinks the club is about to expand. I suddenly wish I had bought a stronger PSU for a second one of these.


----------



## Rohandy

Quote:


> Originally Posted by *axiumone*
> 
> Actually guys, you may want to hold off for a few days. Techpowerup just posted an article about the 295x2 dropping to a $999 price point!
> 
> http://www.techpowerup.com/mobile/204802/amd-radeon-r9-295x2-to-get-promotional-price-open-to-all.html


Wow, talk about a power move in anticipation of Nvidia's new 900 series! This should be interesting... Not to mention the Z getting a price drop to about $1800 as well, still leaving AMD at an advantage on price.


----------



## Rohandy

Quote:


> Originally Posted by *axiumone*
> 
> Actually guys, you may want to hold off for a few days. Techpowerup just posted an article about the 295x2 dropping to a $999 price point!
> 
> http://www.techpowerup.com/mobile/204802/amd-radeon-r9-295x2-to-get-promotional-price-open-to-all.html


Well I just got back from microcenter with an extra $380 in my pocket.







If this was my second card I'd wait to see if the price goes lower, but I didn't want to take any chances on missing out on the current savings. Just glad I was still within the return period, and now I feel like I got an even better deal at just $1,150.00, plus tax of course.


----------



## Mega Man

even with it dropping to 999, i got 2x 840 EVO 500GB and 2x LEPA G1600 with mine (both total, not each), so i feel i came out on top

so i posted my validation but never got added.
:/
you can see in AOD there are only 2 GPUs


http://www.techpowerup.com/gpuz/details.php?id=egwe
http://www.techpowerup.com/gpuz/details.php?id=hn8yr
http://www.techpowerup.com/gpuz/details.php?id=4dc7
http://www.techpowerup.com/gpuz/details.php?id=4cu27


----------



## Skinnered

What do you guys think? Would it make sense to go Haswell-E from Z87? I wonder if the extra PCIe lanes will make a difference. I have two R9 295X2s and game in 4K.


----------



## electro2u

Quote:


> Originally Posted by *Skinnered*
> 
> What do you guys think. Would it make sense to go Haswell-E from Z87? I wonder if the extra PCI lanes will make a didfference. I have two R295x2's and gaming in 4K.


No idea really... These cards are so weird it might make things worse. Theoretically it should help by taking the onboard PLX chips out of the equation, since you'll be running natively at x16/x16.

On an unrelated note: holy crap, Folding@home throws out a ton more heat than silly 3D benchmarks like Valley.

The trifire setup doesn't completely overwhelm my loop but it heats up my room like a fireplace even when I drop the core clocks a lot. It will be perfect for a heater in the winter.


----------



## Mega Man

pretty sure it will use the PLX even with x16/x16

also found this ! it says 295x2


----------



## stxe34

hi, just a little info.. i have two of these cards and had the Aquatuning water blocks installed. the cards were around 65-70C, which is still quite hot. i now have the EK blocks and they are so much better; they run at 45-50C at full load for over an hour. so if anyone's blocking their cards i would suggest the EK ones!

just a question: can you crossfire 3 or 4 of these cards?
thanks


----------



## Mega Man

you can only CFX 4 GPUs, which in the case of the 295x2 means 2 physical cards, unless you are talking about 290Xs

as to your blocks, it sounds like either you installed them incorrectly or your pump is not powerful enough; you should be maxing @ ~40-50C

or you simply dont have enough rad space


----------



## stxe34

Quote:


> Originally Posted by *Mega Man*
> 
> you can only CFX 4 GPUs which means in the case of the 295x2 2 physical cards, unless you are talking about 290xs
> 
> as to your blocks sounds like either you installed them incorrectly or your pump is not powerful enough you should be maxing @ ~ 40-50c
> 
> or you simply dont have enough rad space


i have 2 D5s and 2 480x60 Alphacool rads, so i think my pumps and rad space are enough
they were mounted correctly, thats why i am saying the EKs are so much better!


----------



## electro2u

It's Aquacomputer; Aquatuning is a retailer. These blocks have been professionally tested and compared, and at worst they are about equal. Thanks for your info though.


----------



## stxe34

Quote:


> Originally Posted by *electro2u*
> 
> It's Aquacomputer. Aquatuning is a retailer. These blocks have been professionally tested and compared and at worst they are about equal. Thanks for your info though.


well not from my experience!


----------



## xer0h0ur

That is way too large of a temperature variance. Something was wrong in that first setup.


----------



## stxe34

they were mounted several times as i tried different thermal pastes! nothing changed, maybe 1 or 2 degrees! they were sent back as the nickel started coming away from the block, so maybe that has something to do with it


----------



## LegacyLG

Hey all

Could this rig setup run for 5 years without needing upgrading?

I'd lower settings as time went by to help it last longer

Even overclock as a last resort

Case..

Corsair 750D ATX Full Tower Case

Motherboard..

Asus RAMPAGE V EXTREME EATX LGA2011-3 Motherboard

Processor (CPU)..

Intel Core i7-5820K 3.3GHz 6-Core Processor

Processor (CPU) Cooler..

NZXT Kraken X61 106.1 CFM Liquid CPU Cooler

Memory (RAM)..

G.Skill Ripjaws Series 16GB (4 x 4GB) DDR4-2666 Memory

Video Card (GPU)..

Gigabyte Radeon R9 295X2 8GB Video Card
(Quadfire, but fewer problems than other quadfire setups since no bridge is needed)
Gigabyte Radeon R9 295X2 8GB Video Card
(Also outranks Nvidia, even surprises their fan club)

Storage..

Intel DC S3500 Series 240GB 2.5" Solid State Drive

Seagate Barracuda 3TB 3.5" 7200RPM Internal Hard Drive

Power Supply..

Enermax Platimax 1500W 80+ Platinum Certified Fully-Modular ATX Power Supply

Optical drive..

Asus BW-16D1HT Blu-Ray/DVD/CD Writer

Operating system

Windows 8.1

Monitors for Eyefinity..

ViewSonic VA2246M-LED 60Hz 22.0" Monitor

ViewSonic VA2246M-LED 60Hz 22.0" Monitor

ViewSonic VA2246M-LED 60Hz 22.0" Monitor

All together is £4200


----------



## SPLWF

FYI, AMD just slashed the price of this by $500. It's now $1000 USD

http://www.tomshardware.com/news/amd-r9-295x2-promotion,27588.html


----------



## LegacyLG

If it ain't clear, I'll be running 3-way Eyefinity


----------



## LegacyLG

Hey all

Same build as my post above; all together it's £4200.

The PS4 will last the next 4 years

So why wouldn't this setup?


----------



## Jpmboy

Quote:


> Originally Posted by *stxe34*
> 
> hi, just a little info.. i have two of these cards and had the Aquatuning water blocks installed. the cards were around 65-70C, which is still quite hot. i now have the EK blocks and they are so much better; they run at 45-50C at full load for over an hour. so if anyone's blocking their cards i would suggest the EK ones!
> just a question can you crossfire 3 or 4 of these cards?
> thanks


The AQ blocks are fine. You should check the mount. I have the Koolance and am always at ~ambient +30C.

no, you can't "hexa- or octa-fire" ... and even if you could, the PCIe bridge would choke (no external CFX bridge to unload it to)


----------



## MunneY

FYI, the X99 is $999.99 on the Egg.


----------



## joeh4384

At $1000, it is tempting to sell my 780 Ti and pick one up. I guess I will keep monitoring Amazon lol.


----------



## Mega Man

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *electro2u*
> 
> It's Aquacomputer. Aquatuning is a retailer. These blocks have been professionally tested and compared and at worst they are about equal. Thanks for your info though.


Quote:


> Originally Posted by *stxe34*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> you can only CFX 4 GPUs which means in the case of the 295x2 2 physical cards, unless you are talking about 290xs
> 
> as to your blocks sounds like either you installed them incorrectly or your pump is not powerful enough you should be maxing @ ~ 40-50c
> 
> or you simply dont have enough rad space
> 
> 
> 
> i have 2 d5's and 2 480x60 alphacool rads so i think my pumps and rad space is enough
> they were mounted correctly thats why i am saying the ek's are so much better!
Click to expand...

Quote:


> Originally Posted by *stxe34*
> 
> Quote:
> 
> 
> 
> Originally Posted by *electro2u*
> 
> It's Aquacomputer. Aquatuning is a retailer. These blocks have been professionally tested and compared and at worst they are about equal. Thanks for your info though.
> 
> 
> 
> well not from my experience!
Click to expand...

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is way too large of a temperature variance. Something was wrong in that first setup.


Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stxe34*
> 
> hi just a little info..i have two of these cards and had the aqua tuning water blocks installed. the cards were around 65-70c which is still quite hot. i now how the ek blocks and they are so much better. they run at 45-50c at full load for over an hour. so if anyone's blocking there cards i would suggest the ek ones!
> just a question can you crossfire 3 or 4 of these cards?
> thanks
> 
> 
> 
> AQ blocks are fine. You should check the mount. I have the koolance and always ~ ambient +30C
> 
> no, you can't "hexa or octa-fire" ... and even if you culd, the PCIE bridge would choke. (no external CFX bridge to unload it to)
Click to expand...




yep


----------



## SynchroSCP

Very tempted; disappointed with the rumored specs on the next gen from team green, which I've been waiting on to go SLI and dual 1440p or 4K.

My EVGA 850W can provide 71 amps on the 12V rail, and from what I've read this card needs 28 amps per 8-pin connector, which should be fine. Those of you that have this card: will this work out, or is a higher-rated supply really necessary?
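For a rough sanity check of those numbers, here's a back-of-envelope 12V budget. This is only a sketch: the 500 W figure is the card's rated board power, while the CPU and "rest of system" numbers are assumed round figures, not measurements.

```python
# Back-of-envelope 12V rail budget for one R9 295X2 on an 850W PSU.
# RAIL_AMPS comes from the post above; CPU_W and REST_W are
# hypothetical round numbers, not measurements.
RAIL_AMPS = 71      # EVGA 850W single 12V rail (from the post)
GPU_W = 500         # R9 295X2 rated board power
CPU_W = 150         # assumed overclocked CPU draw
REST_W = 50         # fans, drives, motherboard

total_amps = (GPU_W + CPU_W + REST_W) / 12.0
print(f"{total_amps:.1f} A of {RAIL_AMPS} A")  # ~58 A: tight but workable
```

Around 58 A of a 71 A rail leaves some headroom for one card, which lines up with the replies below; doubling the GPU term makes it obvious why two cards trip OCP on even bigger units.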


----------



## Mega Man

with one you would be ok

depending on your CPU and OC from there

with 2 i trip OCP on 1kW without even trying; i cant even run Heaven, within 0.5s it shuts down


----------



## Sideways8LV

Looks like newegg.com just dropped the Gigabyte and XFX models down to $1000. Nice.

XFX Core Edition


----------



## joeh4384

Out of stock instantly.


----------



## Sideways8LV

LOL damn. If the price holds, this is a definite contender for me.


----------



## BrotherBeast

Ordered an XFX model last night. Haven't gotten a shipping notice yet. Hope they have enough stock to fulfill my order. I've always been a fan of the dual-GPU cards. I'm running a 5970 right now in my backup rig.


----------



## Mega Man

i have to say i am unimpressed,

they claim it is unlocked, but where is said voltage control?

if only the 290x had the same i/o, or better: no DVI and just 6 DP..... so much better, and you would have voltage control


----------



## xer0h0ur

Not sure if I'm misunderstanding you, but the voltage controls are in the OverDrive tab of CCC. At least with the RC3 14.7 drivers.


----------



## joeh4384

I just ordered the XFX 295x2. I think I am going to run it in trifire with my MSI Gaming 290x. I am assuming I should run the 295x2 in the top slot.


----------



## Sideways8LV

*jealous* I need a couple weeks of saving, and I don't know when this temporary price drop will end.


----------



## soulwrath

I am considering picking up a 295x2 also ;D


----------



## Sketchus

Hello 295x2 owners







Looking for some advice, here's my situation:

I currently have two 7970s in Xfire. I play on a single 1440p monitor. I have another monitor and I'm considering adding a third; I'll probably only have each screen at 1080p. Starting tomorrow the 295x2 SHOULD get a big price drop in the UK (about £300-400). I'm sorely tempted by it. Do you guys think it's worth the upgrade? I know Nvidia supposedly have their new cards coming soon, but I can't for the life of me see two 980s competing with this, considering they'll likely be £400 each.

What do you think? Wait or buy? I know it's probably pertinent to wait, but here the drop is a limited-time offer.

Thanks all









Forgot to add: I have a Corsair HX 750W. Apparently that's the bare minimum wattage for this GPU, but I heard as long as it's a good PSU it's OK?


----------



## xer0h0ur

Since the 5960X has 40 PCIe 3.0 lanes, am I right to assume two 295X2s would be running at x16 each?


----------



## Syceo

Oh lordy lord... just picked up a 290X to trifire with this 295x2, a 4K Asus monitor, and all the other bits and bobs for watercooling: rads, fittings, etc. First time watercooling, guys, so just wanted to ask a quick one: can I use any distilled water (just bought 5 litres off Amazon) and a silver coil? My tubing is black so no dyes. So nervous about doing this loop, but hey ho, here we go.


----------



## Sideways8LV

Wow good luck! Do a build thread with lots of pics!


----------



## Syceo

Cheers Buddy.


----------



## Syceo

Does anyone think i may need an additional rad for this furnace (295x2 + 290X in trifire)? Currently got on order 1x Hardware Labs Black Ice Nemesis Radiator GTS 240 and 1x Hardware Labs Black Ice Nemesis Radiator GTS 360. All GPUs will be running at stock, both rads in push-pull.

Any recommendations would be great.


----------



## electro2u

Trifire with a 360+240/280 is pretty peachy. Just don't try folding on all 3 and leave the house


----------



## Syceo

Thanks ..

Just out of curiosity electro... did you use a Bitspower G 1/4" Matte Black Triple Rotary 90 Degree G 1/4" Adapter for your flow bridge ?


----------



## pompss

Quote:


> Originally Posted by *Syceo*
> 
> Does anyone think i may need an additional rad for this furnace ( 295x2 + 290x in tri fire), currently got on order 1x Hardware Labs Black Ice Nemesis Radiator GTS 240 and 1x Hardware Labs Black Ice Nemesis Radiator GTS 360. All gpu's will be running at stock both rads in push pull
> 
> Any recommendations would be great.


You should be fine.








Be sure you flush the rads from Hardware Labs 5-6 times. I hear they are pretty dirty inside.


----------



## Syceo

Quote:


> Originally Posted by *pompss*
> 
> You should be fine.
> 
> 
> 
> 
> 
> 
> 
> 
> Be sure you flush the rads form hardware labs 5-6 times. I hear they are pretty dirty inside.


Thanks,,,,

A hot water flush from the tap, or boil up some distilled?


----------



## pompss

Quote:


> Originally Posted by *Syceo*
> 
> Thanks,,,,
> 
> A hot water flush from the tap? or boil up some distilled ??


Hot tap water is fine. Also, any DI water from Publix or any supermarket is fine; it's the same thing but much cheaper.
Also, before you install the rad, be sure you flush it again with DI (distilled) water two more times.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Thanks ..
> 
> Just out of curiosity electro... did you use a Bitspower G 1/4" Matte Black Triple Rotary 90 Degree G 1/4" Adapter for your flow bridge ?


Can you believe that worked?! I could hug you for noticing. It was not easy to get the rotary connectors tight. In fact, my fingers partly discolored the knobby part, because Bitspower rotaries are extremely stiff but work well. It leaked, along with the fitting line from the CPU to the MOSFET block, during the test, but the leaks were small and fairly easy to fix. My advice to any first-time water cooler working with Bitspower fittings is to go tighter than you think you should. I was thinking they could be just barely tight. No. They will leak from not being tight enough, and I have had 0 problems with fittings being too tight.

The best part of the whole noobie first-fill process was me repeatedly overfilling the funnel with DI H2O and pouring a bunch on my head.


----------



## Mega Man

Quote:


> Originally Posted by *Sketchus*
> 
> Hello 295x2 owners
> 
> 
> 
> 
> 
> 
> 
> 
> Looking for some advice, here's my situation:
> 
> I currently have two 7970 in Xfire. I play on a single 1440p monitor. I have another monitor and I'm considering adding a third. I'll probably on have each screen at 1080p. Starting tomorrow the 295x2 SHOULD get a big price drop in the UK (about £300-400). I'm sorely tempted by it. Do you guys think it's worth the upgrade? I know Nvidia supposedly have their new cards coming soon but I can't for the life of me see two 980s competing with this considering they'll likely be £400 each.
> 
> What do you think? Wait or buy? I know it's probably pertinent to wait, but here the drop is a limited time offer.
> 
> Thanks all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Forgot to add. I have a Corsair HX 750W. Apparently that's the bare minimum Wattage for that GPU, but I heard as long as it's a good PSU it's OK?


it will be pushing it depending on cpu oc
it will be a big upgrade imo ( i have both quad 7970s quad 290xs ( 5 of each actually ) and 2x295x2s )
Quote:


> Originally Posted by *xer0h0ur*
> 
> Since the 5960X has 40 PCI-E 3.0 lanes am I right to assume two 295X2's would be running at 16x each?


yea it can run any 2 cards @ 16/16
Quote:


> Originally Posted by *Syceo*
> 
> Oh lordy lord..., just picked up a 290x to tri fire with this 295x2, a 4K Asus monitor and all the other bits and bobs for watercooling , rads, fgittings ect ect.... first time watercooling guys, just wanted to ask a quick one. Can I use any distilled water ( just bought 5 litres off Amazon) and a silver coil. My tubing is black so no dyes. So nervous about doing this loop , but hey ho , here we go.


if you use nickel block i would recommend using biocide instead but alot of people use nickel + silver, personal pref really
Quote:


> Originally Posted by *Syceo*
> 
> Does anyone think i may need an additional rad for this furnace ( 295x2 + 290x in tri fire), currently got on order 1x Hardware Labs Black Ice Nemesis Radiator GTS 240 and 1x Hardware Labs Black Ice Nemesis Radiator GTS 360. All gpu's will be running at stock both rads in push pull
> 
> Any recommendations would be great.


i would recommend it: 2x gpu x 120 each + 120

( general idea behind watercooling is 120 + 120 x each component ( not inc mobo/vrms/ram etc ) for best temps )
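That rule of thumb can be sketched as a tiny helper (an illustration of the advice above, not an official sizing formula; 120mm of radiator as a base plus 120mm per cooled component, with the 295X2 counting as two GPUs):

```python
# Rule of thumb from the post above: 120mm of radiator base + 120mm per component.
def recommended_rad_mm(num_components):
    """Total recommended radiator length in mm for a loop with num_components blocks."""
    return 120 + 120 * num_components

# 295X2 (counts as two GPUs) + one 290X in tri-fire = 3 components:
needed = recommended_rad_mm(3)   # 480mm for the GPUs alone
on_order = 240 + 360             # GTS 240 + GTS 360 = 600mm
print(needed, on_order)
```

With a CPU block in the same loop that makes four components, or 600mm by this rule, which is exactly the GTS 240 + GTS 360 on order; hence the split between "you should be fine" and "add another rad" above.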


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> Can you believe that worked?! I could hug you for noticing. Was not easy to get the rotary connectors tight. In fact my fingers partly discolored the knobby part because bitspower rotaries are extremely stiff but work well. It leaked along with the fitting line from the CPU to MOSFET block during the test but they were fairly easy to fix and the leaks were small. My advice to any first time water cooler working with bitspower fittings is to go tighter than you think you should. I was thinking they could be just kinda barely tight. No. They will leak from being not tight enough and I have had 0 problems with fittings being too tight.
> 
> The best part of the whole noobie first fill process was me repeatedly overfilling the funnel with DI h2o and pouring a bunch on my head.


reckon im gonna try that one myself...









Are you running a 4K monitor? If so, which one? I'm having issues with the displayport on this card and the monitor... absolutely fuming. How did it work out for you?


----------



## BrotherBeast

Finally in the club.


----------



## Syceo

Is anyone using DP 1.1 to achieve 60Hz on a 4K monitor, or will I need a 1.2 mini DisplayPort cable for that? I'm about to throw this monitor out of the window lol


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> reckon im gonna try that one myself...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are you running a 4k monitor, if so which one? im having issues with the display port on this card and the monitor... absolutely fuming, how did it work out for you?


I'm at 1440p. All is not paradise though. The r9 series could do with some driver work if I'm to be honest. Lots of problems with voltage idling when I don't want it to and in general there is almost no support for dx9 in win 8.1. If it works it works if not. Not.

As an example. Try running Unigine Valley in dx9 mode. Try it in win 7 or 8.1. On my system crossfire doesn't work for dx9 in Unigine valley period.


----------



## Syceo

Is anyone using a regular 1.1 DisplayPort cable to get 4K @ 60Hz, or do I really need a 1.2 cable?

I'm about to throw this monitor out of the window lol







Quote:


> Originally Posted by *electro2u*
> 
> I'm at 1440p. All is not paradise though. The r9 series could do with some driver work if I'm to be honest. Lots of problems with voltage idling when I don't want it to and in general there is almost no support for dx9 in win 8.1. If it works it works if not. Not.
> 
> As an example. Try running Unigine Valley in dx9 mode. Try it in win 7 or 8.1. On my system crossfire doesn't work for dx9 in Unigine valley period.


I hear you, i didn't realise the amount of time i would have to spend just trying to get even the basics right with this setup. I knew there would be occasions when I would simply have to bite my lip with AMD; however, i do feel as though i'm getting somewhere now.

Gonna give your ingenious flow bridge a go, and see how far i get.... I'm hoping for a leak free result


----------



## Vaultik

Good evening all.
Just picked up a 295x2 diamond at local microcenter. However seem to be having issues. Currently have it connected to my new x99 asus deluxe board with a 5960x and 1000w platinum evga psu

Ive tried 14.4 and 14.7 but for some reason im getting artifacting in battlefield 4 right away. I have not been able to test any other games but no matter what I do it immediately starts artifacting soon as the game starts.

Tried increasing the voltage in afterburner and didn't help at all. Tried reseating it etc no luck.

Could it just be a bad card maybe?


----------



## electro2u

Quote:


> Originally Posted by *Vaultik*
> 
> Good evening all.
> Just picked up a 295x2 diamond at local microcenter. However seem to be having issues. Currently have it connected to my new x99 asus deluxe board with a 5960x and 1000w platinum evga psu
> 
> Ive tried 14.4 and 14.7 but for some reason im getting artifacting in battlefield 4 right away. I have not been able to test any other games but no matter what I do it immediately starts artifacting soon as the game starts.
> 
> Tried increasing the voltage in afterburner and didn't help at all. Tried reseating it etc no luck.
> 
> Could it just be a bad card maybe?


Does sound that way unfortunately.


----------



## xer0h0ur

As far as I know you have to have a displayport 1.2 cable or else 4K at 60Hz won't work.


----------



## Jpmboy

Quote:


> Originally Posted by *Mega Man*
> 
> it will be pushing it depending on cpu oc
> it will be a big upgrade imo ( i have both quad 7970s quad 290xs ( 5 of each actually ) and 2x295x2s )
> yea it can run any 2 cards @ 16/16
> *if you use nickel block i would recommend using biocide instead but alot of people use nickel + silver, personal pref really*
> i would recommend it, 2xgpu x120 each +120
> 
> ( general idea behind watercooling is 120+ 120x each component ( not inc mobo/vrms/ram ect ) for best temps )


I would agree, nickel plating and a silver coil are not a great mix. Better off using PT Nuke + plain DW, or a premix. I use DW + Redline Water Wetter (under ~5%) and have never had any corrosion or discoloring of blocks. Some tubing may get cloudy if you pour the Redline directly into it.


----------



## Jpmboy

Quote:


> Originally Posted by *xer0h0ur*
> 
> As far as I know you have to have a displayport 1.2 cable or else 4K at 60Hz won't work.


yes. HDMI will be 4K30Hz.
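To put numbers on why the cable version matters: 4K at 60Hz and 24-bit color needs roughly 12 Gbit/s of pixel data before blanking overhead, which is more than DP 1.1 can carry but comfortably inside DP 1.2. A quick sketch (the link figures are the standard effective rates after 8b/10b encoding for a 4-lane link; blanking intervals are ignored):

```python
# Why 4K@60Hz needs DisplayPort 1.2: compare pixel data rate to link capacity.
def pixel_gbps(width, height, hz, bpp=24):
    """Uncompressed pixel data rate in Gbit/s (ignores blanking intervals)."""
    return width * height * hz * bpp / 1e9

# Effective data rates after 8b/10b encoding (4 lanes for DP, 3 channels for HDMI).
links = {"DP 1.1": 8.64, "DP 1.2": 17.28, "HDMI 1.4": 8.16}

need = pixel_gbps(3840, 2160, 60)   # ~11.9 Gbit/s
for name, capacity in links.items():
    verdict = "ok" if capacity >= need else "too slow"
    print(f"{name}: {verdict}")     # only DP 1.2 clears the bar
```

The same math shows why HDMI on these cards caps at 4K30: halving the refresh rate halves the data rate, which then fits.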


----------



## joeh4384

I would like to join. Now I am just waiting for the 2nd hand 1440p Korean monitors I ordered on /r/hardwareswap.


----------



## Syceo

So I have been going through thread after thread trying to find a way to disable ULPS in the tri-fire config. I have tried the regedit (setting it to zero). I have tried the auto tool (but that gives me an error stating that the ULPS reg key cannot be located). I would just like to ask you guys if it is necessary to disable it, and if so, has anyone else experienced issues with disabling it? The feature is not even present in Afterburner or Catalyst, so I'm a bit lost right now.

Any ideas guys
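For anyone retrying the registry route by hand, the usual manual edit is the `EnableUlps` value under the display-adapter class key. Treat the fragment below as a sketch: the `0000`/`0001`/... subkey index varies per machine and per GPU, so check each numbered subkey for an `EnableUlps` value rather than assuming `0000`.

```
Windows Registry Editor Version 5.00

; Sets EnableUlps to 0 for the adapter at subkey 0000.
; Repeat for each 00xx subkey that actually contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed after the change, and driver reinstalls can recreate the value.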


----------



## Mega Man

I can tell you when I use tri fire the option is there. Maybe reinstall drivers?


----------



## Syceo

Quote:


> Originally Posted by *Mega Man*
> 
> I can tell you when I use tri fire the option is there. Maybe reinstall drivers?


Just updated afterburner, that seems to have done the trick, cheers


----------



## ANGELPUNISH3R

Was moving some parts around in my rigs as I'm about to water cool my 295, but I just thought I would try this for a laugh with my old 690. It actually works.


----------



## Sketchus

Well I just became one of you guys today. Ordered mine


----------



## Zettman

Hello guys,

I just have a short question. I want to buy an R9 295x2, but I'm a little scared because I mainly play DX9 games and I have heard that AMD drivers have some issues with DX9. Is there any chance that AMD will fix this within the next year, or is this "DX9 games are outdated" statement official? Sorry, I don't know that much about AMD because I have always used Nvidia cards. The decision to switch over to AMD is because I currently use a triple monitor setup (3240x1920) and wanted to get an LG 34UM95 (3440x1440), and AMD seems to do better at higher resolutions (as long as it is DX11 or anything else except DX9).

With best regards,
Zettman


----------



## electro2u

You hear that AMD?


----------



## Mega Man

i have no issues with dx9 games sorry :/

all mine work fine

in eyefinity


----------



## electro2u

Mega man do me a favor. Try running Unigine valley in dx9 mode

Until that works. There's a problem. If it works for you then I need to change something

I stopped paying for FFxiv because only 1 GPU worked in win 8.1. Not eager to pay for a month to test if it's been fixed in win 8.1. Worked fine in win 7 but Unigine valley dx9 mode fails in 7 & 8.1 last I checked. Which was about a month ago


----------



## Mega Man

i will. but it will just cause my pc to shut down LOL ( atm running on one 1k psu, been too lazy to change over )


----------



## electro2u

Well let's take a poll real quick from anyone with a functioning 295x2 system, please can you say if you are able to get CF working with Unigine valley in Directx9 mode?


----------



## Mega Man

yep reset tomorrow i am running a 50a 208 line to my pcs which is when ill start mainlining my dual psus

how about 2 questions.

the second one is who is having problems in other things besides valley that use DX9?

the reason i ask is: how do you know it is an issue with the drivers and not with the program?


----------



## boredmug

Quote:


> Originally Posted by *Mega Man*
> 
> yep reset tomorrow i am running a 50a 208 line to my pcs which is when ill start mainlining my dual psus
> 
> hwo about 2 questions.
> 
> the second one is who is having problems in other things besides valley that use DX9?
> 
> the reason i ask is how do you know it is not a issue with the program and not the drivers ?


LOL. You are kidding right? 208v? Got three phase power in that house of yours?


----------



## Mega Man

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> yep reset tomorrow i am running a 50a 208 line to my pcs which is when ill start mainlining my dual psus
> 
> hwo about 2 questions.
> 
> the second one is who is having problems in other things besides valley that use DX9?
> 
> the reason i ask is how do you know it is not a issue with the program and not the drivers ?
> 
> 
> 
> LOL. You are kidding right? 208v? Got three phase power in that house of yours?
Click to expand...

hate to break it to you

208v is easily made

https://www.apc.com/products/resource/include/techspec_index.cfm?base_sku=SMT3000RMT2U

wont even get into the fact that 208-220-230 is really the same thing, just within each other's tolerances

there is 208 single phase

and 460 single phase as well to add


----------



## boredmug

Well yes, relatively the same thing, except they come in certain flavors you might say. 230/240V is only derived from a single-phase scenario: A and B phase. 208 is derived from three-phase delta/wye center-tapped windings, either A and B phase (single phase) or three phase (A, B, C). It is also derived from a delta service, which is a corner-tapped winding where there is a high leg that reads 208 from phase (B) to ground or neutral, and 240 between all other phases.

So, as to my post, you can clearly see there is no way to derive 208 from a single-phase residential service. If you live in a really old part of town or a rural area you might have a delta service, but I wouldn't recommend hooking that stinger leg up to your PC unless you like smoke


----------



## Mega Man

sorry for the double post, but with the people acting as they have, i felt the need to detail what i mean.
not a double, as you already posted

i KNOW there is a defined difference between the 3 voltages,
208, 230 and 220, and a difference in how they are made ( the transformers are different )

with this said, my point was that my PC does not care.

another of my points is that in some areas, yes, a home can get 3 phase power

my last point is that a simple UPS ( not the delivery service ) or a 3 phase generator could easily make up the difference

idk when OCN became inflamed with such people that, rather than teach and build up, try to tear each other down over even the slightest of statements

( not directed at you, just been getting tons of pms/other statements throughout the thread, and frankly sick of it, recently )
also, to correct what you are saying, there are several server centers in my area that use 208v, it is perfectly fine.

once i get back home tonight ( just was called out on call )

ill post you an article if i can find it


----------



## joeh4384

Woooosh was all that electrical engineering going over my head.


----------



## boredmug

I was clarifying, that's all. It was a joke sort of question that was going to lead to the suggestion that an L5-30 220/240 volt single-phase twist lock and a 30 amp two-pole breaker would probably be all that you need, and pretty cheap. Would balance your load better too.

Maybe don't be so uptight when someone points out something to you and tries to help??


----------



## boredmug

And to be honest, I have a thread right now without a response, smartass or serious. I'd take either one at this point.


----------



## electro2u

Seriously mega man? Why don't you do some research. You're starting to sound like you think AMD can do no wrong. If you read my succinct post on this matter carefully you'll see a big complaint about Final Fantasy XIV not using both GPUs In win 8.1 JUST LIKE VALLEY. If there's something wrong with the drivers... Then AMD needs to fix it. If there's something wrong with the FFXIV software why is it working for crossfire 290x and not my 295x2?

Anyway there's obviously a problem with directx9 and crossfire and I proved it by pointing out none of us can run a simple directx9 benchmark like Unigine valley with more than 1 GPU active. Furthermore 2x290 crossfire is better supported (like in FFxiv) than 295x2 crossfire. Again, I haven't tested this in a month or so because it actually costs money to find out if it's been fixed. And then I know it hasn't because AMD has come out and SAID they won't work on DX9 anymore.

As for the peevish suggestion that it's my system that is somehow configured incorrectly, everything else works on my system. The only software I have any problems with is dx9 based afaik. The cards run like crap with chromium wheel browser add-on too. Goes randomly back and forth between smooth and choppy frame rate. On my 295x2 if I turn PCIE link power saving off it stops messing up. With my 290x on top for trifire in this config the issue cannot be solved. With a single 290x or 290 I cannot fix it.


----------



## electro2u

Chrome64 w wheel smoothing (voltage seesaw effect is problem here, cards don't idle correctly), world of Warcraft, and final fantasy XIV all have serious problems with the 295x2.

IMO AMD was usually really great about backwards compatibility since I like to play old games. That's completely changed.


----------



## BrotherBeast

Finally put this system together. The cards are nicely built and huge. The massive Supermicro X8DTG-QF board is actually wider than the cards.


----------



## electro2u

Did you install that system in a cold environment? It looks like a folding machine. When I fold with trifire in my house the loop functions fine, but after 3 hours or so it overwhelms the AC and the temperature in my room gets out of control. Not sure what to do about it.


----------



## Mega Man

Quote:


> Originally Posted by *boredmug*
> 
> I was clarifying..That's all. It was a joke sort of question that was going to lead to the suggestion that a l5 30 220/240 volt single phase twist lock and a 30 amp two pole would probably be all that you need and pretty cheap. Would balance your load better too.
> 
> Maybe don't be so uptight when someone points out something to you and tries to help??


not at all, but no, i need 50a

i am going to push over 6kw in pc power, and again this is not directed toward you.

@electro2u

would you like to show me where i make, and i quote, a "peevish suggestion"? feel free to let me know. i know i can run a simple DX9 benchmark like heaven, as i did when i first got the cards

and i looped it like crazy, in DX9

so again, all i asked is: how do you know it is the drivers and not the software? as i have no issues with multiple dx9 games
Quote:


> Originally Posted by *boredmug*
> 
> And to be honest I have a thread right now without a response smartass or serious. I'd take either one at this point.


sorry i was at work and again it was not pointed at you, and for what it is worth wifes family is in town... for 3 months, on 1 month and not handling well


----------



## boredmug

6000/240=25amps. What kind of computer do you have that can pull 50 amps? I thought i saw something about 2 295x2's? Those are what? 500 watts per card? lets be generous and say they pull 600 watts. so 1200 watts. I'm gunna be generous again and say you have some massive overclock on like a 9590 and give you 400watts. So now we are at 1600watts. I'll give you another 200watts on misc. just because. So 1800watts. That's being generous as well.

Is all this for ONE computer or you have some sort of mining operation? Not trying to be a smart ass, just wondering what you have that pulls 6kw?
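The arithmetic above, spelled out (a sketch; the wattage figures are the post's own deliberately generous estimates): current is just power over voltage, so even a very fat single rig is nowhere near 50A at 240V.

```python
# Ohm's-law bookkeeping for the estimates above: I = P / V.
def amps(watts, volts):
    return watts / volts

print(amps(6000, 240))   # 25.0 A - the full 6kW desk at 240V
print(amps(1800, 240))   # 7.5 A  - the "generous" single-rig estimate
print(amps(1800, 120))   # 15.0 A - same rig on an ordinary 120V circuit
```

Which is exactly the point of the question: a single 1800W build fits a standard circuit, so 6kW only makes sense across several machines.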


----------



## boredmug

Quote:


> Originally Posted by *BrotherBeast*
> 
> Finally put this system together.The cards are nicely built and huge.The massive Supermicro X8DTG-QF board is actually wider than the cards.


Specs please? That thing looks AWESOME. Why didn't you watercool the cpu's too?


----------



## Mega Man

Quote:


> Originally Posted by *boredmug*
> 
> 6000/240=25amps. What kind of computer do you have that can pull 50 amps? I thought i saw something about 2 295x2's? Those are what? 500 watts per card? lets be generous and say they pull 600 watts. so 1200 watts. I'm gunna be generous again and say you have some massive overclock on like a 9590 and give you 400watts. So now we are at 1600watts. I'll give you another 200watts on misc. just because. So 1800watts. That's being generous as well.
> 
> Is all this for ONE computer or you have some sort of mining operation? Not trying to be a smart ass, just wondering what you have that pulls 6kw?


it is np

i never said one computer, i have 290x quadfire, 7970 quadfire and 2x 295x2

ironically no mining, i dont even know how lol
i just like hardcore hardware

what i am doing is bringing power down from the main box

into a distro box and then installing breakers and feeding out from there, my distro box will be connected to my desk, not my house

my amd rig has 2x g1600, this one has 2x lepa 1k; going to try to import 2x 1600 plat leadex for this rig

the 7970s are staying in my 8350/9590, love the way the blocks look

the 290xs will be going back into this

the 295x2s i will either do an 8350 or a10 build with, idk yet


----------



## Zettman

Quote:


> Originally Posted by *Mega Man*
> 
> the second one is who is having problems in other things besides valley that use DX9?
> 
> the reason i ask is how do you know it is not a issue with the program and not the drivers ?


This review points out that the R9 295x2 has huge problems with running a DX9 title.

Link: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Z-Review/Skyrim
Quote:


> At 4K though things look bad for AMD. *The company has stated quite plainly that the fixes for frame pacing with DX9 just are never coming, and as a result, Skyrim shows some dramatic frame time variance in our testing.* This pushes average frame rates (when removing the runt frames) down to about 90 FPS on the R9 295X2 while both NVIDIA solutions sit at about 120 FPS.




The issue only seems to appear at high resolutions with CF. I want to play IL-2 BoS, which is based on DX9, and because older simulator titles were also based on DX9, many players have already switched to Nvidia after having problems with AMD.

The R9 295x2 is just 900€ at the moment and looks to be the better option compared to two GTX 780 Ti at about 1160€, but this DX9 issue is all that stops me from buying it.

Zettman


----------



## stxe34

ok so i have a problem. i have two r9 295x2's and for some reason all of a sudden one of the gpu's on the second card has disappeared. i have tried uninstalling and re-installing the drivers with no luck! i also noticed that the pci switch on the rampage extreme now only shows one lane occupied! on ccc it shows i have 3 linked adapters so dont know whats going on here!


----------



## rdr09

Quote:


> Originally Posted by *Zettman*
> 
> This review points out that the R9 295x2 has huge problems with running a DX9 title.
> 
> Link: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Z-Review/Skyrim
> 
> 
> The issue only seems to appear at high resolutions with CF. I want to play IL-2 BoS, which is based on DX9, and because older simulator titles were also based on DX9, many players have already switched to Nvidia after having problems with AMD.
> 
> The R9 295x2 is just 900€ at the moment and looks to be the better option compared to two GTX780Ti for about 1160€, but this DX9 issue is all that stops me from buying it at the moment.
> 
> Zettman


if you play DX9 games or run Heaven using DX9, i'd suggest 780 6GB or Titan Z (however many) to run that rez.

edit: i forgot about the 980 or the 970 with the former looking like will be the fastest.


----------



## Zettman

Quote:


> Originally Posted by *rdr09*
> 
> if you play DX9 games or run Heaven using DX9, i'd suggest 780 6GB or Titan Z (however many) to run that rez.
> 
> edit: i forgot about the 980 or the 970 with the former looking like will be the fastest.


And I really thought I could finally make the jump to the red team. The R9 295x2, now listed for 878€, sounds like a really good deal if you look at the results in other DX11 titles. It is a shame that DX9 is still not completely fixed.

Anyway, thanks for the answers.

With best regards,
Zettman


----------



## rdr09

Quote:


> Originally Posted by *Zettman*
> 
> And I really thought I could finally make the jump to the red team. The R9 295x2 now listed for 878€ sounds like a really good deal if you look at the results of other DX11 titles. It is a shame that DX9 is still not fixed completely.
> 
> Anyway, thanks for the answers.
> 
> With best regards,
> Zettman


it really depends on the games you play. I play battlefield, so I have to stick to amd.


----------



## stxe34

ok so it seems one of the cards is not working. when installed in the mb on its own it does not display at boot. when paired with the other r9 295x2 it shows 3 gpus.


----------



## BrotherBeast

Quote:


> Originally Posted by *boredmug*
> 
> Specs please? That thing looks AWESOME. Why didn't you watercool the cpu's too?


This is just my temp setup. I'm waiting on Haswell-EP and C612 boards to reveal themselves. Once you go dual socket you never go back









CPU: Intel Xeon X5680s
Motherboard: Supermicro X9DTG-QF
Memory: Corsair XMS DDR3 1600MHz 12GB kit
Graphics Cards: OBVIOUS...lol
Network adapter: Intel X540-T2 10GbE
Power Supply: Corsair AX1500i 1500 Watt MONSTER!!!
OS Drive: Crucial M500 960GB SSD
Storage Drives: 2x WD Black 2TB FAEX hard drives
Optical Drive: Pioneer Blu-ray burner
Chassis: Lian Li V2120X


----------



## BrotherBeast

Quote:


> Originally Posted by *stxe34*
> 
> ok so it seems one of the cards is not working. when installed in the mb on its own it does not display at boot. when paired with the other r9 295x2 it shows 3 gpus.


That's not good. Did you try a different PCI-Express slot? What motherboard and PSU are you using?


----------



## stxe34

Quote:


> Originally Posted by *BrotherBeast*
> 
> That's not good.Did you try a different pci-express slot?What Motherboard and PSU are you using?


silverstone strider 1600w, asus rampage extreme iv. i tried the faulty card in the slot of the good one and it didn't display at boot.


----------



## BrotherBeast

Quote:


> Originally Posted by *stxe34*
> 
> silverstone strider 1600w, asus rampage extreme iv. i tried the faulty card in the slot of the good one and it didn't display at boot.


Sadly my friend it's RMA time.


----------



## stxe34

Quote:


> Originally Posted by *BrotherBeast*
> 
> Sadly my friend it's RMA time.


already boxed to go..dont know how long until i will get the replacement


----------



## Santho

Could anyone point out an M-ATX or M-ITX case that can fit an R9 295x2 and a 240 rad?







Thx in advance


----------



## Mega Man

case labs can


----------



## kalijaga

Quote:


> Originally Posted by *stxe34*
> 
> ok so it seems one of the cards is not working. when installed in the mb on its own it does not display at boot. when paired with the other r9 295x2 it shows 3 gpus.


Hi,

I had the same problem when I installed the second 295x2 in my system, and it was sorted out by reinstalling the driver after slotting the card.
Please reinstall the Catalyst driver and see.

good luck.


----------



## LegacyLG

Has anyone noticed these can be 6-way crossfired? I know drivers dont exist yet, but could this be the future of gaming?


----------



## sugarhell

Quote:


> Originally Posted by *LegacyLG*
> 
> Has anyone noticed these can be 6-way crossfired? I know drivers dont exist yet, but could this be the future of gaming?


Oh god no. First fix tri-fire/3-way and quadfire/4-way, then do it. Also, without DX12 it's impossible.


----------



## LegacyLG

Not exactly, it's down to software; they will fix it. As soon as hardware reaches a dead end, they like to hold things back for PR stunts to pull people in. The latest is a dual GPU with water cooling; the next could be a software update. All companies do these things to keep the media's eye.


----------



## Zerothaught

I am going to be getting one of these this week. I didn't see any topics regarding this, but I was wondering if there is a way to replace the tubing with a different color. Has anyone tried this?


----------



## Syceo

It's been traumatic and I guess enjoyable . So here's my first go at a custom loop


----------



## Elmy

https://www.facebook.com/club3d/photos/a.171026459631160.41757.162718280461978/715710368496097/?type=1&theater

Some photos of my Dual Club3D 295X2 setup from PAX Prime last weekend.

Also in the photo is my 5 Asus VG248QE 144Hz 1ms monitors running a resolution of 5400X1920.


----------



## ocvn

Quote:


> Originally Posted by *Elmy*
> 
> https://www.facebook.com/club3d/photos/a.171026459631160.41757.162718280461978/715710368496097/?type=1&theater
> 
> Some photos of my Dual Club3D 295X2 setup from PAX Prime last weekend.
> 
> Also in the photo is my 5 Asus VG248QE 144Hz 1ms monitors running a resolution of 5400X1920.


Hi, i run 2x 295x2 with 5x1 eyefinity, but the drivers after 13.12 are broken for CFX and eyefinity. Tested with 3dmark and heaven. So, curious: what driver are you using? Any problems when running benchmarks like heaven 4.0 or 3dmark?


----------



## Elmy

Quote:


> Originally Posted by *ocvn*
> 
> Hi, i run 2x 295x2 with 5x1 eyefinity, but the drivers after 13.12 are broken for CFX and eyefinity. Tested with 3dmark and heaven. So, curious: what driver are you using? Any problems when running benchmarks like heaven 4.0 or 3dmark?


I haven't run 3dmark or heaven... I just play BF4. I was just able to run all 4 GPUs in BF4 with the latest 14.7 RC3 and 14.8 drivers; up until that point nothing worked. I can try to run those benchmarks when I get home tonight and let you know.

As always, make sure you report all your driver issues to the AMD driver report forms.


----------



## ocvn

Quote:


> Originally Posted by *Elmy*
> 
> I havent run 3dmark or heaven....I just play BF4....I just was able to run all 4 GPU's in BF4 with the latest 14.7 RC3 and 14.8 drivers up until that point nothing worked. I can try to run those benchmarks when I get home tonight and let you know.
> 
> as always make sure you report all your driver issues to the AMD driver report forms.


I have followed this thread since post 1, and someone had the same problem as mine. The only way to run them is the 13.12 driver hack mod. I run 14.7 RC3 atm. 4 GPUs load 80-100% perfectly with heaven 4.0; however, there is screen tearing/lagging between the 5 screens (disable CFX and everything runs smooth). With the 13.12 driver hack everything is smooth, except the core can only run at 785MHz. Haven't tried 14.8 yet.


----------



## axiumone

Quote:


> Originally Posted by *ocvn*
> 
> I followed this thread since post 1 and someone same problem as mine. The only way to run them is driver hack 13.12 mod. I run 14.7rc3 atm. 4 GPUs load 80-100% perfectly with heaven 4.0, however, screen tearing/ lagging between 5 screens (disable Cfx, everything run smooth). With 13.12 driver hack, everything smooth except core only can run 785 Hz. Haven't try 14.8 yet.


I'm the one having that issue. 14.8 didn't improve the situation on my particular rig. Did you mod the 13.12 yourself? I had the 785mhz core bug as well. If you reinstall the moded drivers a bunch of times, you should be able to get the full core speed. Have to do a clean reinstall.

Also... there may or may not be a some kind of fix on the way at the end of this month. So we shall see.


----------



## ocvn

Quote:


> Originally Posted by *axiumone*
> 
> I'm the one having that issue. 14.8 didn't improve the situation on my particular rig. Did you mod the 13.12 yourself? I had the 785mhz core bug as well. If you reinstall the moded drivers a bunch of times, you should be able to get the full core speed. Have to do a clean reinstall.
> 
> Also... there may or may not be a some kind of fix on the way at the end of this month. So we shall see.


Yeah, I followed here and guru3d to mod the 13.12, but sometimes it worked and sometimes it didn't. 2 months with this issue. 1 question: after installing the driver mod, what catalyst do you use for best stability, and how many times did you re-install to get the full speed? I use DDU for a clean install.

Count me in for this club:


----------



## axiumone

I used the catalyst control center that comes with 13.12. After installing the driver manually, I install all of the applications in the CCC folder that comes with 13.12.

That gave me the best results.

Edit - You attached pictures after my post. Very nice set up!


----------



## BrotherBeast

Quote:


> Originally Posted by *ocvn*
> 
> Yeah, I followed guides here and on guru3d to mod the 13.12 driver, but sometimes it worked and sometimes it didn't. Two months with this issue. One question: after installing the modded driver, which Catalyst version do you use for best stability, and how many times did you have to reinstall to get the full speed? I use DDU for a clean install.
> 
> Count me in for this club:


Water cooled E5-2697V2's, custom blocks on the 295X2's and 5 big arse monitors


----------



## LegacyLG

How much did those 5 screens cost?


----------



## ocvn

Quote:


> Originally Posted by *axiumone*
> 
> I used the catalyst control center that comes with 13.12. After installing the driver manually, I install all of the applications in the CCC folder that comes with 13.12.
> 
> That gave me the best results.
> 
> Edit - You attached pictures after my post. Very nice set up!




Thanks bro. I'll download your driver and try tomorrow.


----------



## pompss

Guys, selling my Aqua Computer waterblock for the R9 295X2, used only for two weeks. PM me if anyone's interested.








http://www.overclock.net/t/1512006/fs-aquacomputer-kryographics-vesuvius-radeon-r9-295x2-full-coverage-liquid-cooling-block-nickel-acrylic-like-new


----------



## TheSilentCircus

Joining the club! Oooohh so sweet


----------



## Mega Man

@NavDigitalStorm

sooo what do i have to do to be in ? pics ? my system is a mess atm


----------



## marsha11

I'm looking at getting one of these cards for 1440p gaming. I've only ever had Nvidia cards in the past.

I currently have an XFX Black Edition 750W PSU, so I guess I will need to upgrade..... 850? 1000?

How are Crossfire/AMD drivers? Are they updated as frequently as Nvidia's?


----------



## Earth Dog

For 1440p there is no good reason for a $1000 GPU. Grab a 290x or 780/780ti and call it a day.

You would need to upgrade your PSU. If you run stock all around, 850W is enough. If you overclock, I would go 1KW.

As far as drivers go: single, dual, tri, quad, it's all the same drivers. SLI/CFX always has its hits and misses.
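To put numbers behind the wattage advice above, here's a back-of-envelope sizing sketch. The 500W figure is the 295X2's rated board power; the CPU and "other components" wattages and the 25% headroom are rough placeholder assumptions for illustration, not measurements of any specific rig.

```python
# Back-of-envelope PSU sizing for a 295X2 rig. card_w=500 is the
# 295X2's rated board power; cpu_w and other_w are rough guesses.
def recommended_psu_watts(cpu_w=125, card_w=500, other_w=100, headroom=0.25):
    """Sum the major loads and add headroom so the PSU isn't run flat out."""
    load_w = cpu_w + card_w + other_w
    return load_w * (1 + headroom)

print(recommended_psu_watts())           # 906.25 -> an 850-1000W unit at stock
print(recommended_psu_watts(cpu_w=180))  # 975.0  -> closer to 1kW when overclocking
```

Which lines up with the rule of thumb in this thread: 850W is workable at stock, and overclocking pushes you toward a 1kW unit.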


----------



## marsha11

Quote:


> Originally Posted by *Earth Dog*
> 
> For 1440p there is no good reason for a $1000 GPU. Grab a 290x or 780/780ti and call it a day.
> 
> You would need to upgrade your PSU. If you run stock all around, 850W is enough. If you overclock, I would go 1KW.
> 
> As far as drivers, single, dual, tri, quad, its all the same drivers. SLI/CFx always has its hits and misses.


Thanks.. just sold my 780. It's certainly not enough to give 60+ fps at max settings on some games I play. I'm wanting an uncompromising card... no dialing back settings and tweaking to get good frames.


----------



## Earth Dog

Oh.. 'one of those' (Must haz 60 FPS or elsez!)


----------



## soulwrath

If you have the funds and can get the 295X2, I would get it. Depending on your system: if you don't already have a water loop, the 295X2's CLC will be great. You won't have to pay for a pump, reservoir, block, fittings, and tubing to water-cool a 290X (whose VRMs usually throttle due to heat).


----------



## Mega Man

Quote:


> Originally Posted by *marsha11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Earth Dog*
> 
> For 1440p there is no good reason for a $1000 GPU. Grab a 290x or 780/780ti and call it a day.
> 
> You would need to upgrade your PSU. If you run stock all around, 850W is enough. If you overclock, I would go 1KW.
> 
> As far as drivers, single, dual, tri, quad, its all the same drivers. SLI/CFx always has its hits and misses.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks.. just sold my 780. It's certainly not enough to give 60+ fps at max settings on some games I play. I'm wanting an uncompromising card... no dialing back settings and tweaking to get good frames.
Click to expand...

if you fill out rig builder we can help much better ( all threads can see my sig )
Quote:


> Originally Posted by *soulwrath*
> 
> If you have the funds and can get the 295X2, I would get it. Depending on your system: if you don't already have a water loop, the 295X2's CLC will be great. You won't have to pay for a pump, reservoir, block, fittings, and tubing to water-cool a 290X (whose VRMs usually throttle due to heat).


I disagree. If he has room, go with two 290Xs.

This card is cool as a single-card solution, but with 290Xs you have the ability to overvolt (easily)! And with water, the heat is a non-issue.


----------



## marsha11

Thanks guys. I could go Crossfire but I don't want to have to water-cool. The 295X2 suits me better.

With a single GPU things have been hassle-free, to a point. Is this the case with this card? I don't mind a little work every now and then, but I don't want to find half my games don't have Crossfire support, or hit performance issues from running 2 GPUs.


----------



## TheSilentCircus

Wow I'm super impressed by this card, I'm gaming at [email protected] and it has been a buttery experience aside from Dead Rising 3









I'm seriously impressed by this cooler, I used to have an XFX R9 290X Double Dissipation series and even Dota 2 would cause it to rise to about 80 degrees Celsius. I haven't seen the 295x2 go beyond 65 and that's just crazy! Well worth the money!


----------



## Zerothaught

I am having trouble finding the information, but does anyone know if a Thermaltake TR2 RX 1000w will be fine for this GPU? I know it has the correct wattage, but I've heard something regarding the amount of amperage needed for each 8 Pin PCI-e.


----------



## TheSilentCircus

Quote:


> Originally Posted by *Zerothaught*
> 
> I am having trouble finding the information, but does anyone know if a Thermaltake TR2 RX 1000w will be fine for this GPU? I know it has the correct wattage, but I've heard something regarding the amount of amperage needed for each 8 Pin PCI-e.


That PSU has two 12V rails at 50A each. You should be fine; you need at least 28A for each 8-pin input.

http://www.amd.com/Documents/Selecting-a-System-Power-Supply-for-the-AMD-Radeon-R9-295X2-Graphics-Card.pdf


----------



## marsha11

Good job I checked; looks like this card is too big for my case.

If I do Crossfire with two 290Xs, do I really need to look at water cooling?


----------



## stxe34

Quote:


> Originally Posted by *TheSilentCircus*
> 
> Wow I'm super impressed by this card, I'm gaming at [email protected] and it has been a buttery experience aside from Dead Rising 3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm seriously impressed by this cooler, I used to have an XFX R9 290X Double Dissipation series and even Dota 2 would cause it to rise to about 80 degrees Celsius. I haven't seen the 295x2 go beyond 65 and that's just crazy! Well worth the money!


dead rising 3 runs very very well with the mod to unlock the 30fps limit! just google it bud!


----------



## TheSilentCircus

Quote:


> Originally Posted by *stxe34*
> 
> dead rising 3 runs very very well with the mod to unlock the 30fps limit! just google it bud!


I've already uncapped it, I get about 45-60fps, it dips down to mid 20s-30s during cutscenes. I tried using the Skyrim crossfire profile as what some have stated and it seems to help it a bit.


----------



## Sketchus

Can anyone tell me how well Watch Dogs should be running? On ultra settings it's sinking to like 25-30fps just driving around...


----------



## Vongoros

I have just purchased an R9 295x2, in fact it just arrived today! I will be installing it later this evening, but I am sure I will love it! Great to be in the club!


----------



## xer0h0ur

I thought Watch Dogs was a game coded to be optimized for Nvidia's cards. Something AMD claimed was sabotage, since they put a gun to the developer's head saying you can't show the source code to anyone.


----------



## joeh4384

What driver are you using for Watch Dogs? It works better with 14.7, but some of the bushes flicker, which is slightly annoying.


----------



## Sketchus

I was using the 14.7 beta.

For reference the rest of my system is:
12GB RAM
ASUS Maximus Hero VII
i7 4790k.

What kinda performance do you guys get?


----------



## 4K-HERO

The FIFA 15 demo is crashing with this card as soon as you select the language. Is anyone having the same problem?


----------



## Mega Man

Quote:


> Originally Posted by *marsha11*
> 
> Thanks guys. I could go crossfire but i dont want to have to water cool. The 295x2 suites me better.
> 
> With a single GPU things have been hassle free to a point. Is this the case with this card? I dont mind a little work every now and then but i dont want to find half my games aint got crossfire support or performance issues due to 2GPUS.


i have not had any issues yet


----------



## marsha11

I'm reading a lot about water cooling in this thread; I realise many of you guys bench, but mine would be for gaming. Will the card's cooling system be OK on its own?

Also, I have a Crossfire motherboard. The top PCIe slot is quite close to my CPU cooler's fan and I'm concerned the card might be too wide and touch it, or not fit. Can you run this card in the lower second slot?


----------



## Mega Man

Never recommended, but yes; please be sure it is an x16-wired (not just physically x16) slot!

The cooling works OK; I use it since I always test my cards first. I did put 2 GT 4250s on each rad, which helped a lot, as the stock fan sucks. Mine are controlled by an Aquaero 6 XT though; in PWM they are whisper quiet.


----------



## marsha11

Quote:


> Originally Posted by *Mega Man*
> 
> never recommended but yes please be sure it is a x16 wired ( not just physical ) slot


How will I know if it's x16-wired or just physical?


----------



## BIG 5

Just installed two R9 295X2s on an Asus X99 Deluxe. I'm having clearance issues with the card and the board. The Crystal Sound portion below the rear panel is keeping the card from being 100% seated in the PCIe x16 slot. Has anyone else installed an R9 295X2 on the Asus X99 Deluxe, or had a similar issue on another board?


----------



## xer0h0ur

Its just a bloody piece of plastic. Remove the stupid thing. Or you can alternatively use a riser cable and set up the card elsewhere if it really comes down to it. Me personally I wouldn't give a crap about some white plastic cover.


----------



## Mega Man

Quote:


> Originally Posted by *marsha11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> never recommended but yes please be sure it is a x16 wired ( not just physical ) slot
> 
> 
> 
> How will i know if its x16 or physical?
Click to expand...

Sorry, I am on mobile atm; if you have rigbuilder, fill it out. What mobo is it?


----------



## marsha11

It's a P8 Z77 P LX. I've checked; I think it's OK. Plus it looks like it will fit in the main PCIe slot.

I'm having to buy a new case to fit this card in, and a new PSU, but I'm about to pull the trigger.

Reading about dual GPUs.... have they eliminated the stuttering associated with 2-card Crossfire setups?


----------



## Mega Man

I don't think that board has x16 in the second slot; it looks to be x8.


----------



## marsha11

Quote:


> Originally Posted by *Mega Man*
> 
> i dont think that has x16 in the second slot, it looks to be x8


Ahh, ok thanks for that.


----------



## ragulih

Hello! May I join your very exclusive club!? =)



I've had this beast for a few days now, veery good performance.

Only thing I've noticed, that when I try to OC, some games gets worse performance than stock clocks. Really don't get it







I drive in the iRacing.com World Championship Series. Suzuka is next Saturday and it demands quite a lot from the GPU, so I've tried some OC stuff, but the performance gets worse. I don't know if this is also part of the problem, but Afterburner shows GPU usage dropping to 0 at some points while I'm driving, and then it goes back up to 100%. In game I can't notice any clear difference, maybe just a bit of micro-stuttering, but performance is the same.

When the GPU usage falls to 0%, the voltage stays stable the whole time. When the GPU is loaded, the voltage starts to drop a bit.

Anyways, I've been benchmarking with stock cooling and was able to push this to 1100MHz/1625MHz w/ +60mV. Temps were fine(...?), 71-73 on both GPUs.

I've been reading this forum for a few weeks and I've seen some GPU BIOS flash stuff; what does it do for the GPU? Does it add performance?


----------



## ocvn

Quote:


> Originally Posted by *ragulih*
> 
> Hello! May I join your very exclusive club!? =)
> 
> 
> 
> I've had this beast for a few days now, veery good performance.
> 
> Only thing I've noticed, that when I try to OC, some games gets worse performance than stock clocks. Really don't get it
> 
> 
> 
> 
> 
> 
> 
> I drive iRacing.com World Championship Series. There is Suzuka on next saturday and it demands quite a lot from GPU, so I've tried some OC stuff but the performance gets worse. I don't know if it is also the problem, but Afterburner shows GPU usages dropping to 0 at some point when I'm driving, and then it gets up to 100% again. In game I can't notice anything clear differences, maybe just a bit micro stuttering but performance is the same.
> 
> When the GPU usages falls to 0%, voltage shows it stays stable all the time. When the GPU is loaded, the voltage starts to drop a bit.
> 
> Anyways, I've been benchmarking with stock cooling and was able to push this to 1100Mhz/1625Mhz w/ +60mV. Temps were fine(...?), 71-73 to both GPUs.
> 
> I've been reading this forum for few weeks and I've been seeing some GPU Bios flash stuff, what does it do for GPU? Does it add performance?


71-73 is not a good temp, because 75 is the throttling point; that might be why your GPU usage drops. With the stock WC, I think the best temps are 60-65.
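The behavior being described can be pictured with a toy model: once the core reaches the throttle point, the card sheds clock speed until the temperature comes back under the limit. The 75°C threshold and the 1018MHz reference clock are from this thread and the card's spec; the 5% back-off and the recovery step are made-up illustration values, not AMD's actual PowerTune algorithm.

```python
# Toy model of temperature throttling: NOT AMD's real PowerTune logic,
# just an illustration of why temps near 75C cause clock drops.
THROTTLE_TEMP = 75.0   # throttle point quoted in this thread (deg C)
STOCK_CLOCK = 1018     # the 295X2's reference core clock (MHz)

def step(temp_c, clock_mhz):
    """One control step: cut the clock 5% at/above the limit,
    otherwise creep back toward stock. Rates are arbitrary."""
    if temp_c >= THROTTLE_TEMP:
        return max(300, int(clock_mhz * 0.95))   # back off
    return min(STOCK_CLOCK, clock_mhz + 13)      # recover

# A card sitting at 73C keeps full clocks; at 76C it bleeds speed.
print(step(73.0, STOCK_CLOCK))  # 1018
print(step(76.0, STOCK_CLOCK))  # 967
```

This is why a couple of degrees of extra cooling headroom (push/pull fans, a full block) makes the difference between a steady clock and the sawtooth drops people see in Afterburner.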


----------



## ragulih

Yeah. But the problem occurs also with only mild OC, temps way under 70.


----------



## joeh4384

Does adding a 2nd fan improve temps? I noticed I get pretty close to the throttle point occasionally. I suspect I am running warmer due to cross-firing with a 290x. I think AMD should have set the throttle point a little bit higher maybe 80-85.


----------



## Syceo

Quote:


> Originally Posted by *joeh4384*
> 
> Does adding a 2nd fan improve temps? I noticed I get pretty close to the throttle point occasionally. I suspect I am running warmer due to cross-firing with a 290x. I think AMD should have set the throttle point a little bit higher maybe 80-85.


To get any sort of decent temps with the stock cooler I had to do a push/pull config with the Noctua industrial 2000rpm fans... worked a treat. But when I added a 290X into the mix, I was left with no alternative but to do a custom loop. Best decision I ever made, to be honest.


----------



## ragulih

I guess I'm bottlenecked by the CPU, an i7-3820 @ 4.6GHz. Well, off to save up for a 5960X.


----------



## Mega Man

I use push/pull on both of mine, but I am using high-speed fans, and I turn them way down!


----------



## Papuz

Hello, I'm buying a 4K screen and was planning on SLI Titan Blacks. I have always been loyal to Nvidia, but I have seen the power of the R9 295X2, and in the last few days its price has almost halved, so I would use two R9 295X2s instead.
Any advice? Am I going in the right direction? Any problems?

Main usage: 3D CAD (AutoCAD, ArchiCAD, Revit, SolidWorks, Cinema 4D, etc.), rendering and gaming


----------



## BrotherBeast

Quote:


> Originally Posted by *Papuz*
> 
> Hello, I'm buying a 4K screen and was planning on SLI Titan Blacks. I have always been loyal to Nvidia, but I have seen the power of the R9 295X2, and in the last few days its price has almost halved, so I would use two R9 295X2s instead.
> Any advice? Am I going in the right direction? Any problems?
> 
> Main usage: 3D CAD (AutoCAD, ArchiCAD, Revit, SolidWorks, Cinema 4D, etc.), rendering and gaming


Are those applications better optimized for Nvidia or AMD graphics? Does the performance scale positively with multiple cards? You are going to have to research those applications.


----------



## Syceo

Quote:


> Originally Posted by *BrotherBeast*
> 
> Are those applications better optimized for Nvidia or AMD graphics? Does the performance scale positively with multiple cards? You are going to have to research those applications.


If you're in the UK, I think Overclockers are doing them for a ridiculously low price; grab 2 of these and you're set. But consider the heat and PSU. Like many here, I have slapped a block on this beast; you may want to do the same.


----------



## vinnyv11

My card literally arrived at my house 10 minutes ago. I am pumped to get it thrown in my system but a little worried about the install because of the waterblock. Any tips for install? I'm relatively new to these high end cards so any advice for checking temps and installing would be helpful.


----------



## Syceo

Quote:


> Originally Posted by *vinnyv11*
> 
> My card literally arrived at my house 10 minutes ago. I am pumped to get it thrown in my system but a little worried about the install because of the waterblock. Any tips for install? I'm relatively new to these high end cards so any advice for checking temps and installing would be helpful.


What case are you using? The first thing I did was change the fans on the rad: I threw 2 SP120s in push/pull and then put it in the exhaust position at the back of the case, and job done.


----------



## vinnyv11

Quote:


> Originally Posted by *Syceo*
> 
> What case are you using? .The first thing i done was change the fans on the rad, i threw 2 sp120's in push pull and then put it in the exhaust position at the back of the case and job done


I have a Coolermaster Haf 932 http://www.coolermaster.com/case/full-tower/haf-932/. I was thinking of replacing the rear 140 fan with the rad in the back of the system since I have the other 3 fans in the unit to cool the other components (probably going to replace my CPU fan with watercooling). Is replacing the fans recommended/essential? Still a little new to the watercooling stuff so I get the concept of push/pull just not the execution.


----------



## Syceo

What I would do is see how you get on in the stock configuration. Personally, I found the stock fan inadequate for the amount of heat the card was generating; I often found the card reaching its threshold, and it was then I decided to swap out the stock fan and use better-suited ones in a push/pull config. That instantly made a difference. Don't forget the ambient temp in the room will affect temps.

I went down the full custom water-cooling loop route simply because I added a 290X into the mix. So my recommendation is to get it installed as-is and then see how you get on.


----------



## xer0h0ur

Adding a second fan to the stock radiator on the 295X2 will make a few degrees difference. As for radiator fans I look for static pressure improvement not CFM.


----------



## Syceo

Quote:


> Originally Posted by *vinnyv11*
> 
> I have a Coolermaster Haf 932 http://www.coolermaster.com/case/full-tower/haf-932/. I was thinking of replacing the rear 140 fan with the rad in the back of the system since I have the other 3 fans in the unit to cool the other components (probably going to replace my CPU fan with watercooling). Is replacing the fans recommended/essential? Still a little new to the watercooling stuff so I get the concept of push/pull just not the execution.


Just had a look at the case; it should be alright in that, especially with that side-mounted fan.


----------



## vinnyv11

Quote:


> Originally Posted by *Syceo*
> 
> Just has a look at the case, should be alight in that especially with that side mounted fan


Thanks for the prompt reply. I'm thinking of using that rear fan slot as discussed until my water cooling is put in; then I may go to a top mount with the CPU radiator, as the case has three 120mm fan slots on top (the CPU radiator I want is the Corsair H100i).

The push/pull configurations are interesting. Just to confirm, is that just a matter of putting a fan on each side of the rad?

Also, just as a high-level question, what software are people using to monitor temps when testing (CPU/GPU etc.)?

Sorry for any basic questions, but this is a complete rebuild in the works here.

Current Build:

Asus Maximus VII Hero Board
Intel Core i7 - 4790K 4.0GHz
12 gb DDR3 Ram
256 GB Intel SSD
1TB WD 7200 Rpm
Sapphire R9 295X2
Sound Blaster Z
Coolermaster Haf 932 Case
Corsair GS800 V1
Samsung U28D590D 4K Monitor
Astro A50 Headphones

Upgrades Pending:

Additional SSD for Raid with current
RAM update (potential if new memory is compatible with 1150 board)
CPU water Cooling System
Custom Loop Water Cooling System (Down the line dream)


----------



## Syceo

Quote:


> Originally Posted by *vinnyv11*
> 
> Thanks for the prompt reply. I'm thinking of using that rear fan slot as discussed until my water cooling is put in then may go to a top mount with the CPU raditor as it has a 3 120M fan slot on top (radiator for CPU I want is the Corsair 100i).
> 
> The push/pull configurations are interesting. Just to confirm is that just a matter of putting a fan on each side of the rad?
> 
> Also just as a high level question what software are people using to monitor temps when testing (CPU/GPU etc.)
> 
> Sorry for any basic questions but this is a complete rebuild in the works here.
> 
> Current Build:
> 
> Asus Maximus VII Hero Board
> Intel Core i7 - 4790K 4.0GHz
> 12 gb DDR3 Ram
> 256 GB Intel SSD
> 1TB WD 7200 Rpm
> Sapphire R9 295X2
> Sound Blaster Z
> Coolermaster Haf 932 Case
> Corsair GS800 V1
> Samsung U28D590D 4K Monitor
> Astro A50 Headphones
> 
> Upgrades Pending:
> 
> Additional SSD for Raid with current
> RAM update (potential if new memory is compatible with 1150 board)
> CPU water Cooling System
> Custom Loop Water Cooling System (Down the line dream)


That's right, adding a fan on either side (making sure they are both blowing in the same direction). You can use a splitter and run the fans either straight off the card or off your mobo. I started with the SP120s as I had them from a previous build, then I replaced those with 2x Noctua NF-F12 2000rpm fans (but I didn't enjoy the noise). In terms of monitoring, you can use Afterburner, which will pretty much give you an overview of your GPU temps; I personally use this http://www.cpuid.com/softwares/hwmonitor-pro.html but that's just my own preference, as it's more comprehensive.


----------



## Mega Man

Quote:


> Originally Posted by *vinnyv11*
> 
> My card literally arrived at my house 10 minutes ago. I am pumped to get it thrown in my system but a little worried about the install because of the waterblock. Any tips for install? I'm relatively new to these high end cards so any advice for checking temps and installing would be helpful.


for temps use HWINFO

as to install you want the hoses on the bottom ( mount it horizontal or above the GPU ) per the manual


----------



## pompss

Selling my R9 295X2 waterblock from Aqua Computer, used for two weeks, like new!

http://www.overclock.net/t/1512006/fs-aquacomputer-kryographics-vesuvius-radeon-r9-295x2-full-coverage-liquid-cooling-block-nickel-acrylic-like-new


----------



## xer0h0ur

Quote:


> Originally Posted by *pompss*
> 
> Selling r9 295x2 waterblock from acquacomputer used for two week like new !!!
> 
> http://www.overclock.net/t/1512006/fs-aquacomputer-kryographics-vesuvius-radeon-r9-295x2-full-coverage-liquid-cooling-block-nickel-acrylic-like-new


So what did you end up doing with the card after all? Did you RMA it? Did you sell it?


----------



## ocvn

Quote:


> Originally Posted by *ragulih*
> 
> Yeah. But the problem occurs also with only mild OC, temps way under 70.










Maybe disable ULPS, and try every possible fix, including reinstalling the driver. Also compare a custom block with the reference WC. I found the drops might be due to the VRMs: before, when I tested with Heaven 4.0, I got quite a lot of frequency/usage drops with the stock fan on the WC. I opened the shroud and connected the VRM fan to a 4-pin 12V header to get maximum VRM fan speed; the card ran cooler but it still sometimes dropped. After replacing it with the WC block, no more.
Quote:


> Originally Posted by *joeh4384*
> 
> Does adding a 2nd fan improve temps? I noticed I get pretty close to the throttle point occasionally. I suspect I am running warmer due to cross-firing with a 290x. I think AMD should have set the throttle point a little bit higher maybe 80-85.


No. At 70 degrees, if you touch your rad it is already hot, not just warm, so at 80-85 I think it would burn or cause a problem with the loop (it would affect the internal pump too). Push/pull fans do help by a few degrees :thumb:


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> So what did you end up doing with the card after all? Did you RMA it? Did you sell it?


I sold the card for $930 on Craigslist after XFX replaced the card. Thank god the guy didn't know about the price drop.







The replacement from XFX didn't fix my problems.
In my experience this card was a complete disaster. A lot of stuttering, and a horizontal line coming up every 20-30 sec.... something like interference.
The gaming experience was bad for me. The core speed dropped frequently, even with the waterblock and even when the VRM temperature was only 45C.
One of my friends bought this card from Newegg and he had my same exact problems. He returned it and is planning to buy two GTX 780 Ti Kingpins.
I will go back to my GTX 780 Ti Kingpin or wait for the new GTX 980.
Playing at 1440p, that card is going to be enough until the new Acer 27'' IPS 144Hz comes out. Then I want to go SLI.


----------



## Satchmo0016

Has anybody tried putting a 240mm rad on the stock cooling system? I'm not sure how to go about fittings on the OEM tubes.

My 2nd replacement Sapphire R9 295X2 also throttles at 75C under normal 1440p gaming. The temp on the rad reads ~60C via thermal gun.

I don't really want to get a whole WC setup for the video card because I already have a AIO for CPU. But I think a bigger rad would make the difference.


----------



## pompss

Quote:


> Originally Posted by *Satchmo0016*
> 
> Has anybody tried putting a 240mm rad on the stock cooling system? I'm not sure how to go about fittings on the OEM tubes.
> 
> My 2nd replacement sapphire also r9 295x2 throttles at 75c under normal 1440p gaming. The temp on the rad reads ~60c via thermal gun.
> 
> I don't really want to get a whole WC setup for the video card because I already have a AIO for CPU. But I think a bigger rad would make the difference.


Even with a 360 rad you will not fix the issues, as the VRMs need to be water-cooled. You need to go with a full WC block so the VRMs are cooled down, and maybe you will get rid of the throttling, or at least get less of it.
I'm selling one for $180 here on OCN if you're interested.
It put my temperatures down to 45-50C at full load and 45-55C on the VRMs when overclocked; my card still had slight throttling issues, but it was a significant improvement.

My suggestion: if you don't want to fit a waterblock, just sell the card on eBay or return it, because the problem is not the core but the VRMs, which need to be cooled efficiently.


----------



## electro2u

Quote:


> Originally Posted by *pompss*
> 
> Even with a 360 rad you will not fix the issues as the vrm's need to be watercooled.U need to go with WC so THE VRM's are cooled down and maybe you will get rid of the Throttling or at least get less.
> I'm selling one for $180 if you interested here in OCN,
> Put my temperature down to 45-50c on full load and 45-55 vrm's in overclocked mode but my card was still having little throttling issues but was a significant improvement .
> 
> My suggestion is if you don't wanna put a water block just sell the card on ebay or return it because the problem its not the core but the vrm's that need to be cooled efficiently


Quote:


> Originally Posted by *pompss*
> 
> I sold the card for $930 on Craigslist after Xfx replace me the card . Thanks god the guy didn't know about the price drop.
> 
> 
> 
> 
> 
> 
> 
> . The replacement from XFX didn't fixed my problems
> In my experience this card was a complete Disaster. A lot of stuttering and horizontal line coming up every 20-30 sec.... something like interference.
> Gaming experience was bad for me. Core Speed drop frequently even with the waterblock and even if vrm temperature was only 45c.
> One of my friends bought this card from new egg and he had my same exact problems. He returned and planning to buy two gtx 780 ti Kingpin.
> I will go back to my gtx 780 ti kingpin or wait the new gtx 980.
> Playing at 1440p its gonna be enough with that card until the new Acer 27'' ips 144hz comes out.Then i wanna go SLI.


Kinda bizarre. No one else has all these issues besides you and your "friend". Quite frankly, the temp stuff quoted here is utter nonsense, and the fact that you had 2 cards with the same "issues" points at your system, not the 2 cards. If you were getting temps that low you shouldn't have been throttling. I'm also really doubting the load temp validity altogether.

It seems to me people spend too much time looking at graphs of GPU usage in AB and don't actually know what they mean. The card's not supposed to be at 100% GPU usage all the time. If the card was actually lowering its core speeds, that's throttling.

I would not be taking advice on setting this card up from someone who RMA'd one and then still couldn't get the replacement working right, apparently.

If you want to stop throttling, replace the thermal pads on the VRMs with Fujipoly Ultra.

Also, there is no way to monitor VRM temps on these cards, to my knowledge.


----------



## pompss

Quote:


> Originally Posted by *electro2u*
> 
> Kinda bizarre. No one else has all these issues besides you and your "friend". Quite frankly the temp stuff quoted here is utter nonsense and the fact you had 2 cards with same "issues" points at your system and not the 2 cards. If you were getting temps that low you shouldn't have been throttling. Also really doubting the load temp validity altogether
> 
> Seems to me people spend too much time looking at graphs of GPU usage in AB and don't actually know what it means. Cards not supposed to be 100% GPU usage all the time. If the card was actually lowering it's core speeds, that's throttling.
> 
> I would not be taking advice on setting this card up from someone who RMAd one and then still couldn't get the replacement working right, apparently.
> 
> If you want to stop throttling replace the thermal pads with fujipoly ultra


You don't know what you're talking about if you don't even know how to check VRM temps (never heard of an infrared temp gun?). This card has serious issues. Stop being a fanboy, please.
I don't know if we follow the same thread, but I see a lot of people complaining of throttling problems here in this thread and on the web in general.
The fact that you need to buy Fujipoly Ultra makes it clear that this card has serious issues with VRM overheating.
I was using an R9 290X with no issues, then a GTX 780 with no issues, then a GTX 780 Ti with no issues. I really doubt it's my system; most likely it's the card that has serious problems.
Also, this card has serious stuttering issues, making the gaming experience even worse.
You think it's normal that the core speed drops from 100% to 20% and fps from 60 to 30 every 10 sec, plus stuttering issues, and that you need to open the card (voiding the warranty) to install extreme pads or a waterblock to fix the issues?
Let's be serious and not fanboys, please.


----------



## electro2u

Yeah, you and your temp gun are full of it. I have a crap ton of complaints about the 295X2, but yours are FUD. I'm the guy who can't play FFXIV in Crossfire with the card and wants to throw it out the window. But BS is BS. Your definition of throttling is wrong, your understanding of how the card works is wrong, and there are VRMs on the top and bottom of the card and you can't monitor them in real time, period. I don't believe your stories, and you should take your revolving attempts to sell your abandoned block out of the thread.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> Yah you and your temp gun are full of it. I have a crap ton of complaints about the 295x2 but yours are FUD


What are all these complaints? Am I missing something here? My card is fine, lol. Should I be worried?


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> What are all these complaints? am i mssing something here . My card is fine lol should i be worried ?


Most of my complaints about the 295X2 have to do with Crossfire support. Two 290Xs work in Crossfire for FFXIV, but my 295X2 only uses one GPU in that game, which sucks because it's the game I bought it for, and then I couldn't return it even though it didn't work correctly. DX9 with the 295X2 is not necessarily going to work well. Then I try other things and they work fine. Software compatibility with the 295X2 is hit or miss. That's a valid complaint.

Besides that, the stock cooling is not adequate, which is a common enough complaint, but if your VRM temps are too high while under water (temp gun or not), it means you went cheap on the thermal pads, bro.


----------



## electro2u

Furthermore, I specifically asked pompss (pretty sure it was him; looking for the post) to help me troubleshoot FFXIV on the 295X2 when he replied to a post I made months ago about it, confirming that he too had issues with FFXIV. His response: he couldn't, the computer was in the shop. If the dude had a defective card he wouldn't have sold it; he'd have been asking for a refund. Now if it wasn't working with one specific game or something, that might be a little more on the money. To be fair, FFXIV works fine on my 295X2 in Win7. In Win 8.1 it only uses one GPU.

Here's the ding-ding-ding reply after pompss let slip that he even had problems with the 780 Ti Kingpin... which he also RMA'd, and presumably sold the replacement:
Quote:


> Originally Posted by *hotrod717*
> 
> With all those recent failed cards, I'd be looking at something else in your system. 3 bad cards in a row seems to be very,very high odds.


----------



## electro2u

Quote:


> Originally Posted by *pompss*
> 
> Guys, I have a problem.
> My second GPU doesn't get above 300 MHz when playing Crysis 3, and fps is 32-45.
> My PSU is a Seasonic SS-850KM (850 W), and I have a plug monitor where I can see how many watts my PC is drawing when gaming.
> The max I get is 600 W when playing, so it's not the PSU.
> Any advice? And can someone run some tests with Crysis 3 and tell me if the second GPU behaves the same as the first GPU, please?


He had it in windowed mode that time. There was nothing wrong.
Quote:


> Originally Posted by *pompss*
> 
> I still have problems with my R9 295X2.
> The fan is off and I need to push it to get it spinning.
> Also, with an old TN 24" monitor at full HD resolution I get 30 fps in Watch Dogs.
> I disabled ULPS and it still doesn't work well.
> In Tomb Raider and Crysis 3 I also get frame drops, from 100 fps to 32 for 20 seconds, and in Crysis 3 from 65 fps to 30 fps. This happens every 30 seconds.
> This is my second R9 295X2 that has given me problems.
> Well, lesson learned: never again AMD cards.


Quote:


> Originally Posted by *albertokr*
> 
> I'm using an MSI Z97 Gaming 5. Unfortunately I sent the PC to the shop to get it fixed and I can't try anything more.


No, I was wrong; this is the person I was thinking of. pompss's posts are memorable for fanboy accusations and other reasons, though.


----------



## pompss

Quote:


> Originally Posted by *electro2u*
> 
> Most of my complaints about the 295x2 have to do with crossfire support. 2 290x work in crossfire for FFxiv but my 295x2 only uses 1 GPU in that game, which sucks balls because it's the game I bought it for and then I couldn't return it even though it didn't work correctly. Dx9 with 295x2 is not necessarily going to work well. Then I try other things and they work fine. Software compatibility with the 295x2 is hit or miss. That's a valid complaint.
> 
> Besides that the stock cooling is not adequate which is a common enough complaint, but if your vrm temps are too high while under water (temp gun my nads) it means you went cheap on the thermal pads bro
> 
> Yah you and your temp gun are full of it. I have a crap ton of complaints about the 295x2 but yours are FUD. I'm the guy who can't play ffxiv in Crissfire with the card and wants to throw it out the window. But bs is bs. Your definition of throttling is wrong your understanding of how the card works is wrong and there are vrms on top and bottom of the card and you can't monitor them in real time period. I don't believe your stories and you should take your revolving attempts to sell your abandoned block out of the thread


I don't know what you've played all your life, but the R9 295X2 doesn't offer an optimal gaming experience. I went through two replacements, like Satchmo0016, and a lot of people are still having issues.
It was also my first Crossfire card, and I didn't know about the windowed-mode limitation, which was kind of disappointing. I guess you never owned a GTX 690, so you couldn't possibly understand the difference, but it seems you like buying and keeping garbage.
I also have a temp probe connected to my fan controller, which is pretty accurate. There is more than one way to check VRM temps, and the probe is one of them.
If you don't know about temp probes and infrared heat guns, it isn't worth spending more time explaining this to you.
Even the AMD driver release notes mention that they tried to fix stuttering issues, and they were still there when I was playing games.
The Kingpin I received a while ago had overheating issues, but after EVGA replaced the card I never had any issues with it.
I wish I had never sold that card for the R9 295X2!

And please stop quoting me when answering other people! Ignore me and I will do the same!


----------



## electro2u

Quote:


> Originally Posted by *pompss*
> 
> Ignore me and i will do the same !!


I don't need to ignore anything, and I will be looking to correct any further FUD posts you make in this thread. You have the for-sale forum available to get rid of your sad, lonely used block. I will keep quoting your embarrassing posts if you keep making fanboy accusations and offensive comments. Quite frankly, you were trying to sell someone your block in this thread by claiming they need to go water to avoid throttling. This is incorrect. The stock thermal pads aren't very good, and replacing them helps immensely. The throttling most likely comes from overheating VRMs, and going to a custom water loop isn't the only solution.

The point stands: you are quoting VRM temps that make no sense and claiming the card you sold would throttle when temps were fine. It's not rocket science.


----------



## ragulih

Quote:


> Originally Posted by *ocvn*
> 
> Disable ULPS, maybe; try every possible fix, including reinstalling the driver. Also, comparing a custom block with the reference AIO cooler, I found the drops might be due to the VRMs: before, testing with Heaven 4.0 on the stock cooler, I got quite a lot of drops in frequency/usage. I opened the shroud and connected the VRM fan to a 4-pin 12 V header to get maximum fan speed; the card ran cooler, but sometimes it still dropped. After fitting the water block, no more.


Yep, the game where I need the performance gets the VRMs really hot. I'm getting page faults, and I guess it's from VRM throttling. I checked the backplate's temperature with my finger after a few hours of racing; man, it was hot. The card works fine with a few other games, though. I reinstalled Windows and all the drivers and moved the card to the lower PCIe slot (temps got way cooler, at least on the GPUs), but it didn't have any effect on the real issue.

I have a race today, so later this evening I'll try your VRM fan mod and probably get a proper water block later on =) Thanks for the help!

Is this EK-KIT L240 (http://www.ekwb.com/shop/kits-cases/kits/ek-kit-l240.html) enough for GPU+CPU at first? Later on I'll get a second R9 295x2; I was planning to cool the GPUs with this and buy a separate CPU loop. I've never tried or gotten familiar with custom water cooling, so it would be easier to start with a starter kit if possible.


----------



## ocvn

Quote:


> Originally Posted by *ragulih*
> 
> Yep the game where I need performance gets VRM rly hot. I'm getting pagefaults, and I guess it's because VRM throttling. Checked the backplate's temps with my finger after few hours of racing, man it was hot. The card works fine with few other games tho. I reinstalled windows and drivers all of it, changed place of the card to the lower PCIe spot (temps got way cooler, at least with GPUs), but didn't have any effect to the real issue.
> 
> I have race today so later this evening I'll try your vrm fan mod and propably get proper wc block later on =) Thanks for the help!
> 
> Is this EK-KIT L240 (http://www.ekwb.com/shop/kits-cases/kits/ek-kit-l240.html) enough for GPU+CPU at first? Later on I'll get second R9 295x2 and was planning to cool GPUs' with this and buy separate wc for CPU. I have never tried nor got familiarized with this custom wc stuff so it would be easy to start with starter kit if it's possible


I don't know about that kit, but in my experience it won't be enough. For a CPU it's fine, but for GPUs, like my dual 295X2s, one EK 360 rad plus a thin EK 240 with Koolance blocks keeps my delta-T around 30. Next week I'll replace them with dual Monsta 480s to see whether I can lower it further. That's just my experience.
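As a sanity check on delta-T figures like the one above, a common community rule of thumb is that a 120mm radiator section sheds on the order of 100 W at a 10 °C coolant-to-air delta with moderate fans, scaling roughly linearly with delta-T. A sketch under those assumptions (the 100 W per 120mm figure and the heat loads are rough assumptions, not measured values):

```python
def estimate_delta_t(heat_watts, rad_120mm_sections, watts_per_120_at_10c=100.0):
    """Rough coolant delta-T estimate: each 120mm radiator section
    dissipates ~100 W at a 10 C delta with moderate fans, and
    dissipation scales roughly linearly with delta-T."""
    capacity_per_c = rad_120mm_sections * watts_per_120_at_10c / 10.0
    return heat_watts / capacity_per_c

# dual 295X2s (~550 W each under load) on a 360 + 240 (5 x 120mm sections)
print(round(estimate_delta_t(1100, 5), 1))  # → 22.0
```

With quiet fans the per-section figure drops well below 100 W, which is why a measured delta-T around 30 on this much hardware is plausible, and why more radiator area (like the Monsta 480s) brings it down.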


----------



## tiefox

Guys, I recently ordered a 295X2 from Amazon and should get it next week. Does anyone know the correct thickness for the replacement thermal pads for the VRMs? (Planning on getting the Fujipoly Ultra.)


----------



## pompss

Quote:


> Originally Posted by *ragulih*
> 
> Yep the game where I need performance gets VRM rly hot. I'm getting pagefaults, and I guess it's because VRM throttling. Checked the backplate's temps with my finger after few hours of racing, man it was hot. The card works fine with few other games tho. I reinstalled windows and drivers all of it, changed place of the card to the lower PCIe spot (temps got way cooler, at least with GPUs), but didn't have any effect to the real issue.
> 
> I have race today so later this evening I'll try your vrm fan mod and propably get proper wc block later on =) Thanks for the help!
> 
> Is this EK-KIT L240 (http://www.ekwb.com/shop/kits-cases/kits/ek-kit-l240.html) enough for GPU+CPU at first? Later on I'll get second R9 295x2 and was planning to cool GPUs' with this and buy separate wc for CPU. I have never tried nor got familiarized with this custom wc stuff so it would be easy to start with starter kit if it's possible


You can buy some Fujipoly thermal pads and see how it goes.
For this card I would get a water block, in my opinion, as the VRM fan doesn't cool the VRMs properly.


----------



## xer0h0ur

Quote:


> Originally Posted by *tiefox*
> 
> Guys. i recently ordered a 295x2 from Amazon and should get it next week. Anyone knows the correct thickness for the replacement thermal pads for the VRMs ? ( Planning on getting the Fujipoly Ultra )


Nothing is definite, really. You can get the pads according to the install instructions for the brand of water block you're using, but that won't guarantee perfect contact on the first install. On my EK block I used 0.5 mm, 1 mm and 1.5 mm pads. The thing is that you should test-fit it just to make sure you're getting contact between the block and the components it's supposed to be cooling. For instance, I had to double up the pads in two places because no contact was being made. You will only see that by putting it together and then taking it apart again.
Quote:


> Originally Posted by *ragulih*
> 
> Yep the game where I need performance gets VRM rly hot. I'm getting pagefaults, and I guess it's because VRM throttling. Checked the backplate's temps with my finger after few hours of racing, man it was hot. The card works fine with few other games tho. I reinstalled windows and drivers all of it, changed place of the card to the lower PCIe spot (temps got way cooler, at least with GPUs), but didn't have any effect to the real issue.
> 
> I have race today so later this evening I'll try your vrm fan mod and propably get proper wc block later on =) Thanks for the help!
> 
> Is this EK-KIT L240 (http://www.ekwb.com/shop/kits-cases/kits/ek-kit-l240.html) enough for GPU+CPU at first? Later on I'll get second R9 295x2 and was planning to cool GPUs' with this and buy separate wc for CPU. I have never tried nor got familiarized with this custom wc stuff so it would be easy to start with starter kit if it's possible


I would not use anything less than a 360 on that 295X2. You basically need 120mm per GPU, plus an extra 120mm for added performance or overclocking. That means for your CPU + 295X2 you should lean towards a 480, or two 240s, or split it any way you like.
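The sizing rule above is simple arithmetic; a sketch, assuming the common "120mm per heat source plus headroom" heuristic (these are rules of thumb, not measurements):

```python
def recommended_rad_mm(gpu_count, cpu_count=1, headroom_120s=1):
    """Rule of thumb from the post above: ~120mm of radiator per GPU core
    (a 295X2 counts as two), ~120mm per CPU, plus one extra 120mm
    section of headroom for performance or overclocking."""
    return 120 * (gpu_count + cpu_count + headroom_120s)

# one 295X2 (two GPU cores) + CPU
print(recommended_rad_mm(2))  # → 480, i.e. a 480 rad or two 240s
```

Adding a second 295X2 to the loop later, as ragulih plans, pushes the total to 720mm by the same rule, which is why people in this thread run 360 + 240 or larger per card.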


----------



## Velict

Is the tubing long enough on this card to fit in the front of a mercury s5?


----------



## DarwinTheCat

Guys, I recently upgraded from a Maximus V Extreme to an ASUS X99-Deluxe / i7-5930K / DDR4-2800. My Sapphire R9 295X2 worked like a charm on the previous system. On my new Haswell-E LGA 2011-3 system, I'm experiencing an extremely weird issue:

In a couple of demanding games like Watch Dogs and ARMA 3, whenever I Alt-Tab to the desktop, my PC suddenly reboots. No blue screen; it just reboots. The weird thing is that within the game itself, no matter how long I play, I don't experience the crash. It happens only when Alt-Tabbing back to Windows.

- My ASUS X99-Deluxe has the latest BIOS.
- An AIDA64 system stress test (CPU, memory and GPU, 30 minutes) runs rock solid.
- The R9 295X2 isn't overclocked at all.
- The PSU is an AX1500i (1500 W). I also followed Corsair's blog suggestion of switching the PSU to single-rail mode.
- The crash happens on the 14.4 as well as the 14.7 RC3 Catalyst drivers.

What could be causing the Alt-Tab crash? I'm lost. The resolution/refresh-rate switch when Alt-Tabbing seems to be what triggers it. Other than that, I have no clue what's going on.

Help


----------



## electro2u

Quote:


> Originally Posted by *DarwinTheCat*
> 
> Guys. I recently upgraded from a MAXIMUS V EXTREME to an ASUS x99-Deluxe / i5930k / DDR4 2800. My Sapphire R9 295x2 worked like a charm on the previous system. On my new Haswell-E 2011-3 system, I'm experience an extremely weird issue:
> 
> In a couple of demanding games like Watch Dogs and ARMA 3, whenever I ALT-TAB to the desktop, my PC suddenly reboots. No blue screen, it just reboots. The weird thing is that within the game itself, no matter how long I'm playing, I don't experience the crash - it is only when alt-tabbing back to Windows.
> 
> - My ASUS x99-Deluxe has the latest bios.
> - AIDA64 system stress benchmark (with CPU, memory and GPU for 30 minutes) runs rock solid.
> - The R9 295x2 isn't overclocked at all.
> - PSU is an AX1500i 1500w. I also followed Corsair's blog suggestion of switching the PSU to single rail mode.
> - The crash happens in 14.4 as well as 14.7RC3 Catalyst drivers.
> 
> What could be causing the ALT-TAB crash? I'm lost. The resolution / refresh rate switch when alt-tabbing seems to be what's causing the problem. Other than that, I have no clue as what's going on.
> 
> Help


Can't. Don't Alt-Tab. Unfortunately, CrossfireX only works in full-screen mode; you simply cannot go to the desktop while in game. It's a very big disadvantage versus SLI. Can't say I've ever seen mine reboot over it, but it's just a no-no. Not sure how your previous system worked like a charm.


----------



## DarwinTheCat

Quote:


> Originally Posted by *electro2u*
> 
> Can't. Don't alt-tab. Unfortunately, crossfirex only works in full screen mode--you simply cannot go to the desktop while in game. It's a very big disadvantage to SLI. Can't say I've ever seen mine reboot over it but it's just a no-no. Not sure how your previous system worked like a charm.


Huh? This was never an issue with my previous crossfired 7970s. On the Maximus V Extreme with the R9 295X2 I had no issues Alt-Tabbing at all.


----------



## electro2u

Quote:


> Originally Posted by *tiefox*
> 
> Guys. i recently ordered a 295x2 from Amazon and should get it next week. Anyone knows the correct thickness for the replacement thermal pads for the VRMs ? ( Planning on getting the Fujipoly Ultra )


It depends on what cooler you are using, but I realize this concerns the stock setup and just changing the pads out. You can't really go "too thick" easily; thicker Fuji pads get very expensive, but they will squish down if they're a bit thick. My advice is actually to use the 11 W/mK rated pads, not the pricey 17 W/mK ones, unless you aren't worried about the expense, and go with 1 mm thickness over the 0.5 mm to improve contact. That's what I'm using. There are VRMs on both the top and bottom. You don't need to replace the pads on the RAM chips. The EK schematics show very well which areas need to be replaced; the VRMs/MOSFETs are what need cooling, so those are all the #2 areas in the EK instructions.
http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869055.pdf
For the #3 areas I used the stock pads, which are a little thicker at 1.5 mm IIRC, but don't quote me on that. EK calls those areas between the VRMs "PCB" cooling, and I would assume they are slightly less critical.


----------



## electro2u

Quote:


> Originally Posted by *DarwinTheCat*
> 
> Uh? This was never an issue with my previous cross fired 7970s. On the MAXIMUS V EXTREME and R9 295x2 I had no issues alt-tabbing, at all.


I feel ya. Very weird card with spotty software and crossfirex support. Exactly my frustration. These are legitimate complaints.

Google "crossfire alt-tab" and you'll see it's been going on a long time.


----------



## pompss

Quote:


> Originally Posted by *DarwinTheCat*
> 
> Uh? This was never an issue with my previous cross fired 7970s. On the MAXIMUS V EXTREME and R9 295x2 I had no issues alt-tabbing, at all.


AMD threw the drivers for this card together sloppily. I remember the 7970 was an amazing card.


----------



## Syceo

Hi guys, can anyone tell me if I should upgrade my mobo?

I'm currently using the Asus Z87-Pro, and as I understand it the 295X2 requires a 16-lane slot all to itself. I'm running tri-fire at the moment, but on this board (if I'm correct) the 295X2 and the 290 will be sharing bandwidth, which I presume will cost me performance. Any suggestions would be greatly appreciated.

Also, suggestions for a gold-themed or yellow-and-black board (that allows multiple PCIe 3.0 x16 slots), please, guys.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Hi Guys , can anyone tell me if i should upgrade my mobo ,
> 
> Currently using the Asus pro z87 and as i understand it the 295x2 requires a 16 lane slot all to itself. Im running tri-fire at the moment but on this board (if im correct) the 295x2 and the 290 with be sharing bandwidth which i presume will cost me in performance. Any suggestions would be greatly appreciated
> 
> And gold themed board or yellow and black ( that will allow for multiple 3.0 x 16x ) suggestions please guys


The 295X2 has a PLX chip onboard, which means you don't need to upgrade... it only needs a PCIe x8 slot.

But your gold-motherboard question makes me think you will anyway.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> The 295x2 has a PLX chip onboard, which means you don't need to upgrade... It only needs 8x PCIE
> 
> but your gold motherboard question makes me think you will anyway.


Totally had a feeling you'd come to the rescue; how are you doing, buddy? I think I have pumped enough money into this for now; I can't justify any more expense unless it's absolutely necessary, which it will probably end up being.


----------



## JoeGuy

Hey chaps,
I was wondering if you could answer a bit of an experience question.

I have the option of picking one of these up for €800, which is a huge €500 drop and I already like AMD cards from past gens, so it's just about the experience.

Do you feel like the performance, acoustic levels and Xfire compatibility have been something you've enjoyed for that sort of money? Or, alternatively, has there been any major drawback that a different solution would resolve?

Thanks very much.


----------



## Mega Man

Personally I prefer single-card solutions.

For needing dual GPUs on a single card, it works great.

I don't seem to have CFX issues, but some do, as some do with SLI.


----------



## electro2u

I'm happy with my 295X2, but it's been a struggle. I wish CrossfireX were different (more like SLI) and I wish DirectX 9 support were there from AMD, but other than that it's great hardware, IMO. It's not really for overclocking, though, and I think my assessment is pretty fair.


----------



## pompss

Quote:


> Originally Posted by *JoeGuy*
> 
> Hey chaps,
> I was wondering if you could answer a bit of an experience question.
> 
> I have the option of picking one of these up for €800, which is a huge €500 drop and I already like AMD cards from past gens, so it's just about the experience.
> 
> Do you feel like the performance, acoustic levels and Xfire compatibility has been something you've enjoyed for that sort of money? Or alternatively has there been any major draw back that a different solution would resolve?
> 
> Thanks very much.


I would suggest you wait until the new GTX 980 is out (Oct/Nov). This could cause AMD to drop the price of the 290X, and maybe another drop for the 295X2.
What I hear is that AMD is planning to price the 290X at $449.00, which would make it a much better solution than the R9 295X2.
The R9 295X2 was giving me a lot of headaches, and I see more and more people having throttling problems that you may or may not be able to fix.


----------



## DMatthewStewart

Quote:


> Originally Posted by *EliteReplay*
> 
> R9 295x2 ITX Build


What is that sweet little case? You're not going to cram a full 'puter in that, are you? Or, more importantly, can you? It looks like it would make a sweet video-card case.


----------



## Syceo

Quote:


> Originally Posted by *JoeGuy*
> 
> Hey chaps,
> I was wondering if you could answer a bit of an experience question.
> 
> I have the option of picking one of these up for €800, which is a huge €500 drop and I already like AMD cards from past gens, so it's just about the experience.
> 
> Do you feel like the performance, acoustic levels and Xfire compatibility has been something you've enjoyed for that sort of money? Or alternatively has there been any major draw back that a different solution would resolve?
> 
> Thanks very much.


I can only speak for myself. I previously had a pair of GTX 780s in SLI. There was absolutely nothing wrong with them aside from occasional heat issues in the configuration I was running. In any case, I decided to take the plunge and go for a single-card solution with the 295X2. Despite reservations about the heat (given that the chips naturally run hot) and stories of driver issues and blah blah blah, I still went ahead with AMD and this monster of a card. I can say wholeheartedly that I am happy with my decision. I personally have not had any major issues that I couldn't find a fix for (the Overclock community here is great). In terms of performance... well, this card does not disappoint, so much so that I went ahead and paired it with a 290X to get even more bang for the buck. I am not a fanboy, having been with Nvidia, but I have to say AMD did well on this beastly card. OK, some may be experiencing issues with this card, but to be honest you're going to get that whichever way you go. People say wait, wait, wait for the 900 series, but what's the point? You will always be chasing the next card... that's the nature of the game. I say if you have your eye set on a 295X2, at the ridiculous prices they're now being sold at, then go for it. What the hell... pick up two of them and you're set for the next few years, easy.

Just my opinion


----------



## JoeGuy

I appreciate people's feedback.

The difficulty I have is that the €800 price promotion seems to end on the 19th, the same day Nvidia's 9xx-series NDAs lift.

So I don't want to miss the savings if it can give me a manageable Xfire solution. I don't think I could cool two 290Xs without blocks, and I don't like the fan noise of two cards. Unless SLI 980s are 15% faster, quiet and <€900, this is my only affordable two-way solution at this performance level.

I just want to gauge whether people feel it gives you pretty much full Xfire performance with a reasonable experience. Do you feel it has created any additional hassle compared to two-way Xfire?


----------



## pompss

People also complain about stuttering issues.
In my case, playing Crysis 3 at 1440p I had frame drops from 60 to 30 all the time, even with a full water block on it. I couldn't fix that, and it's most likely a driver problem; AMD is not really good at providing solid drivers.
I didn't have any problems with synthetic benchmarks, but when it came to playing games it was not the best experience.
I would suggest giving it a try anyway if the price is good. Just be sure you have a 30-day return policy in case you don't like it.


----------



## Syceo

Quote:


> Originally Posted by *JoeGuy*
> 
> I appreciate peoples feedback.
> 
> The difficulty I have is the €800 price promotion seems to end on the 19th, the same day Nvidia releases the 9XX series NDA's.
> 
> So I don't want to miss the savings if if can give me a manageable Xfire solution. I don't think I could cool 2 x 290X's without blocks and don't like the fan noise of 2 cards. Unless SLI 980's are 15% faster, quiet and <€900, this is my only affordable 2-way solution at this perf.
> 
> I just want to gauge if people feel like it gives you pretty much Xfire performance with a reasonable experience? Do you feel like it has created any additional hassle compared to 2-way Xfire?


All in all, it comes down to your budget and expectations.

If you can afford to live with a stock 295X2 and add a better fan solution (as in my opinion the stock solution simply isn't adequate), and are also prepared to live with an increased room temperature and niggling driver issues (in some instances), then go for it.

If you can afford to get a 295X2 and add a custom cooling loop and block, then go for it.

If all you want to do is game at 1440p and the highest settings, there are tons of solutions less expensive than the above... then go for it.

If you want performance that's rumoured to be less than a GTX 780 Ti's but more than a GTX 780's, from a card predicted to be more efficient than previous ones, then wait for the 980.

You may be disappointed if you think a 980 SLI setup is going to be 15% faster, quieter and less than $450 per card compared to a 295X2 (quieter, maybe, but cheaper plus 15% faster... I'm not sure about that).

I personally have not experienced any stuttering during gameplay at 4K, so I really can't comment on pompss's experience.

As mentioned before, it's all about your expectations and how deep your pockets are.


----------



## pompss

The GTX 980 is coming out this month for sure.
Just wait; you may get two of these for 800 euro.


----------



## boss9967

Hi there, I need advice.

What brand do you recommend for a new 295X2?
I'm talking about warranty and customer service.


----------



## Syceo

Quote:


> Originally Posted by *boss9967*
> 
> Hi there , need advice
> 
> what Brand do you recommended for a new 295x2 ?
> I'm talking about warranty and customer service .


I'd go MSI; you can stick a block on it and apparently still retain the warranty.


----------



## joeh4384

I went XFX but if I was more patient, I would have picked MSI because of the positive RMA experience I had with my 290x.


----------



## JoeGuy

Thanks very much for the advice.
I decided to order the PowerColor 295X2 and see how I like it.
I can always use their 14 day return policy if the 9XX series blows everyone away for price or I'm unsatisfied with the Xfire drivers..

Cheers.


----------



## electro2u

Quote:


> Originally Posted by *JoeGuy*
> 
> Thanks very much for the advice.
> I decided to order the PowerColor 295X2 and see how I like it.
> I can always use their 14 day return policy if the 9XX series blows everyone away for price or I'm unsatisfied with the Xfire drivers..
> 
> Cheers.


If newegg had a 14 day return policy on these I would have. I'm living with it but win 8.1 and my setup don't work well with the 295x2. Win7 is more forgiving.


----------



## Mega Man

Quote:


> Originally Posted by *Syceo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boss9967*
> 
> Hi there , need advice
> 
> what Brand do you recommended for a new 295x2 ?
> I'm talking about warranty and customer service .
> 
> 
> 
> Id go MSI, you can stick a block on it and apparently still retain the warranty.

Quote:


> Originally Posted by *joeh4384*
> 
> I went XFX but if I was more patient, I would have picked MSI because of the positive RMA experience I had with my 290x.


Either one will (XFX or MSI), IN THE US, not worldwide.


----------



## pompss

Quote:


> Originally Posted by *JoeGuy*
> 
> Thanks very much for the advice.
> I decided to order the PowerColor 295X2 and see how I like it.
> I can always use their 14 day return policy if the 9XX series blows everyone away for price or I'm unsatisfied with the Xfire drivers..
> 
> Cheers.


Keep an eye on the GTX 980. Maybe you can get an SLI GTX 980 configuration for the same money. One GTX 980 will reportedly consume 175 W, which is pretty amazing, and have the same performance as the GTX 780 Ti or maybe faster. The new card should be available between the 19th and the end of the month!
In Europe you definitely want a video card that consumes as little power as possible; when I was living in Italy, electricity was pretty expensive.


----------



## Syceo

Hi everyone, quick question. I just received the missing backplate for my card. I didn't realise it doesn't come with a thermal pad in the box. I don't have time to wait two days for a thermal pad just for the backplate, so can I use the pads that were on the original card before I added the water block?

Thanks


----------



## Syceo

Do I even need thermal pads between the backplate and the card?


----------



## electro2u

Yes, you do. There are VRMs on the back of the card. The stock pads are meh, but they will be fine for now.


----------



## Syceo

The manual that came with the card doesn't mention applying any pads... is that an absolute must?


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> The manual that came with the card doesn't mention applying and pads... Is that and absolute must?


The card comes with the pads already installed on the stock backplate, doesn't it? If not, I'm too young to be losing my mind.

Yeah, they wouldn't put anything about disassembling the card in the manual. Just use the stock pads from the stock backplate, or wait till you can get some pads.


----------



## tiefox

Has anyone tested ARMA 3 running tri-fire at 1440p?

I could only find benchmarks online for ARMA 3 on a single 295X2.

My XFX 295X2 will arrive this week at my friend's home in Orlando, and I'll be there on vacation next month to pick it up. I'm thinking about picking up a 290X as well for tri-fire. (I'm from Brazil and prices here are prohibitive.)


----------



## boss9967

Thanks for the advice, Syceo. I already placed my order on Newegg for the MSI 295X2.

Syceo, do you recommend tri-fire with a 290X? ....


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> The card comes with the pads already installed on the stock backplate, doesn't it? If not I'm too young to lose my mind.
> 
> Yeah they wouldn't put anything about disassembling the card in the manual. Just use the stock pads from the stock backplate or wait til you can get some pads.


Yeah, sorted, cheers. The pads on the stock shroud were only about 0.5 mm thick, so I had to double up, but I reckon it will do.
Quote:


> Originally Posted by *boss9967*
> 
> Thanks for the advice Syceo, I already placed my order on Newegg for the MSI 295X2.
> 
> Syceo, do you recommend tri-fire with a 290X?


Yup, sure did. I paired the 295X2 up with a 290X.



@electro, couldn't get the flowbridge so I opted for an alternative.


----------



## electro2u

That's a neat setup Syceo. I think I'll be using EK blocks on my next build.


----------



## Chip Pippins

Just picked one up. Saw the performance numbers from Maxwell and wasn't impressed. For $999 this card is certainly not a bad deal (relatively speaking, who knows how cheap it was to make). I'm upgrading from a single r9 270x.


----------



## pompss

Quote:


> Originally Posted by *Chip Pippins*
> 
> Just picked one up. Saw the performance numbers from Maxwell and wasn't impressed. For $999 this card is certainly not a bad deal (relatively speaking, who knows how cheap it was to make). I'm upgrading from a single r9 270x.


GTX 980 performance numbers are just rumors and speculation. Why not wait two more days and see the real performance, eh?
Buying this card now isn't very smart with the GTX 980 coming out. If the price ends up at $499 it's a must-buy, since you can get two of them for $999 and get more performance plus lower power consumption.
Remember the GTX 980 draws only 170 W, which is very impressive.


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> GTX 980 performance numbers are just rumors and speculation. Why not wait two more days and see the real performance, eh?
> Buying this card now isn't very smart with the GTX 980 coming out; if the price ends up at $499 it's a must-buy, since you can get two of them for $999 and get more performance plus lower power consumption.
> Remember the GTX 980 draws only 170 W, which is very impressive.


If they are rumors and speculation, how do you know it will outperform Hawaii?


----------



## pompss

Quote:


> Originally Posted by *rdr09*
> 
> If they are rumors and speculation, how do you know it will outperform Hawaii?


Not 100% sure, but do you think Nvidia will release a new video card that performs worse than the 290X, which is now priced at $449?
I'm just saying wait two days and see instead of jumping on a $999 card.


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> Not 100% sure, but do you think Nvidia will release a new video card that performs worse than the 290X, which is now priced at $449?
> I'm just saying, why not wait two days and see instead of jumping on this card?


i don't want to speculate either. you want me to?


----------



## pompss

Quote:


> Originally Posted by *rdr09*
> 
> i don't want to speculate either. you want me to?


Sure go ahead


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> Sure go ahead


The 980 will be slower than the 780 Ti. One thing is for sure... it is designed for 1080p.


----------



## pompss

Quote:


> Originally Posted by *rdr09*
> 
> The 980 will be slower than the 780 Ti. One thing is for sure... it is designed for 1080p.


Lol... designed for 1080p.

If this happens we will see Nvidia get killed by AMD.

That's for sure!


----------



## pompss

Maybe slower than a GTX 780 Ti Classified, yes.
This is why I think it's better to wait and see instead of buying any card right now.


----------



## Syceo

If Nvidia really makes this card slower than a 780 Ti, all hell will break loose, that's for sure. I personally can't see them making such an epic fail... it just wouldn't make sense.


----------



## Chip Pippins

Quote:


> Originally Posted by *pompss*
> 
> Gtx 980 performance are just rumors and speculation. Waiting 2 day more and see the real performance no ehh ??
> Buying this card now its not very smart as the gtx 980 its coming out. if the price will be $499 its a must buy as you can get two of them for $999 and get more performance + less consumption.
> remember the the gtx 980 consume only 170 watt which is very impressive


If the performance is that much different from what has been leaked so far, I would consider a return. I'm suspecting at most we'll see a 10-15% increase in performance over a set of GTX 780s (for the 980). The GTX 980s will probably retail somewhere between $500-$600. So for a set of two it would cost between $1000-1200. The 295x2 cost $1005 with shipping.

Granted they will have lower power consumption than a 295X2, but it's not a huge issue for me; power is very cheap here and I have the PSU to pump the watts required.

That being said, I'll keep her in the box until I see a few 980 benchmarks just to be sure.
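For what it's worth, the arithmetic behind that comparison, using only the figures quoted in this thread (the 980 prices were leaks/rumors at the time, so treat them as placeholders, not confirmed retail pricing):

```python
# Back-of-the-envelope cost comparison using the figures quoted in this thread.
# The GTX 980 prices were rumors at the time of posting; placeholders only.

r9_295x2_paid = 1005          # $ as paid for the 295X2, with shipping
gtx_980_range = (500, 600)    # rumored single-card retail range, $

# Doubling the single-card price gives the SLI estimate
sli_low, sli_high = (2 * p for p in gtx_980_range)
print(f"980 SLI estimate: ${sli_low}-${sli_high} vs 295X2 at ${r9_295x2_paid}")
```

So at the rumored prices, a 980 SLI pair lands roughly on top of the 295X2's price, which is why the "wait two days" argument kept coming up.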


----------



## pompss

One of my friends who works in PC component distribution told me that the retail price for the GTX 980 will be $450.

That's a really good price.


----------



## Rayleyne

So I was looking at two of these to drive 7680x1440. Question is, can I find traditional waterblocks? No CLC for me.


----------



## mojobear

Hi guys, for those living in Canada: TigerDirect has the R9 295X2 for $1120, which is almost exactly the US price of $999 plus exchange. That's pretty amazing for us.

A friend just got one today for that price, muahaha.









http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=8971203&CatId=7387


----------



## pompss

Quote:


> Originally Posted by *Rayleyne*
> 
> So i was looking at two of these to do 7680x1440, Question is can i find traditional waterblocks, No CLC for me


Yes, you can buy universal waterblocks or full-cover blocks from EK or Aqua Computer.


----------



## doctakedooty

So here it is guys a 295x2 stuffed into the corsair 250d.


----------



## electro2u

Quote:


> Originally Posted by *doctakedooty*
> 
> So here it is guys a 295x2 stuffed into the corsair 250d.


Makes me think of


Like, uh... How's he supposed to fit THAT in THERE?


----------



## doctakedooty

Quote:


> Originally Posted by *electro2u*
> 
> Makes me think of
> 
> 
> Like, uh... How's he supposed to fit THAT in THERE?


I don't know, but it did. I haven't seen anyone cram one into the 250D yet, so I figured I'd be the first. Once I get my block in from FrozenCPU and my Maximus VII Impact arrives, I can get this build done. Next is running a custom loop, making a custom midplate, and putting acrylic windows in the top and the side.


----------



## joeh4384

I ended up pulling my 290X. It was causing the 295X2 to drop its clocks to 300 MHz for seconds at a time, with nasty stuttering. I suspect the VRMs didn't like the blast of heat from the 290X. One 295X2 is enough for 1440p, so I think I will just run with one card. Do you think installing a cooling mod like a Kraken would keep the 290X from overheating the 295X2's VRMs?


----------



## electro2u

Can't help but wonder whatever happened to HoneyBadger


----------



## xer0h0ur

Quote:


> Originally Posted by *Chip Pippins*
> 
> Just picked one up. Saw the performance numbers from Maxwell and wasn't impressed. For $999 this card is certainly not a bad deal (relatively speaking, who knows how cheap it was to make). I'm upgrading from a single r9 270x.


Lol I also went from a 270 to this beast. It was worth the upgrade even though I paid the higher price with a "free" 500GB Evo SSD.


----------



## shadow85

Quote:


> Originally Posted by *Chip Pippins*
> 
> Just picked one up. Saw the performance numbers from Maxwell and wasn't impressed. For $999 this card is certainly not a bad deal (relatively speaking, who knows how cheap it was to make). I'm upgrading from a single r9 270x.


How can you justify this when the card isn't even released? You won't know the real performance till it's out. I was close to buying a 295X2 myself but will definitely wait for 980 SLI results now.


----------



## Chip Pippins

Quote:


> Originally Posted by *shadow85*
> 
> How can you justify this when the card isn't even released. You wont know the real performance till it's released. I was close to buying a 295X2 myself but definatly will wait for 980 SLI results now.


Idk, I was basing it off leaked performance benchmarks. If they are cheap and better than 780 Tis I'll probably send back the 295. It was also kind of an impulse buy; I really want a better card.


----------



## joeh4384

Looks like Nvidia is releasing the 980 at $600 and the 970 at $400. I am laughing at all the people who expected Nvidia to release at low prices.


----------



## shadow85

Quote:


> Originally Posted by *Chip Pippins*
> 
> Idk, I was basing it off leaked performance benchmarks. If they are cheap and better than 780 Tis I'll probably send back the 295. It was also kind of an impulse buy; I really want a better card.


Yes, I know the feeling. I have done it myself in the past, just buying a GPU for the sake of wanting something new. But I have learned and held off this time, thankfully.


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> I ended up pulling my 290x. It ended up causing the 295x2 to drop its clocks down to 300 mhz for seconds causing nasty stuttering. I suspect the VRMs didnt like the blast of heat from the 290x. 1 295x2 is enough for 1440p so I think I will just run with 1 card. Do you think installing a cooling mod like a kraken would help prevent the 290x from causing the 295x2 VRMs to over heat?


Hmm, so both of your cards are on stock coolers? A friend of mine is selling me his reference 290X that he barely used for a couple of weeks, so I am going to tri-fire it with my waterblocked 295X2. The 290X is going to be air cooled, though, until I can afford to extend the loop and drop an EK block on it. Do you know for certain it's VRM temps causing the throttling? Your cores aren't reaching throttling temps? Also, which driver are you on when you see this issue?


----------



## joeh4384

Both coolers were stock. I was running 14.7. I had an MSI Gaming 290X, so its heat is dumped right into the case. The clocks would drop to 300 MHz for a few seconds after a good 15-20 minutes of heavy use. Since then, I removed the 290X and was able to stress test with the clocks never dropping. I suspect it would work OK with a reference 290X, since its heat is exhausted out the back.


----------



## xer0h0ur

Okay, thanks for the heads up. I expect the package today so when I get off work tonight I will let you know my own results/experience.


----------



## pompss

Guys, just wait and buy a pair of GTX 970s in SLI instead of the R9 295X2. Same performance, less power consumption, less heat, no throttling, and 40% cheaper.

Just to let you know.


----------



## Syceo

Quote:


> Originally Posted by *pompss*
> 
> Guys, just wait and buy a pair of GTX 970s in SLI instead of the R9 295X2. Same performance, less power consumption, less heat, no throttling, and 40% cheaper.
> 
> Just to let you know.


Yeah... you have been letting everyone know for at least a week how much you hate the 295X2.

zzzzzzzz

Pretty sure everyone in this owners club gets the message...


----------



## pompss

Quote:


> Originally Posted by *Syceo*
> 
> Yeah... you have been letting everyone know for at least a week how much you hate the 295X2.
> 
> zzzzzzzz
> 
> Pretty sure everyone in this owners club gets the message...


Funny thing is that I'm planning to get the ASUS ARES III.

I don't hate the R9 295X2. It's a love and hate thing...


----------



## Syceo

I might get myself a 980 and do a mini ITX build to use as a regular pc and then just game and render stuff on the big boy.


----------



## xer0h0ur

If I wanted to play the waiting game I would wait for the next generation of AMD cards which will again kill the 9xx series. I wanted a high performance three core setup using two cards at 16x each. This was the option on the table and I took it.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I wanted to play the waiting game I would wait for the next generation of AMD cards which will again kill the 9xx series. I wanted a high performance three core setup using two cards at 16x each. This was the option on the table and I took it.


AMD's new cards will most likely be facing big Maxwell, not the GTX 980. They're expected to be released somewhere around Q1 of next year. So the GTX 980 will be the card to get for the end of this year and the start of 2015.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I wanted to play the waiting game I would wait for the next generation of AMD cards which will again kill the 9xx series. I wanted a high performance three core setup using two cards at 16x each. This was the option on the table and I took it.


Going to totally have to agree with you on that one.

Once you get stuck in the waiting game, you forever find yourself waiting. You cannot go wrong with a 295X2 + 290X.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I wanted to play the waiting game I would wait for the next generation of AMD cards which will again kill the 9xx series. I wanted a high performance three core setup using two cards at 16x each. This was the option on the table and I took it.


Did you pick up that second card?


----------



## xer0h0ur

Yeah I am trading PC parts with a friend for his 290X. He was disappointed by the AMD drivers and bought two GTX 780 Ti's to replace it. I just checked the tracking number on the package and it shows as delivered so its waiting for me at home right now. Unfortunately another 5 hours and change before I get outta work today.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I am trading PC parts with a friend for his 290X. He was disappointed by the AMD drivers and bought two GTX 780 Ti's to replace it. I just checked the tracking number on the package and it shows as delivered so its waiting for me at home right now. Unfortunately another 5 hours and change before I get outta work today.


I feel you, friend...


----------



## xer0h0ur

Yeah, I am no fanboy of the green or red team. Frankly, I wish there was still a third competitor to push both green and red into releasing more powerful products at lower prices. AMD takes far too long to update drivers; I can't believe how long it's been since the last approved driver release. It's borderline pathetic. Nvidia price gouges out the you-know-what and often nerfs its products to extend the lifetime of a series with higher-clocked versions over time. It is what it is. Both have good and bad.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I am no fanboy of green or red team. Frankly I wish there was still a 3rd party competitor to push both green and red into releasing more powerful products at lower prices. AMD takes far too long to update drivers. I can't believe how long its been since the last approved driver release. Its borderline pathetic. Nvidia price gouges out the you know what and often nerfs its products to extend the lifetime of the series with higher clocked versions over time. It is what it is. Both have good and bad.


Me too, I'm not a fanboy of the green or red team.

The R9 290X is an amazing card. I had problems with the Kingpin and no problems with the 290X. But the 295X2 gave me so many problems that I almost hate it.


----------



## electro2u

Going to try my 295x2 on an x79 build and see if it fixes my crossfire issue with FinalFantasy XIV in windows 8.1.

RIVB + 4820k incoming!


----------



## Syceo

Anyone using Cat 14.8? And if so, any noticeable improvements?


----------



## xer0h0ur

I am completely confused. I can boot with the 295X2. I can boot with the 290X. However I can't boot with both cards connected to the PSU. A moment after powering up I get the 6 beep code which means video card failure. I mean its not like I re-installed the driver and benched or stress tested the 290X. I only know that it boots, shows video output and loads windows with the 295X2 unpowered. It can't possibly be that the PSU can't handle it, can it?


----------



## shadow85

Any reviews comparing SLI GTX 980 vs 295x2? I have seen one with SLI GTX 970 vs 295x2, and it does very well against it here:
http://www.techpowerup.com/mobile/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html

So 980 SLI should do better, but I can't find any reviews of 980 SLI vs the 295X2.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am completely confused. I can boot with the 295X2. I can boot with the 290X. However I can't boot with both cards connected to the PSU. A moment after powering up I get the 6 beep code which means video card failure. I mean its not like I re-installed the driver and benched or stress tested the 290X. I only know that it boots, shows video output and loads windows with the 295X2 unpowered. It can't possibly be that the PSU can't handle it, can it?


That's the only reason I can think of that might happen. It makes sense given the scenario. Lots of bad units from that particular model says Newegg... even though they somehow come out with 4/5 eggs but 20% of the reviews are 1 egg... shenanigans.


----------



## xer0h0ur

I just finished reinstalling the driver and using the 290X. I ran firestrike and firestrike extreme. Checked it with GPU-Z and did notice that the subvendor thingamabobber said ATI so I don't know if I should flash the BIOS on it. I mean this crap makes no sense. Both PCIe slots are clearly working since I didn't remove the 295X2 from the top slot. I just plugged in the 290X into the bottom slot.

How do you do the PSU trick to use a second PSU on the 290X for the sake of seeing if it is after all my PSU?


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> I just finished reinstalling the driver and using the 290X. I ran firestrike and firestrike extreme. Checked it with GPU-Z and did notice that the subvendor thingamabobber said ATI so I don't know if I should flash the BIOS on it. I mean this crap makes no sense. Both PCIe slots are clearly working since I didn't remove the 295X2 from the top slot. I just plugged in the 290X into the bottom slot.
> 
> How do you do the PSU trick to use a second PSU on the 290X for the sake of seeing if it is after all my PSU?


I wouldn't do it without an adapter; you CAN... but it requires splicing wires from one psu connector to the other and it's iffy on big PSUs.

http://www.frozencpu.com/products/13815/ele-933/Add2PSU_Multiple_Power_Supply_Adapter.html


----------



## ragulih

lol, all of my problems seem to have been software related to iRacing... so the card is running godlike! (At first we thought the VRM temps were getting too high.)

I fixed few things with my case airflow few days ago. Today I did Unigine Valley Benchmark, Direct3D11 1920x1080 8xAA fullscreen Extreme HD preset:

FPS: 115.8
Score: 4847
Min FPS: 35.5
Max FPS: 188.0

I was playing around with my OC settings just for the lulz when I couldn't get dual GPU working at first, and I forgot I'd left Power Limit at +50%, core at 1100 MHz, mem at 1625 MHz. Max temps: GPU1 67, GPU2 69. No artifacts.

Is it good? My friend scored 2400 with a factory-OC GTX 970.

Edit: I am doing this with stock cooling.


----------



## Syceo

Hmm, this is strange. Is anyone getting better Firestrike results at stock than with an overclock, or am I doing something wrong here?

OC @ power limit +50 , core clock 1018, mem clock 1625



Stock speeds :


----------



## electro2u

Pretty sure CoolMike did a lot of testing and found +20 power limit to be about the cutoff for usefulness, but the core clock is what will make your Fire Strike score go up. Memory overclocking isn't very useful in 3DMark for whatever reason.

I don't run with an overclock normally, but I remember getting 1150 on the cores stable with +35 mV. Huge difference in heat output there, though. I don't like seeing my cores over 60°C.


----------



## Syceo

Yeah, I've been running everything at stock aside from the CPU. Can't imagine why I would need any more power than what is here already. Just fancied giving it a go to see what all the hoo-ha is about.


----------



## electro2u

That's right, it was 1125 on the 295X2, not 1150, at +35. Then 1137 on the 290X core at +25, with no memory overclock.

So, yah... no idea what the headroom is. This was just playing around without adding much voltage.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> I just finished reinstalling the driver and using the 290X. I ran firestrike and firestrike extreme. Checked it with GPU-Z and did notice that the subvendor thingamabobber said ATI so I don't know if I should flash the BIOS on it. I mean this crap makes no sense. Both PCIe slots are clearly working since I didn't remove the 295X2 from the top slot. I just plugged in the 290X into the bottom slot.
> 
> How do you do the PSU trick to use a second PSU on the 290X for the sake of seeing if it is after all my PSU?


Simple: plug the second PSU into the wall and its 6/8-pins into the card, then jump the green wire on its 24-pin connector to any black (ground) wire, with the power switch on the back of the PSU off.

Turn on the PC and the PSU switch at the same time. (If your wires are all black, you want the PS_ON pin; you'll have to google a 24-pin diagram.)

You can break your hardware doing this, so make sure you know what you are doing.


----------



## boss9967

Question: monitoring my 295X2 with MSI Afterburner shows both GPUs running at different usage percentages, both going up and down all the time. Is that normal? When I ran an AMD quadfire before, all GPUs ran at almost the same usage.

I found this video on Google that looks like how my card runs. Is this normal?


----------



## electro2u

Quote:


> Originally Posted by *boss9967*
> 
> question ? I was monitoring my 295x2 with MSI afterburner show me both gpu's runing a diferents usage % , both gpus up and down all time , thats is normal ? I used a amd quadfire before all gpus ran almost at the same usage % .


Not working as it should. Sounds like what Pompss has had problems with. Driver/software issue.

Is this in all applications/games or just 1 in particular? Might try DDU (display driver uninstaller) and 14.4 catalyst again.


----------



## boss9967

Well, in some benchmarks like 3DMark and Heaven 4.0 both GPUs stay at 100% almost all the time, so I guess the problem is the crossfire profile in certain games.


----------



## Syceo

Well, that's me done. Swapped out the blue RAM sticks and threw in some Avexir Gold Blitz; looks much better.

While I was at it, I figured I'd give this a go.

So that's the delid done too, so I'm done...


----------



## ViRuS2k

oooosh

lol

http://i.imgur.com/p7MoLGV.jpg


----------



## xer0h0ur

Well, I feel like an idiot. All I had to do was swap the cards out. Wah, wah, waaaaaaaaaaaaaaaaaaah. Little did I know this motherboard's primary PCIe slot is the bottom slot while the top slot is the secondary. It needed to have the 295X2 in the primary slot to boot. Just got the 14.7 RC3 Cat installed and disabled ULPS again. Working on benching now.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, I feel like an idiot. All I had to do was swap the cards out. Wah, wah, waaaaaaaaaaaaaaaaaaah. Little did I know this motherboard's primary PCIe slot is the bottom slot while the top slot is the secondary. It needed to have the 295X2 in the primary slot to boot. Just got the 14.7 RC3 Cat installed and disabled ULPS again. Working on benching now.


LOL... sorry, but I just couldn't resist.

But the good news is you got it sorted.


----------



## electro2u

Dude
Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, I feel like an idiot. All I had to do was swap the cards out. Wah, wah, waaaaaaaaaaaaaaaaaaah. Little did I know this motherboard's primary PCIe slot is the bottom slot while the top slot is the secondary. It needed to have the 295X2 in the primary slot to boot. Just got the 14.7 RC3 Cat installed and disabled ULPS again. Working on benching now.


Oh dang... Now I feel stupid too. Mine would boot with the 295x2 on top though. It just would only show the 290x in bios... Maybe it's the same reason.


----------



## dcombs108

Just got my Diamond R9 295X2. No cables included, and I'm having trouble getting 4K out of my Asus PB287Q. I have tried mini DisplayPort to HDMI (non-active) and DVI to HDMI, and can only get 1080p. I have two different mini DP to full-size DP cables in the mail. Any suggestions?


----------



## dcombs108

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am completely confused. I can boot with the 295X2. I can boot with the 290X. However I can't boot with both cards connected to the PSU. A moment after powering up I get the 6 beep code which means video card failure. I mean its not like I re-installed the driver and benched or stress tested the 290X. I only know that it boots, shows video output and loads windows with the 295X2 unpowered. It can't possibly be that the PSU can't handle it, can it?


I have an EVGA SuperNOVA 1000 W. I booted with the 295 and a Windforce 290 and it didn't even kick out of Eco mode. I think total system power draw under load is only like 800-850 W. Perhaps it's your PCIe slot?
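As a rough sanity check on that 800-850 W figure, here's a worst-case budget. The board-power numbers are AMD's commonly cited specs (~500 W for the 295X2, ~275 W for a 290); the CPU/platform figure is purely a ballpark assumption:

```python
# Rough system power budget for 295X2 + 290 tri-fire on a 1000 W PSU.
# GPU board-power figures are AMD's spec numbers; cpu_rest_w is an assumption.

r9_295x2_w = 500   # AMD board power, R9 295X2
r9_290_w   = 275   # AMD board power, R9 290
cpu_rest_w = 125   # assumed: CPU, motherboard, drives, fans

total = r9_295x2_w + r9_290_w + cpu_rest_w
headroom = 1000 - total   # against a 1000 W unit
print(f"~{total} W worst case, ~{headroom} W headroom on a 1000 W PSU")
```

That lands just above the measured 800-850 W, which makes sense: games rarely pin every component at its rated maximum simultaneously.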


----------



## Syceo

Quote:


> Originally Posted by *dcombs108*
> 
> Just got my diamond r9 295....no cables included....having trouble getting 4k resolution out of my Asus pb287q...I have tried mini display port hdmi(non active) dvi to hdmi....can only get 1080p....I have 2 different mini dp to full size dp in the mail....any suggestions?


To get 4K out of this monitor at 60 Hz you have to use the DisplayPort cable that came with the monitor (this is what I did). Click the menu button, scroll down to Input Select, and choose the MiniDP input. Then go back to the main menu, scroll down to System Setup, select DisplayPort Stream, choose DP 1.2, and you're all set: 4K @ 60 Hz.


----------



## dcombs108

Quote:


> Originally Posted by *Syceo*
> 
> To get 4K out of this monitor at 60 Hz you have to use the DisplayPort cable that came with the monitor (this is what I did). Click the menu button, scroll down to Input Select, and choose the MiniDP input. Then go back to the main menu, scroll down to System Setup, select DisplayPort Stream, choose DP 1.2, and you're all set: 4K @ 60 Hz.


It is a full-size DisplayPort cable, and the 295 has only mini DP. Anyone get 4K from mini DP to HDMI?


----------



## Syceo

Quote:


> Originally Posted by *dcombs108*
> 
> Just got my diamond r9 295....no cables included....having trouble getting 4k resolution out of my Asus pb287q...I have tried mini display port hdmi(non active) dvi to hdmi....can only get 1080p....I have 2 different mini dp to full size dp in the mail....any suggestions?


Sorry mate, edit on the last message... use a full-size DisplayPort to mini DisplayPort cable.


----------



## Syceo

Quote:


> Originally Posted by *dcombs108*
> 
> It is a full-size DisplayPort cable, and the 295 has only mini DP. Anyone get 4K from mini DP to HDMI?


If you use mini DP to HDMI you won't get 60 Hz, only 30 Hz, and you can forget about enjoyable gaming because 30 Hz just won't cut it... so use the method above and you're good to go.
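A quick bandwidth check shows why: uncompressed 4K at 60 Hz needs more data rate than HDMI 1.4 can carry, but fits inside DisplayPort 1.2 (HBR2). A rough sketch (blanking overhead ignored, so real requirements are slightly higher):

```python
# Why DP 1.2 does 4K@60 but HDMI 1.4 tops out at 4K@30.
# Usable data rates after 8b/10b encoding:
#   HDMI 1.4 ~ 8.16 Gbit/s, DisplayPort 1.2 (HBR2, 4 lanes) ~ 17.28 Gbit/s.
# Blanking overhead is ignored here, so real requirements are a bit higher.

def pixel_rate_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for a given mode."""
    return width * height * hz * bits_per_pixel / 1e9

uhd_60 = pixel_rate_gbps(3840, 2160, 60)
uhd_30 = pixel_rate_gbps(3840, 2160, 30)

print(f"4K@60: {uhd_60:.1f} Gbit/s -> fits DP 1.2 (17.28), not HDMI 1.4 (8.16)")
print(f"4K@30: {uhd_30:.1f} Gbit/s -> fits HDMI 1.4")
```

So passive mini DP to HDMI adapters (which present an HDMI 1.4 signal) are stuck at 30 Hz at 4K no matter what the card can render.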


----------



## dcombs108

Bah....I wish there was Sunday delivery...thanks


----------



## ImperialOne

Quote:


> Originally Posted by *boss9967*
> 
> question ? I was monitoring my 295x2 with MSI afterburner show me both gpu's runing a diferents usage % , both gpus up and down all time , thats is normal ? I used a amd quadfire before all gpus ran almost at the same usage % .
> 
> on google found this video looks like my video card run ,is normal ?


So here is what to do:
1- In MSI Afterburner (you do have it, right?), disable ULPS. It's a box to check in the General tab. You may need to restart after this.

2- Open up Catalyst Control Center, go to Preferences, and make sure Enable System Tray Menu and Advanced View are both on.

3- Open the Performance tab in the left-side column of CCC. Click AMD CrossFireX and make sure Enable AMD CrossFireX is checked. Only Win8 users will get the further option of enabling CrossFireX for programs that have no profiles: SELECT THIS. Should you wish to disable crossfire for a specific game, create a profile under 3D Applications, add the game's exe, then disable crossfire and click Apply.
Apply changes.

4- Click the Gaming tab in the left-side column, then click 3D Applications. Verify that Frame Pacing is enabled. Click Apply and close CCC.

5- Right-click the CCC icon in your taskbar. Go to 1. AMD Radeon R9 295x2 >> AMD CrossFire Settings >> Show CrossFireX Status icon. Make sure it's checked.

Now, whenever a 3D application loads, if crossfire is working, a logo will appear in the upper right corner of your screen. All your GPUs should be running at the same usage percentage at each card's maximum boost clock.
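As a footnote to step 1: if you'd rather not rely on Afterburner, ULPS can also be switched off directly in the registry. A hedged sketch for Windows only (the numbered subkeys under the display-adapter class GUID vary per system, so query first, repeat the `reg add` for each GPU entry you find, and reboot afterwards):

```shell
:: Find which display-adapter subkeys (0000, 0001, ...) carry an EnableUlps value
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}" /s /v EnableUlps

:: Set EnableUlps to 0 on a given subkey; repeat for each GPU entry found above
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps /t REG_DWORD /d 0 /f
```

This is the same setting Afterburner's checkbox flips; it just survives driver reinstalls less reliably, since a fresh driver install recreates the keys.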


----------



## Syceo

Quote:


> Originally Posted by *ImperialOne*
> 
> So here is what to do:
> 1- in MSI Afterburner (you do have it, right?), disable ULPS. It's a box to checkmate in the General tab. You may need to restart after this.
> 
> 2- open up Catalyst Control Center, under preferences and make sure Enable System Tray Menu and Advanced View.
> 
> 3- Open the Performance tab in the left-side column of CCC. Click AMD CrossFireX and make sure Enable AMD CrossFireX is checked. Only Win8 users will get the further option of enabling CrossFireX for programs that have no profiles: SELECT THIS. Should you wish to disable crossfire for a specific game, create a profile under 3D Applications, add the game's exe, then disable crossfire and click Apply.
> Apply changes.
> 
> 4- Click the Gaming tab in the left-aided column then click 3D Applications. Verify that Frame Pacing is enabled. Click apply and X out. (Close CCC).
> 
> 5- Go to Right Click the CCC icon in your taskbar. Go to 1.AMD Radeon R9 295x2 >> AMD Crossfire Settings>>>Show CrossfireX Status icon. Make sure it's checked.
> 
> Now, whenever a 3D application loads, if Crossfire is working, a logo will appear in the upper right corner of your screen. All your GPUs should be running at same % rate at each cards's maximum boost and clock.


I just tried this myself on Tomb Raider, and still all 3 GPUs are fluctuating in usage, the 295X2 up and down and the same goes for the 290X. I'm sure I should be able to do 60 fps with this rig, but I'm getting 45 fps on average. Anyone got a fix for this?


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> I just tried this myself on Tomb Raider, and still all 3 GPUs are fluctuating in usage, the 295X2 up and down and the same goes for the 290X. I'm sure I should be able to do 60 fps with this rig, but I'm getting 45 fps on average. Anyone got a fix for this?


Turn PCIE link power saving off in windows power settings?


----------



## Syceo

Hi mate, are you asking me if I have done that, or are you suggesting it should be done to remedy this issue?

Cheers


----------



## xer0h0ur

I just ran Tomb Raider to check tri-fire performance. I see what you're saying about the GPU usage not staying at 100%. The main thing I watch is that the clocks remain at full speed; core usage is rarely pegged, but my framerates are high since I don't yet have my 4K monitor. I am still confined to 1600x1200 with maxed-out settings. What resolution are you playing at?


----------



## Syceo

I'm playing at 4k at the moment. I lowered the textures and graphic settings. That seems to have raised the fps to 60. But I would like it consistent.


----------



## Syceo

I'm guessing high/medium settings on 4k would be the equivalent of ultra/high respectively on 1440 so I shouldn't really be complaining


----------



## xer0h0ur

I have only created a crossfire profile for one game before since its in Alpha (DayZ) and its literally terribly coded right now. Basically no optimizations yet for crossfire or sli. I never really did manage to get higher usage out of the 2nd or 3rd GPUs for that matter. However I did notice that once you create the profile if you select it and scroll to the bottom there is an option called "AMD CrossFireX Mode." Have you tried out the different options to see if any of them give you a result?


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> I'm guessing high/medium settings on 4k would be the equivalent of ultra/high respectively on 1440 so I shouldn't really be complaining


I just have no frame of reference for what 4K should be doing.
Funny though, I was testing Tomb Raider earlier on a single 1440p/120Hz display: the system runs at 120fps without needing full usage at all, so my GPUs all sit in the 40s.

I don't have any issues outside of DX9, but my hunch is you should easily be able to do 60fps in that game, so something is wrong.

I would suggest turning PCIe link power management off from within Windows.

I have found that the BIOS settings for PCIe ASPM have an effect on drivers as well.


----------



## Syceo

Yeah, I'll give that a shot. On a side note, what do you think of a 3-way EVGA Hydro Copper 980 SLI for 4K? Or would it be better to hold out for the rumoured 390X and trifire that with the 295x2? I sense AMD is hard at work preparing to once again drop the bomb in response to the 980.


----------



## Syceo

Oh, hold on just a minute. Just had a light bulb moment. Is it possible that my mildly overclocked 4770K @ 4.3 is bottlenecking and holding back the GPUs? Should I consider a refresh? I can't get any more than 4.4 out of this CPU, believe me I've tried.


----------



## xer0h0ur

My friend that is an Nvidia die hard fan got drunk on release night because he was pissed at the information on the 980 lol. He said 256 bit memory bus and a lower transistor count than the 780 Ti's he has makes the 980 pointless. I just figured that Nvidia was nerfing the hell out of Maxwell on purpose to later release a stronger card when AMD ups the ante.


----------



## xer0h0ur

Quote:


> Originally Posted by *Syceo*
> 
> Oh hold on just a minute. Just had a light bulb moment. Is it possible that my mildly overclocked 4770k @ 4.3 could be bottlenecking, hence holding back the gpu's. Should I consider a refresh? I can't get any more than 4.4 on this cpu, believe me I've tried


Wouldn't it be more of a product of your 4770K only having 16 PCIe lanes so your 295X2 and your 290X are each running at 8x?
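For rough context on the x8 vs x16 question, a back-of-the-envelope bandwidth table (per-lane figures are approximate, assuming the usual 8b/10b and 128b/130b encoding overheads) looks like this:

```python
# Approximate usable bandwidth per PCIe lane (GB/s), after encoding overhead:
# gen2 uses 8b/10b (~0.5 GB/s/lane), gen3 uses 128b/130b (~0.985 GB/s/lane).
# Figures are rough, for comparison only.
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(f"x16 gen3: ~{link_bandwidth('3.0', 16):.2f} GB/s")
print(f"x8  gen3: ~{link_bandwidth('3.0', 8):.2f} GB/s")
```

So dropping from x16 to x8 gen3 halves the link to roughly 8 GB/s per card, which is usually still plenty for gaming but is the number people argue about with 16-lane CPUs.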


----------



## cennis

I am encountering some issues with the Firestrike EXTREME preset.

In regular Firestrike, my card performs as normal, around a 20000 graphics score.

In Extreme, I see severe drops in frame rate and utilization in Graphics Test 2, giving me a 9000 GPU score.

3DMark 11 seems fine too.

The lights in my room flicker during that test as well.

I've tried 2 power supplies, an AX1200 and an XFX 1250.


----------



## xer0h0ur

If your lights are flickering on high power draw, which is what Firestrike Extreme causes, then it sounds like you're drawing too much power from a single breaker on your circuit.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Wouldn't it be more of a product of your 4770K only having 16 PCIe lanes so your 295X2 and your 290X are each running at 8x?


I had thought initially that it was due to the mobo only having one x16 PCIe slot, but I was reassured this would not be an issue because of the PLX chip.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Yeah I'll give that a shot. On a side note, what do you think of a 3 way evga hydro copper 980 Sli for 4k, or would it be better to hold out for the rumoured 390x and trifire that with the 295x2. I sense AMD is hard at work preparing to once again drop the bomb in responce to the 980


I'm gonna be totally honest with you. I went to the red team because Nvidia cards don't do dithering in Windows. This usually doesn't matter, because monitors do. But my monitor is a Korean OC model with no brain in it, and when you use a color profile to make it look right (I used an X-rite id3 pro colorimeter) with Nvidia cards, you end up with color banding, which is fixed by using AMD cards instead.

For people with normal monitors this isn't an issue. And if I had a "normal" monitor that knew what to do with color bands, I would be on Nvidia cards.

Power savings mean very little to me, but power = heat so I'd probably get the 980s for 4k if you have room for all those cards.

But I love my setup the way it is. I love my monitor to death and I'm not interested in 4k at all until we can push 120FPS in 4k, which will be years from now (if I even live to see it).

Pixel persistence, high refresh rate, and high framerate are key to enjoying my gaming experience. If I were to use a 4k display it would really probably only be for 4k movies.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> I had thought initially that it was due to the mobo only having 1x 16 lane pcie, but I was reassured that this would not be an issue because of the plx chip


I have a hunch some people's PLX chips are overheating. Did you put a thermal pad on the PLX chip?


----------



## Mega Man

Quote:


> Originally Posted by *cennis*
> 
> I am encountering some issues with firestrike EXTREME preset.
> 
> on firestrike, my card performs as normal, around 20000 graphics score.
> 
> in extreme , I can see severe dropping of frame rates and utilization in Graphics test 2, giving me 9000 gpu score.
> 
> 3dmark11 seems fine too._*
> 
> Lights in my room flicker on that test too.*_
> 
> tried 2 power supplies, ax1200 and xfx 1250


OK, that is weird and bad.
Have you thought about heat? Extreme heats up the GPUs more.

Power delivery? (i.e. make a Rigbuilder entry for yourself, see my sig.)
You told us your PSU but nothing else.
And you know you will get a lower score on Extreme, right?


----------



## xer0h0ur

After the first couple of install attempts of slapping the EK block on the 295X2 I noticed that using a 1mm pad was too thick and caused pcb warping so instead of using a 0.5mm pad I just elected to use PK-3 on the PLX chip.


----------



## psyclist

Quote:


> Originally Posted by *Syceo*
> 
> I just tried this myself on tomb raider, and still all 3 gpus are fluctuating with the usage, 295x2 up and down and the same goes for the 290x. I should be able to do 60 fps im sure of it with this rig but im getting 45fps on average , anyone goit a fix for this?


I'd wipe the drivers and start over, perhaps. My two 290Xs average 54.5fps at 4K at stock speeds in Tomb Raider; you should be pegged at 60 with your setup.


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> If your lights are flickering in your room on high power draw, which is what Firestrike Extreme does, then it sounds more like you're drawing too much power from a single breaker on your circuit?


Quote:


> Originally Posted by *Mega Man*
> 
> ok that is weird and bad
> have you thought about your heat, extreme heats up the gpus more,
> 
> power delivery? ( IE make a rigbuilder for yourself, see my sig )
> you told us your psu but nothing else.
> and you know you will get less score on extreme right ?


Thanks for the replies. I've tried different outlets, and I still see utilization between 30~60% on that Extreme GT2.

4770K, Z87 Impact, 2400MHz RAM, watercooled with an Aquacomputer block. Temps are in the low 50s.


----------



## electro2u

If your lights are flickering, I'd quit messing with that kind of power draw and take a look at the breaker feeding those outlets. It should be at least a 20A breaker; 15A won't cut it. The trifire people are right up against needing a 30A breaker, and a 1500W UPS battery backup wants a 30A circuit.
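As a rough sanity check (assuming a 120V circuit and the common 80% continuous-load derating; actual electrical work belongs to an electrician), the continuous wattage each breaker size can feed works out like this:

```python
# Continuous-load capacity per breaker size, assuming 120 V and the 80%
# continuous-load derating. Illustrative only; consult an electrician.
def max_continuous_watts(breaker_amps: float, volts: float = 120.0,
                         derate: float = 0.8) -> float:
    """Watts a breaker can supply continuously under the 80% rule."""
    return breaker_amps * volts * derate

for amps in (15, 20, 30):
    print(f"{amps} A breaker -> ~{max_continuous_watts(amps):.0f} W continuous")
```

A 15A circuit tops out around 1440W continuous, and that budget also covers the lights and everything else on the same breaker, which is why a trifire rig pulling well over a kilowatt can make the lights flicker.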


----------



## cennis

Quote:


> Originally Posted by *electro2u*
> 
> if your lights are flickering, I'd quit messing with that kind of power draw and take a look at the breaker you're feeding those outlets from. It should be at least a 20Amp breaker. 15A wont cut it. the trifire people are right up against needing a 30 amp breaker. A 1500Watt UPS battery backup system wants a 30Amp circuit breaker.


i live in this apartment building and ive tried both plugs available to me, what can i do?


----------



## electro2u

Quote:


> Originally Posted by *cennis*
> 
> i live in this apartment building and ive tried both plugs available to me, what can i do?


It's totally reasonable to ask management about an upgrade to the electrical system. I'd just take it easy for a bit and avoid settings labeled "EXTREME" XD

But yah, the power draw on these is prohibitive.


----------



## cennis

Quote:


> Originally Posted by *cennis*
> 
> i live in this apartment building and ive tried both plugs available to me, what can i do?


Got my issue sorted. The light still flickers, but I was on Firestrike 1.0, which didn't work well.


----------



## xer0h0ur

Both runs were at 1018MHz GPU 1500MHz vRAM


----------



## cennis

Quote:


> Originally Posted by *xer0h0ur*
> 
> 
> 
> 
> Both runs were at 1018MHz GPU 1500MHz vRAM


Ah, I would love to compare benchmarks, but I am only running the 295x2. Since you have the professional version, could you change the number of GPUs on the "Help" page?

http://www.3dmark.com/3dm/4118697

heres mine, system scanner didnt work correctly but im running at 1175 1625


----------



## xer0h0ur

I am running a tri-fired setup but its a bit ghetto rigged right now, by my standards at least. Since I only have an EK block on the 295X2 and I have to have that card in the bottom PCIe slot then the 290X is in the top slot and there is literally no clearance between the cards. The damn thing is 1mm away from the backplate of the 295X2. I am going to take off the plastic cover to basically give it a little bit of breathing room until I get around to extending the loop and dropping an EK block on the 290X. System went from relatively silent to loud again with this thing lol.


----------



## electro2u

I get a very similar score on Extreme to xer0h0ur, but on regular Firestrike he scores much better. When I overclock my cards I can just barely beat his stock cards. I guess that shows the difference between X79 and Z97 PCIe lane restrictions?

Also, Firestrike Extreme is insane on power load. Overclocking my 3 GPUs to ~1180MHz core and 1500MHz RAM, +100mV, +50 power limit, I was fine until the droid sword fight, where I experienced a shutdown; tripped OCP, I assume, and down she went. First power-related crash I've had since I got the 295x2.


----------



## Syceo

OK, so I'm starting to think my mobo is not up to the task. It has two PCIe 3.0 x16 slots (x8/x8 when shared) and one PCIe 2.0 slot, with no mention of a PLX chip. So my thought is: if I'm effectively running 3 cards with at most x8 for the 295x2 and x8 for the 290x (in the absence of a PLX chip), surely this has to be a bandwidth inhibitor, or am I just babbling on? Would it be worthwhile grabbing the MSI Z97 XPower AC, as it also has a delid guard?

Any thoughts guys? As always, thanks in advance.


----------



## CaptainJakes

Here with my bench and to verify my ownership 
http://www.3dmark.com/fs/2729819


----------



## CaptainJakes

Hey guys, is anybody having a problem with a memory leak in Mantle while playing BF4?


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> OK so im starting to think my mobo is not up to task, its has 2x16 PCIE 3.0, (8x8) when shared and 1x 2.0 PCIE with no mention of a PLX chip. So my thoughts are ... if im running effectively 3 cards with only the possibility of at maximum 8x8 lanes on the 295x2 and 8x8 lanes on the 290x (with the absence of a plx chip) this surely has to be a bandwidth inhibitor or am i just babbling on ? would it be worth while me grabing the MSI Z97 XPower AC as its also has a delid guard.
> 
> Any thoughts guys, as as always thanks in advance


There is actually a PLX chip on the 295x2 PCB itself, which is pretty neat. I originally thought I was going to need a board with a PLX chip to run trifire on Z97, but that isn't the case. I actually bought the board you are looking at; it's a fantastic board now that it has a proper BIOS written for it, but it did not function any better than the board I'm using now for this setup.

The PLX chip IS the bottleneck on the 295x2, as I understand it. Generally, a motherboard with a PLX chip is not as desirable for tri/quad GPU setups as the X79/X99 platforms with their 40 PCIe lanes (watch out for the 5820K; it's been seriously nerfed and only has 28 lanes, iirc). Unfortunately, with the 295x2 there is no way around it so far as I know. Even with the 295x2 by itself in an x16 PCIe 3.0 slot, the PLX chip still has to function, presenting the slot as two PCIe 3.0 x16 links, and there will still be added latency from the way it relays data between the two GPUs and the CPU, so there really isn't anywhere to go on the Z97 platform for us. Wait until I get a chance to set up my X79 system this week and I will be able to tell you if it makes any difference for trifire; for a single 295x2 in crossfire, it is completely irrelevant how many lanes the CPU has available. I'm only testing X79 because Axiumone says FFXIV works properly in Windows 8.1 on X79, and that's what I want to get working right now. I'm stubborn about stuff like that. But if it does turn out to work, that means there's some DX9 issue involving the platform chipsets, Windows 8.1 (my preferable OS) and the 295x2, because on Z97 there are definitely problems with certain games... like iRacing, apparently.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> There is actually a PLX chip on the 295x2 PCB itself, which is pretty neat. I originally thought I was going to need a board with a PLX chip to run trifire on z97 but it isn't the case. I actually bought that board you are looking at; it's a fantastic board now that it has a proper BIOS written for it, but it did not function any better than the board I'm using now for this setup.
> 
> The PLX chip IS the bottleneck on the 295x2 , as I understand it. Generally, a motherboard with a PLX chip is not as desirable for tri/quad GPU setups as the x79/99 platforms with their 40PCIE lanes available (watch out for the 5820k, it's been seriously nerfed and only has 28 lanes iirc). And unfortunately, with the 295x2, there is no way to get around it so far as I know. Even if you have the 295x2 by itself in a 16x PCIE 3.0 slot, the PLX chip will still need to function, and it will make the slot perform as 2 PCIE3.0 x16 slots. There will still be increased latency because of the PLX chip and the way it functions sending info from the 2 GPUs to the CPU, so there really isn't anywhere to go with the z97 platform for us. Wait until I get a chance to set up my X79 system this week and I will be able to tell you if it makes any difference for trifire--for a single 295x2 in crossfire, it is completely irrelevant how many lanes the CPU has available. I'm only testing x79 because Axiumone says FFXIV works properly in Windows 8.1 on X79 and that's what I want to get working right now. I'm stubborn about stuff like that. But if it does turn out to work, that means there's some DX9 stuff going on with the platform chipsets, windows 8.1 (preferrable OS imo) and the 295x2, because on z97 there are definitely some issues with certain games... like iracing apparently.


Cheers electro...

You and your FFXIV, I have a feeling you're not gonna let that go lol....

In any case, I just squeezed 4.5 out of this ****ty 4770K; it hasn't blue-screened yet, and I am now getting a solid 60fps on ultra in Tomb Raider.

I had monitoring up via Afterburner, and it showed GPU 3 as inactive on screen but active in Afterburner, which was strange (I have ULPS disabled). I think the cards will not, under any circumstance, operate at 100% unless there is an absolute need, e.g. BF4 + Mantle, so I can accept that. As long as I am getting 60fps at 4K I'm happy (since I bloody paid for it). Having loved 1440p, I am finding it hard to even consider going back, and this trifire setup (which I love, even with all these niggly issues) truly does look amazing at 4K.

I am, however, very happy that I opted for a loop, because if I encounter any game-changing issues with this setup, I'll be straight back to the green team for a 3-way 980 Hydro Copper setup. I'll wait to see your results on the X79 before I jump in too early and change board and CPU.


----------



## xer0h0ur

LOL, well, taking off the plastic cover only expedited overheating while gaming; it eliminates much of the airflow across the aluminum fins. Basically it's only cooler at idle, not at load, with the cover off. Oh well. Going to have to keep the cover on until I have the money for the block, PCI pass-thru adapter, radiator and external rad mount.


----------



## estu87

Hi, I just purchased a 295x2 and installed it with AMD Catalyst 14.8 WHQL (14.201.1008, August 12) from Guru3D, and I can't find the crossfire option in CCC. The hardware info section shows the second adapter is disabled, and crossfire shows as disabled in TPU GPU-Z. Is this normal?


----------



## Syceo

Quote:


> Originally Posted by *estu87*
> 
> hi, i just purchased a 295x2 and installed it with AMD Catalyst 14.8 WHQL (14.201.1008 August 12) from 3dguru. and i can't find crossfire option in CCC, and hardware info section show the second adapter is disabled. also crossfire is disabled in tpu gpuz. is this normal?


It's under the Gaming tab in CCC, on the left.


----------



## dcombs108

Quote:


> Originally Posted by *Syceo*
> 
> Its under the gaming tab in CCC on the left


or performance....


----------



## Syceo

Quote:


> Originally Posted by *dcombs108*
> 
> or performance....


either one will do ...


----------



## dcombs108

Quote:


> Originally Posted by *Syceo*
> 
> either one will do ...


Sorry, I wasn't correcting... just offering other options.


----------



## estu87

thanks for your response but i can't find the option, here is my screenshot


Spoiler: ss









Maybe there's a problem on my end? I'm using Win 7 64-bit, 8GB RAM, an Intel i5, and an Asus P8Z68V LE mobo.

Another SS from the Performance tab:


----------



## Syceo

@electro2u

Thanks for the heads up about the sound card backplate; it goes nicely with the cards.




.... let me know when you have run the X79 set up with the trifire setup, as i have my finger on the trigger ready to swap out this Z87-pro


----------



## Syceo

Quote:


> Originally Posted by *dcombs108*
> 
> sorry I wasn't correcting.... Just other options


No worries at all mate, we are all here to help one another so its cool


----------



## Syceo

Quote:


> Originally Posted by *estu87*
> 
> thanks for your response but i can't find the option, here is my screenshot
> 
> 
> Spoiler: ss
> 
> 
> 
> 
> 
> 
> 
> 
> 
> maybe there is problem from my end?, i use win 7 64, 8gb ram, intel i5, asus P8Z68V LE mobo.
> 
> another ss from performance tab


PSU?


----------



## estu87

Quote:


> Originally Posted by *Syceo*
> 
> PSU?


a corsair tx750, not sufficient?


----------



## Syceo

Quote:


> Originally Posted by *estu87*
> 
> a corsair tx750, not sufficient?


The first thing I would do is a clean install of CCC 14.8.

Use this: http://www.guru3d.com/files-details/display-driver-uninstaller-download.html to remove all previous traces, then reinstall the drivers. Make sure you're putting the 295x2 in the x16 slot on your motherboard, and restart afterwards.


----------



## Syceo

Quote:


> Originally Posted by *estu87*
> 
> a corsair tx750, not sufficient?


You're going to need a minimum 1000W PSU to get the best results. The 295x2 has a TDP of 500W, so a 750W unit leaves very little for your CPU and any other components you might have.
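A back-of-the-envelope budget (component wattages below are illustrative estimates, not measurements) shows why 750W is tight:

```python
# Rough PSU budget for a 295x2 system; wattages are illustrative estimates.
load_estimates = {
    "R9 295X2 (TDP)": 500,
    "overclocked CPU (est.)": 150,
    "board/RAM/drives/fans (est.)": 100,
}
total = sum(load_estimates.values())
print(f"Estimated draw: ~{total} W")

for psu_watts in (750, 1000, 1200):
    headroom = psu_watts - total
    print(f"{psu_watts} W PSU -> ~{headroom} W headroom")
```

Under these assumptions the estimated draw lands right at 750W, so a TX750 has essentially zero headroom, while a 1000W unit keeps roughly 250W in reserve for transients and overclocking.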


----------



## electro2u

I've had issues with 14.8. Using 14.4 right now.

@Syceo, card looks great man! Cold Zero is awesome.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> I've had issues with 14.8. Using 14.4 right now.
> 
> @Syceo, card looks great man! Cold Zero is awesome.


Yeah, makes a whole lot of difference. Cold zero got it to me in 2 days , pretty damn quick .


----------



## shadow85

What is AMD releasing on the 25th? A new Radeon card, or something else?


----------



## electro2u

I think they are going to release an AIO-cooled single-GPU 295X, and maybe even a 300X to commemorate the 30th anniversary. Wild guesses, though. I would expect a counterpunch to the 970/980 release, but it might not be ready yet; no idea.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> I think they are going to release an AIO cooled single GPU 295x and maybe even a 300x to commemorate 30th anniversary. Wild guesses though. Would expect a counter punch to 970/980 release but it might not be ready yet, no idea.


Totally agree, we could be looking at a response to the 900s. If they can outshine the 980, things may get interesting.


----------



## estu87

Quote:


> Originally Posted by *Syceo*
> 
> Your going to need a minimum of 1000W PSU to get the best results. The 295x2 had a TDP of 500W , so your not leaving anything else for your CPU and any other components you might have .


Yeah, a PSU upgrade is already on the buy list, but I need some time to save money after burning my wallet on this 295.









So I uninstalled again and used CCC 14.4, switched the card to PCIE16_1, and crossfire works now. There's no specific menu in CCC except in the 3D application settings, but that's not a big deal since I won't OC it anyway. Thanks for your help.


----------



## Syceo

Quote:


> Originally Posted by *estu87*
> 
> yea psu upgrade already in the buy list but need some time saving money after i burnt my wallet buying this 295
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so i uninstalled again and use CCC 14.4, switched the card to pcie 16_1, and crossfire work now but no specific menu on ccc except in 3d app setting but not a big deal since i won't oc it anyway. thx for your help


You're welcome, no worries....

Also make sure you disable ULPS in MSI Afterburner and enable crossfire... (it should now appear under the Gaming tab).
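For reference, Afterburner's "Disable ULPS" checkbox just flips a registry value. A manual sketch (the class-key path and EnableUlps value name are from memory and can vary by driver version, so treat this as an assumption and verify in regedit, with a backup, before relying on it):

```shell
:: Hedged sketch: ULPS is controlled by the EnableUlps DWORD under the
:: display-adapter class keys. {4d36e968-...} is the standard display adapter
:: class GUID; the 0000/0001/... subkey numbering depends on your install.
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps /t REG_DWORD /d 0 /f
```

Letting Afterburner do it is the safer route; the snippet is only useful for checking whether the setting actually stuck after a driver reinstall.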


----------



## Mega Man

amd and all other big companies need to stop feeding that darn patent troll company


----------



## Syceo

Has anyone here tried a 295x2 or trifire with an Asus ROG Swift PG278Q, and if so what are your thoughts, thinking of getting one as a second screen just to game off, and use the 4k for editing and work.


----------



## Velict

Okay, seriously. Every build I see uses x79 or x99 for quadfire r9 295x2's. Is there an sort of bottleneck running at x8 for each physical card? IE: z97 with 4690k.


----------



## doctakedooty

Quote:


> Originally Posted by *Velict*
> 
> Okay, seriously. Every build I see uses x79 or x99 for quadfire r9 295x2's. Is there an sort of bottleneck running at x8 for each physical card? IE: z97 with 4690k.


Not really. The reason you see everyone doing X79 or X99 is so they can fully utilize the x16 PCIe slots, and if you were dropping $3 grand back when the price was higher, you could probably afford another grand for an X79 CPU and motherboard. Will you notice a difference between the two? Only in that some FPS won't be as high as on X79/X99. That said, you will probably need a good CPU overclock to avoid a CPU bottleneck. With my four 780 Tis at around 1400 core on a 4930K, I was hitting a CPU bottleneck even at 4.7GHz.


----------



## Mega Man

I also use AMD for quadfire, so no.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Has anyone here tried a 295x2 or trifire with an Asus ROG Swift PG278Q, and if so what are your thoughts, thinking of getting one as a second screen just to game off, and use the 4k for editing and work.


I recently bought this monitor to use with Trifire 295x2/290x and it is FREAKING AMAZING.
Specs:
Asus Maximus Hero VII
4790K stock and Undervolted (.997 vcore)
16GB GSkill Trident @ 1866 8-8-9-24 2N
295X2/290X Trifire @ 1060 Core/1325 Mem (Stock volts, +50 Power Limit)

All custom water cooled

The monitor is super crisp and very fluid with the setup I have (+70FPS in all games maxed out)
It's expensive, but I believe it's worth it.

Only thing that bugs me is MSI AB 4.0.0 doesn't load my overclocks at boot even though it's a saved profile. Only 1 card gets the overclock, then I have to manually apply my other profile. Any ideas why that happens, even with the cards "synced"?


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> I recently bought this monitor to use with Trifire 295x2/290x and it is FREAKING AMAZING.
> Specs:
> Asus Maximus Hero VII
> 4790K stock and Undervolted (.997 vcore)
> 16GB GSkill Trident @ 1866 8-8-9-24 2N
> 295X2/290X Trifire @ 1060 Core/1325 Mem (Stock volts, +50 Power Limit)
> 
> All custom water cooled
> 
> The monitor is super crisp and very fluid with the setup I have (+70FPS in all games maxed out)
> It's expensive, but I believe it's worth it.
> 
> Only thing that bugs me is MSI AB 4.0.0 doesn't load my overclocks when it boots up even though it's a saved profile. Only 1 card gets the overclock then I have to manually apply my other profile. Any ideas why that happens? Even if the cards are "synced"?


Great, mine will be here tomorrow morning. I'm a little concerned, though, that you're only getting 70fps at 1440p when I'm getting 60fps at 4K?


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Great, mine will be here tomorrow morning, im a little concerned though that your only getting 70fps at 1440 since im getting 60fps at 4k ?????


It depends on the game.
I say 70+ because that's the lowest I've ever seen my FPS go (Crysis 3, 8x MSAA).
Tomb Raider 2013 nets me a 100+ FPS average with 2x SSAA and 80+ average with 4x SSAA.
Heaven maxed out at 1440p is a 94 FPS average.
3DMark Firestrike Extreme gets me a score of 11200-ish.

Also, your CPU is clocked higher than mine. I disabled turbo, so I'm only at 4GHz.

BTW, what's your load wattage?
I'm pulling 950W peak system load while gaming in all the games I've played.


----------



## Syceo

I haven't measured the power usage yet. Have you experienced any stutter or tearing with your setup? My plan is to see how things go with the Asus ROG, and if it's not up to par with the trifire configuration, I may consider a pair of 980s. I'm unsure, though, as I absolutely love this trifire setup.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> I haven't measures the power usage yet. Have you experienced any stutter or tearing with you setup? What I'm planning on doing is seeing how things go with the Asus Rog, and if it's not uptopar with trifire configuration, then I may consider a pair of 980s, but I'm unsure as I absolutely love this trifire setup


I RARELY see stuttering and I get no tearing at all. Super happy with this build.
BTW, I'm the guy who talked to you on your YouTube video about the pump you're running. Freaking small world LOL


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> I RARELY see stuttering and I get no tearing at all. Super happy with this build.
> BTW, I'm the guy that talked to you on your youtube video about the pump you're running. Freaking small world LOL


Hahah, well indeed it is lol. How are you, mate?

Tbh, I thought not having G-Sync would be a disadvantage with the Asus ROG, given that I'm using an AMD setup. How are you finding the overall performance (visually), given that it's a TN panel?


----------



## stxe34

Anyone know when the next driver release will be? The latest official is the 14.7 beta, although 14.8 is floating around, just not from AMD's official website.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Hahah, well indeed it is lol how are you mate ?
> 
> Tbh i thought not having Gsync would be a disadvantage with the Asus Rog given that im using an amd setup . How are you finding the overall performance (visually) given that its a TN ?


I'm good. Doing something different by undervolting things to get lower temps; I only have 4x 120mm radiators (1 per block) and I don't want to push them too hard.
Honestly, the cards do really well with the Swift. G-Sync isn't needed; it's super smooth. I wouldn't worry about it.
To me, G-Sync is only for those running 1 card at high resolutions. Since we have multi-card setups, we have a much lower chance of getting tearing. I have yet to see tearing while I play.


----------



## Syceo

Great...
Quote:


> Originally Posted by *ljreyl*
> 
> I'm good. Doing something different by undervolting things to get lower temps. i only have 4x 120mm radiators 91 per block) and I don't want to push them too hard.
> Honestly, the cards to really well with the Swift. g-Sync isn't needed. it's super smooth. I wouldn't worry about it.
> To me, gSync is only for those who have 1 card running at high resolutions. Since we have multicard, we have a much lower chance of getting tearing. I have yet to see tearing while I play.


Great, im looking forward to some decent frames...


----------



## dcombs108

Does anyone have a recommendation for a good case for the R9 295x2? It's the only AIO I have; I prefer air cooling otherwise. And it's huge... a Phanteks something-or-other, 7-ish inches tall. I have loads of brand-new red LED 120mm fans.


----------



## dcombs108

Double post sorry


----------



## Elmy

My latest BF4 video 5400X1920 using 2 Club3D 295X2's


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> I'm good. Doing something different by undervolting things to get lower temps. i only have 4x 120mm radiators 91 per block) and I don't want to push them too hard.
> Honestly, the cards to really well with the Swift. g-Sync isn't needed. it's super smooth. I wouldn't worry about it.
> To me, gSync is only for those who have 1 card running at high resolutions. Since we have multicard, we have a much lower chance of getting tearing. I have yet to see tearing while I play.


I must admit, you were quite right about this monitor. I think I'll use it as the main one and have the 4K on the side for work. Jesus, those frames are insane (for me it's insane, having never used anything higher than a 60Hz monitor).


----------



## shadow85

Quote:


> Originally Posted by *Syceo*
> 
> I must admit, you were quite right about this monitor. Think ill use it as the main one and have the 4K for work on the side, Jesus those frames are insane (for me its insane having never used anything higher than a 60hz monitor)


What 4k monitor is that?


----------



## Syceo

Quote:


> Originally Posted by *shadow85*
> 
> What 4k monitor is that?


The one on the left is a Asus PB287Q 4K and the one on the right is a Asus ROG 144HZ PG278Q


----------



## yifeng3007

I finally decided to join the club! Count me in, please









(two build photos attached)


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> I must admit, you were quite right about this monitor. Think ill use it as the main one and have the 4K for work on the side, Jesus those frames are insane (for me its insane having never used anything higher than a 60hz monitor)


Glad you're enjoying the monitor. After a couple hours of using it, go to TFT Central and use their calibration settings. They work pretty well for me.
Btw, when running a game maxed out, what are your load temps for your GPUs and CPU?
I'm at 68-71°C for my CPU and 60-62°C for my GPUs when playing Crysis 3 maxed out. My GPUs are clocked at 1069/1375, stock volts, 0 power limit.
CPU is a 4790K at 4.4GHz @ 1.115V, LLC Level 1.


----------



## Syceo

At the moment I'm just playing FIFA 15. Load is 52°C across the GPUs and the CPU is at 38°C. I'll test out Tomb Raider or a more demanding game and let you know.
Quote:


> Originally Posted by *ljreyl*
> 
> Glad you're enjoying the monitor. After a couple hours of using it, go to tftcentral and use their calibration settings. They works pretty good for me.
> Btw, when running a game maxed out, what's your load temps for your gpus and cpu?
> I'm at 68-71c for my cpu and 60-62c for my gpu when playing Crysis 3 maxed out. My GPUs are clocked at 1069/1375 stock volts, 0 power limit.
> Cpu is 4790k 4.4ghz @ 1.115v, LLC Level 1


----------



## ImperialOne

I have a 4770K at 4.4GHz @ 1.28V; CPU temp under Crysis 3 maxed out is 48°C, using a Frostbyte 360mm radiator.
The 295X2 is at 64°C at 1100/1400 (x4), +45 power; I don't OC my 290X past 1080/1400.


----------



## ljreyl

Quote:


> Originally Posted by *ImperialOne*
> 
> I have 4770K at 4.4gHz @ 1.28v; temp under Crysis 3 maxed out is 48... using a Frostbyte 360mm radiator.
> The 295x2 is at 64c at 1100/1400(x4) +45 power, I don't OC my 290x at 1080/1400.


That's a lot lower than mine... and you upped the voltage. Are you on a custom loop?
I have a feeling I either need to reseat my CPU block, or my 120+360mm radiator setup is close to maxing out its cooling potential with 4 blocks. Hmmm...


----------



## xer0h0ur

From what I can gather, you're supposed to have 120mm of radiator per component, be it GPU, CPU, chipset, RAM or whatever. Since we're dealing with dual-GPU cards, that means 240mm per 295X2 block.
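As a rough sketch of that rule of thumb (my own illustration, not an official sizing formula; it counts each GPU die as one component, so a 295X2 counts twice):

```python
def radiator_budget_mm(gpus=0, cpus=1, other=0):
    """Suggested total radiator length in mm: 120mm per cooled
    component (each GPU die counts once; a 295X2 has two dies)."""
    return 120 * (gpus + cpus + other)

# Trifire rig: one 295X2 (2 dies) + one 290X + a CPU -> 480mm of radiator
print(radiator_budget_mm(gpus=3, cpus=1))  # -> 480
```

By that math, the 360+120mm four-block loop discussed above lands exactly on budget.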


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> From what I can gather you're supposed to have 120 per component be it GPU, CPU, Chipset, RAM or whatever. Since were dealing with dual GPU cards that means 240 per 295X2 block.


That's pretty much what I went with. Before I added the 295, I just had a CPU and GPU on the 360+120 setup.
The only thing that bothers me is that my CPU temp seems higher than what other people get, and I'm on undervolted stock turbo clocks.
It could be that most users and reviewers run the CPU in its own loop, which eliminates other sources of heat.
Or my CPU block needs to be reseated.
Or my entire loop is nearly maxed out. The D5 is at setting 5, too; setting it lower raises temps.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> That's pretty much what I went with. Before I added the 295, I just had a cpu and gpu with the 360+120 set up.
> The only thing that bothers me is that my cpu temp seems higher than what other people get, and I'm on undervolted stock turbo clocks.
> It could be that most users or reviewers are running the cpu in a closed loop so that eliminates other sources of heat.
> Or my cpu block needs to be reseated.
> Or my entire loop is almost maxed out. D5 is at setting 5 too. Setting it lower raises temps.


Have you considered delidding it? I shaved off a whole 19 degrees at load.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Have you considered de lidding it? i shaved off a whole 19 degrees at load


I've been thinking about that since the day after I installed it, lol. Might do it tomorrow. Still debating. I'm pretty afraid of messing up, even though I love doing that stuff.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> I've been thinking about that since the day after I installed it lol. Might do it tomorrow. Still debating. Im pretty afraid of messing up even though I love doing that stuff


I'm not even just saying this... but if I can do it, then anyone can, lol. I did it in about 10 minutes with a regular razor blade, then another 15-20 minutes cleaning the resin off and applying the liquid metal, and job done. The only way I can see someone messing up is if they go at it like Mike Tyson or something. A sharp blade cuts through it like cheese. Just play BF4 or Crysis first to warm up that bad boy... then you're done.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Im not even just saying this ... but if i can do it then anyone can lol , i dint it in about 10 mins with a regular razor blade then another 15-20 mins cleaning the resin off and applying the liquid silver then job done. The only way i can see someone messing up is if they go at it like mike Tyson or something. A sharp blade cuts through it like cheese, just play BF4 or Crysis first to warm up that bad boy... then your done


So I do have to use something like CLU?
Gelid extreme won't do?
Now that's even scarier lol.


----------



## Syceo

Coollaboratory Liquid Pro is what I used, applied with a cotton bud, and job done.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Coollaboratory Liquid Pro Liquid is what I used. And a cotton bud and job done


I don't have any of that stuff around my area, lol. Guess I'll order some and do more research.
Seems like a 10°C difference on average with the 4790K.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> So I do have to use something like CLU?
> Gelid extreme won't do?
> Now that's even scarier lol.


Just out of curiosity, what slot is your 295X2 in?


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Just out of curiosity, what slot is your 295X2 in


Just for reference, the mobo is a Hero VII.
The 295 is in the top PCIe 3.0 slot and the 290X is in the bottom slot, separated by two slots in the middle. Both run at PCIe x8.
I'll take a picture when I get home and upload my setup.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Just for reference, mobo is a Hero VII
> 295 is top PCIE 3 slot and 290x is bottom slot. Separated by 2 slots in the middle. Both run at PCIE x8.
> I'll take a picture when I get home and upload my set up.


The reason I'm asking is that I believe my motherboard does not have a PLX chip, hence this could be the reason the 290X in the second slot is underperforming.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> the reason im asking is because i believe my motherboard does not have a plx chip hence this could be reason the 290x card in the second lane is under performing.


What mobo do you have again?


----------



## electro2u

I've got my RIVBE up and running, and of course FFXIV is still doing this weird crossfire-one-time-only thing to me, lol. Love the board, and the 4820K is blazing fast at 4.7GHz, though. Kind of disappointed I'm still having problems, but that's life.


----------



## Sgt Bilko

So it's been a while since I've been in here, and I'm thinking about grabbing one of these since the prices have dropped.









Does it have voltage control yet and how have people been going with 295x2 + 290/x crossfire?


----------



## electro2u

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So it's been a while since i've been in here and i'm thinking about grabbing one of these since the prices have dropped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does it have voltage control yet and how have people been going with 295x2 + 290/x crossfire?


Yeah voltage control works. Personally I haven't found a way to monitor both GPUs without disabling ULPS using MSI afterburner.


----------



## Sgt Bilko

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> So it's been a while since i've been in here and i'm thinking about grabbing one of these since the prices have dropped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does it have voltage control yet and how have people been going with 295x2 + 290/x crossfire?
> 
> 
> 
> Yeah voltage control works. Personally I haven't found a way to monitor both GPUs without disabling ULPS using MSI afterburner.
Click to expand...

Awesome, I already have ULPS disabled on my rig and that would be carried over.

No issues with the Hybrid crossfire? (apart from games that don't support it)


----------



## electro2u

I'm probably not the best person to ask, but it seems to me 290x+295x2 works great--it also doubles as an excellent central heating unit if your main is on the fritz. I have a weird issue with FFXIV in windows 8.1 that apparently no one can duplicate, explain or help me with (it's kind of all over the internet now...) but aside from that and typical DX9 crossfire stuff (DX9 unigine valley is a good example), everything is dandy. I really only have significant heat output while folding, which is so bad that I really can't do it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *electro2u*
> 
> I'm probably not the best person to ask, but it seems to me 290x+295x2 works great--it also doubles as an excellent central heating unit if your main is on the fritz. I have a weird issue with FFXIV in windows 8.1 that apparently no one can duplicate, explain or help me with (it's kind of all over the internet now...) but aside from that and typical DX9 crossfire stuff (DX9 unigine valley is a good example), everything is dandy. I really only have significant heat output while folding, which is so bad that I really can't do it.


295x2 gets a bit warm? What's the throttle limit for it?


----------



## electro2u

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 295x2 gets a bit warm? What's the throttle limit for it?


The stock target/throttle temp is 75°C, and it's adjustable downward, but people using the stock cooler often see throttling a little below that... For whatever reason there is no VRM temp sensor exposed, so it seems unlikely VRM temps are directly tied to the throttle point, but it is clear that improving VRM cooling by upgrading the thermal pads helps keep throttling at bay.

For what it's worth, I have no throttling issues and my GPUs don't get anywhere near 75°C, but the heat output from my radiators is really impressive. Folding will literally outmuscle the air conditioning in my room, and it gets unpleasant after a couple hours. Kind of like having a fireplace in here.


----------



## Sgt Bilko

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 295x2 gets a bit warm? What's the throttle limit for it?
> 
> 
> 
> The stock settings' target temp/throttle temp is 75C and it is adjustable downward but people using the stock cooler are often seeing throttling a little bit lower than that... For whatever reason there is no VRM temp sensor information available so it seems unlikely the VRM temps are directly connected to throttle point, but it is clear that improving VRM cooling by upgrading the thermal pads helps keep throttling at bay.
> 
> For what it's worth I have no throttling issues and my GPUs don't get anywhere near 75C but the heat output from my radiators is really really impressive. Folding will literally outmuscle the air conditioning going to my room and it gets unpleasant after a couple hours. Kinda like I had a fireplace in here.
Click to expand...

Well, I've got some Fujipoly here, so I can swap the stock pads pretty easily; I might also get some better airflow going in my case as well.

Going to try out some 295x2 and R9 290 Quadfire to start with then just keep one 290 and have a 24/7 Tri-Fire setup.

Thanks for all the info man, Rep+


----------



## electro2u

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thanks for all the info man, Rep+


Absolutely


----------



## ljreyl

Forgot to post that I own one lol

Corsair 750D
Asus Maximus Hero VII
4790K @ 4.4GHz 1.115V
295x2 @ 1069/1375, stock volts
290x @ 1069/1375, stock volts
16GB DDR3 GSkill TridentX @ 1866 8-9-9-24-2N
2X 750GB EVO840 SSD
Corsair AX1200i
EK PE Series Radiators 360mm + 120mm with 4x Noctua NF-F12 Fans, 7V
EK Supremacy
EK GPU Blocks
EK X-Res 140 w/MCP655 @S5
Asus Xonar U3 + Astro Mixamp + Sennheiser HD 598
BenQ XL2420T @ Portrait Mode, 120Hz
Asus ROG Swift @ Landscape Mode, 120Hz

With this setup, I get the following frames at 1440p, max everything, minimum/average:
Crysis 3 - 65/93
Tomb Raider - 58/88
Heaven - 32/97

I can do more benchmarks if you want.

Temps:
CPU - up to 72c (with front panel of 750D off) (Probably gonna delid next week)
GPU - 58-64c (all GPU, depends on game intensity)
GPU VRM - 48-52c

Total system load is 950-1005 watts.


----------



## yifeng3007

Well, I'm glad I've finally decided to join the club... to ask for some help, lol.
Recently I was reading the thread, and some folks mentioned disabling ULPS for a performance gain. So I decided to give it a try and disabled ULPS in MSI Afterburner, restarted the PC, jumped into BF4, and it immediately crashed with a DirectX error. I then re-enabled ULPS in Afterburner, restarted the PC, disabled ULPS manually in the registry, restarted the PC again, jumped into the game, and it crashed again!
Can anyone please help me out?
P.S.: I run two R9 295X2s on a Gigabyte GA-Z97X-Gaming G1 WiFi-BK, powered by an AX1500i.


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, i'm glad, i've finally decided to join the club, to ask for some help, lol
> Recently, i was reading the thread, and some folks were mentioning disabling ULPS for some perfomance gain. So i decided to give it a try and disabled ULPS in MSI afterburner, restarted the PC, jumped in BF4 and it immediately crashed with Directx error. I than enabled ULPS in the Afterburner, restarted the PC and disabled ULPS manually in the registry, restarted the PC again, jumped in the game and it crashed again!
> Can anyone please help me out?..
> P.S.: i run two R9 295x2 on a Gigabyte GA-Z97X-Gaming G1 Wi-Fi BK, powered by AX1500i.


I'm guessing you're air cooling?


----------



## yifeng3007

Sorry for the late response, and thank you!
Erm, I am sorry, but I don't understand what is wrong with air cooling? :C
My 900D has both panels opened so all the components have less chance of overheating (a closed 900D under load gets almost 5 degrees hotter than when I remove all the side and front panels).


----------



## electro2u

Disabling Ulps
Quote:


> Originally Posted by *yifeng3007*
> 
> Sorry for the late response and thank You!
> Erm, i am sorry, but i don't understand what is wrong with air cooling? :C
> My 900D has both panels opened so all the components have less chances to overheat (Closed 900D under load gets almost 5 degrees hooter, than when i remove all the side and the front panels)


If the cards aren't throttling from overheating, then it's not a problem at all. Personally, my single 295X2 was lightly throttling after a single run through Unigine Valley at 1440p on air. So I went water.

I think disabling ULPS isn't necessary. It is sometimes (depending on a LOT of factors) the cause of instability or low performance; if you are already doing OK, it's not going to give you any added benefit, imo.
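For anyone going the registry route anyway: the method people usually describe is searching the registry for every `EnableUlps` value under the display-adapter class key and setting each one to 0. A sketch of what one such entry looks like as a .reg export (the `{4d36e968-...}` class key is the standard display-adapter GUID, but the `0000`/`0001` subkey indexes vary per system, so search rather than assume, and back up the registry first):

```
Windows Registry Editor Version 5.00

; One EnableUlps value exists per GPU instance under the display-adapter
; class key; repeat for 0001, 0002, ... as found by searching.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot afterwards; a driver reinstall can silently recreate these values.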


----------



## Syceo

Hey guys a bit of help choosing a new mobo

2 options are :

1. ASUS Z97-WS
2. MSI Z97 XPower AC

your thoughts would be appreciated .. Thanks in advance


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> Sorry for the late response and thank You!
> Erm, i am sorry, but i don't understand what is wrong with air cooling? :C
> My 900D has both panels opened so all the components have less chances to overheat (Closed 900D under load gets almost 5 degrees hooter, than when i remove all the side and the front panels)


Prior to watercooling my cards, the 295X2 used to crash BF4 due to the heat. That's why I was asking if you were on air.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*


Quote:


> Originally Posted by *Syceo*


Well, the only reason I tried turning off ULPS was that I was getting a LOT of microstuttering in BF4, and sometimes the game would start lagging incredibly, going from a 120-130 fps average down to 40(!!!) and back up again in a split second. It doesn't happen often, but it is still very frustrating.

I also forgot to mention that I tried to overclock the cards, but at 1100MHz core the 3DMark Fire Strike Extreme test won't start; downclocking to 1080MHz starts the test, but it crashes within a couple of seconds. And now, after disabling ULPS, I can change the core voltages in MSI Afterburner, but I don't know if I should, or what values to choose...

A quick update: I went ahead and tried increasing the voltages for stability along with the clocks, and everything works like a charm now. So my mistake was increasing the core speed too much without increasing the voltage.
And, yeah, I know I am doing some stupid stuff; most of you folks already know how and what to do with all this, but I am new to it, so please forgive my ignorance.









And, yeah, one more thing: it is actually pretty funny. As I said, I get an average of 120-140 fps in BF4, depending on the map and the number of people, and since I tweaked around in MSI Afterburner I don't get SUCH huge fps drops as before, but I still get a lot of microstuttering. It almost feels like the cards are overheating, even though GPU-Z shows the radiator (or video card) fans running at about 30-40% and the GPUs at 65°C maximum. I just don't understand all this, lol.

BTW, when I join any empty server and stand in the middle of the map, or just anywhere, my fps usually climbs to 150 and falls back to 120-130... with no action going on, no explosions, nothing...


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, the only reason i tried turning off ULPS was because i was getting a LOT of microstuttering in BF4 and sometimes my game could start lagging incredibly, going from 120-130 average fps to a moment when it drops down to 40(!!!) and coming back up again in a split second, it doesn't happen often, but it is still very frustrating.
> 
> I also forgot to mention, that i tried to overclock them, but when going up to 1100 MHz core clock, 3DMark FS Extreme test won't start, down clocking it to 1080 MHz starts the test, but it crashes in a couple of seconds. And, now, after disabling ULPS, i can change the core voltages in the MSI Afterburner, but i don't know if i should do it and what are the values i should choose...
> 
> A quick update: i went ahead and decided to try to increase the voltages for stability and to increase the MHz and everything works like a charm now, so my mistake was increasing core speed too much, without increasing the voltages
> And, yeah, i know, i am doing some stupid stuff, most of you folks already know how and what to do with all this, but i am knew to it, so please forgive my ignorance


Ahhh... the cards were hungry for voltage. Makes sense.
I had to disable ULPS so I could monitor all 3 GPUs with HWiNFO64. For some reason, when it's on, the 2nd and 3rd GPUs aren't detected.


----------



## electro2u

Ignore


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Hey guys a bit of help choosing a new mobo
> 
> 2 options are :
> 
> 1. ASUS Z97-WS
> 2. MSI Z97 XPower AC
> 
> your thoughts would be appreciated .. Thanks in advance


Doesn't matter much, really. But I'd go with the Asus, if only because it matches your aesthetics. Also, a quick review search shows the ASUS is generally favored over the MSI.
BTW, your PLX question got me wondering, so I did some research last night. Since the 295X2 has its own PLX chip, it's fine for trifire. A motherboard PLX chip is really for multi-GPU setups across several slots.
Your current board and mine are rated for dual GPU; for a board to go beyond two cards, you need a PLX chip to multiplex the PCIe lanes.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Ahhh... the cards were hungry for voltage. Makes sense.
> I had to disable ULPS so i could monitor all 3 GPU's with HWINFO64. For some reason, when its on, the 2nd and 3rd gpu aren't sensed.


My guess is that since ULPS basically puts unused GPUs to sleep, HWiNFO might not be able to monitor them properly... Or, I don't know; better not to listen to me, I am really new to this stuff. That is just my opinion.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Ignore


I remember you wrote something about throttling, but honestly, I don't think that's the case. I've seen many people run this setup on a 1375-watt PSU with no problems... But if it is the PSU, I also have a Lepa P1700M available; I might try it sometime. I'm just too lazy to do it, because I'd need to redo all the wiring again, lol. I'd rather check every other possible thing before doing that.


----------



## yifeng3007

How do you guys check power usage? I am really wondering how much power my PC draws...


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> I remember, you wrote something about throttling, but, honestly, i don't think that's the case. I've seen many people having this setup running with 1375 Watt PSU with no problems... But if it is, i also have Lepa P1700M available, i might check it out sometime. I'm just to lazy to do it, because i'll need to do all the unwiring again, lol I'd rather check every possible thing, before doing that.


Throttling would be due to excess heat, but if your cores are at 65°C then you are in great shape.
You certainly have enough power. The concern about using a single PSU for two 295X2s is that PCIe power cables are rated for a certain amperage, and AMD decided to just ignore the specification. So if you are pulling on the order of 100 amps from a single 12V rail... that's a bit nuts, but probably not dangerous. It places the strain on the PSU, not the cards.
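To put rough numbers on that (my own back-of-the-envelope, assuming essentially the whole system load sits on the 12V rail):

```python
def rail_current_amps(load_watts, rail_volts=12.0):
    """Approximate current on the 12V rail for a given DC load
    (assumes essentially the entire load is carried by 12V)."""
    return load_watts / rail_volts

# A ~1050W trifire load works out to roughly 87.5A on a single 12V rail
print(rail_current_amps(1050))  # -> 87.5
```

Which is why a strong single-rail (or high-combined-rail) PSU matters more for these cards than the headline wattage.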


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> Hey guys a bit of help choosing a new mobo
> 
> 2 options are :
> 
> 1. ASUS Z97-WS
> 2. MSI Z97 XPower AC
> 
> your thoughts would be appreciated .. Thanks in advance


Have you checked the Gigabyte GA-Z97X-Gaming G1 WiFi-BK? This thing has a quad-core audio processor, swappable op-amps, a PLX chip, 168 hours of factory stress testing under load, tons of fan-control options, water-cooling-ready heatsinks, DAC-UP USB ports, and more, lol.

But personally, out of those two I would choose the ASUS Z97-WS.
I already own an MSI Z97 MPower MAX AC, which is, imho, a very, very good motherboard, and I assume the XPower is even better. But I think MSI's MPower/XPower series boards are overkill for regular users: they have a lot of extra power connectors and all these extreme-OC features. If you are not planning to overclock the heck out of your system, I wouldn't recommend that board.


----------



## yifeng3007

I am just curious, is this a normal test result for my system? Or should it be better/worse?
http://www.3dmark.com/fs/2852235


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> I am just curious, is this a normal test result for my system? Or is it has to be better/worse?
> http://www.3dmark.com/fs/2852235


Wow, you're much higher than I am, and I'm running 3 GPUs overclocked...
I wonder if it's your drivers... which are you using?


----------



## electro2u

Maybe it's a placebo effect, but everything seems smoother to me on X79 compared to Z97. Just using a single 295X2 with a 4820K right now and, yeah, desktop stuff is snappier.
Quote:


> Originally Posted by *ljreyl*
> 
> Wow, you're much higher than I am, and I'm running 3 GPUs overclocked...
> I wonder if its your drivers... what're you using?


Well that's 4 GPUs...


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Maybe it's placebo effect but everything seems smoother to me on x79 compared to z97. Just using a single 295x2 with a 4820k right now and, yeah, desktop stuff is snappier.
> Well that's 4 GPUs...


HAHAHAHA I read his specs wrong lol. Makes perfect sense now.

BTW, Electro2u, what's your load wattage like?
And how many radiators do you have/temps of your components?


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> HAHAHAHA I read his specs wrong lol. Makes perfect sense now.
> 
> BTW, Electro2u, what's your load wattage like?
> And are you under custom water?


With just the 295x2 running I don't even hit 800 watts. Adding in the 290x means my UPS (battery backup) can't handle the load and I don't have any other way to test. It's going to be around 1050W in trifire.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Maybe it's placebo effect but everything seems smoother to me on x79 compared to z97. Just using a single 295x2 with a 4820k right now and, yeah, desktop stuff is snappier.


Well, your processor is way faster than mine, and it also has 40 PCI Express lanes, which means you don't even need a PLX chip to get x16 speeds for both cards if you add another R9 295X2. My CPU only has 16 lanes, so to get x16 for both cards I have to use the PCI Express slots behind the PLX chip.
That might actually be the reason for the microstuttering I'm experiencing; subjectively, I get much more microstuttering than the people who have reviewed dual R9 295X2 setups similar to mine...


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> With just the 295x2 running I don't even hit 800 watts. Adding in the 290x means my UPS (battery backup) can't handle the load and I don't have any other way to test. It's going to be around 1050W in trifire.


How do you check the wattages? Are you using some kind of a software or a device like Kill-A-Watt?


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Doesn't matter really. But I'd go with the Asus just because you have the aesthetics for it. Also, a quick review search shows that the ASUS is favored over the MSI.
> BTW, your PLX thing got me wondering so I did research last night. Since the 295X2 has a PLX chip, it's fine to work in Trifire. The PLX really is for multiGPU solutions on the motherboard.
> Your current board and mine are rated for Dual GPU. For a board to go over 2, you will need a PLX chip for PCIE optimization.


Yes, the 295X2 has a PLX chip on it, but doesn't a single 290X also require an x16 slot of its own?


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> Yes the 295x2 has a plx on it, but doesnt a single 290x also require a 16x slot of its own?


You are right, every card would run better at x16 speed, but that gives at most about a 2 percent performance increase compared to running at x8. Dual-GPU cards, meanwhile, benefit from NOT going through a motherboard PLX chip, because they already have one on board. With multiple PLX chips in the chain your average FPS and non-gaming workloads can still look fine, but the extra hop adds latency, and in games the microstuttering increases a lot.
I am really not sure if what I am saying even makes sense; please correct me if I am wrong, lol.


----------



## yifeng3007

Wow, I really don't know how this is possible, but I reached number 81 in the world in Fire Strike Extreme D:
http://www.3dmark.com/fs/2853329
http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/4+gpu


----------



## stxe34

Hi, a bit off subject, but I was on YouTube and noticed this...



Now, I have a very similar setup with 2x R9 295X2s, but with a 3930K clocked at 4.8GHz as opposed to a 4770K. In the vid his fps is around 140 at 4K... I get around 70 fps at the start of the game, +/- 5-10 fps. Is this vid legit? It's making me concerned my rig is not performing as it should! BTW, that's ultra with FXAA.
Thanks for any input!


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> You are right, every single card would run better with x16 speed, but it would give them only a maximum of 2 percent increase in performance, comparing to if they would run at x8 speeds. While, double GPU videocards benefit much more, if they are NOT using a PLX chip, because they already have one inside. It means that, if there are too much interference like these multiple PLX chips, it makes your FPS higher and non-gaming programs work better, but in games, your microstuttering increases a lot.
> I am really not sure if what i am saying even makes sense, please, correct me, if i am wrong, lol


So the 295X2 in an x16 and the 290X in an x8 is absolutely fine?


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> so the 295x2 in a x16 and the 290x in a x8 is absolutely fine?


Well, if you are a casual gamer you won't see a difference, because the increase is maybe 5 fps at most.

BUT, if you are asking about your Asus Z87-PRO, then you are running your setup at x8/x8. It means you won't get quite as many fps as you would at x16/x16, but you get much less stuttering (lag).

The point is: if a CPU supports 16 lanes, then one card in an x16 slot runs at x16 speed. With two cards on the same CPU, even if the mobo has x16 and x8 slots, the CPU feeds both cards equally at x8. Unless the mobo has a PLX chip, which multiplexes the lanes so the board can offer two PCIe x16 slots and run two graphics cards at x16. BUT, since there is now one more hop between the graphics cards and the CPU (the PLX), the latency between CPU and GPUs increases, which in-game looks like small hitches even when fps are high. That is called microstuttering, which is very annoying.
And if a CPU supports 40 lanes, it doesn't need a PLX chip for two GPUs to run at x16, because 40 - (16 + 16) = 8, so the CPU still has 8 lanes left over to supply something else.

And again, this is only what I know and have heard from various sources; if you want solid facts, you might want to search for yourself.
Here is an interesting article about x8 vs x16: http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/

I hope someone can understand at least some of what I wrote here D: Sorry for my English :C
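The lane budgeting above can be sketched like this (my own illustration of the arithmetic, not how the BIOS actually negotiates link widths):

```python
def spare_lanes(cpu_lanes, slot_widths):
    """CPU PCIe lanes left over after each slot gets its link width.
    Raises when the slots ask for more lanes than the CPU has --
    the case where boards either drop to x8/x8 or add a PLX switch."""
    used = sum(slot_widths)
    if used > cpu_lanes:
        raise ValueError("not enough CPU lanes: drop link widths or add a PLX switch")
    return cpu_lanes - used

# 40-lane CPU, two x16 cards: 40 - (16 + 16) = 8 lanes to spare
print(spare_lanes(40, [16, 16]))  # -> 8
# A 16-lane CPU can't feed two x16 slots natively, so boards fall back to x8/x8
print(spare_lanes(16, [8, 8]))    # -> 0
```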


----------



## yifeng3007

Quote:


> Originally Posted by *stxe34*
> 
> hi, a bit off subject but i was on youtube and noticed this..
> 
> 
> 
> now i have a very similar setup with 2 x r9 295x2's but with a 3930k clocked at 4.8ghz as apposed to a 4770k. now on the vid his fps is around the 140fps at 4k...i get around 70fps at the start of the game + - 5-10 fps. is this vid legit? as its making me concerned my rig is not performing as it should! btw thats ultra with fxaa
> thanks for any input!


I just checked what I get at 1080p with everything maxed out and 8x MSAA, and I only get 80 fps minimum, with about 100 fps average. My guess is that vid is legit. That guy doesn't have everything maxed out, only very high settings, and he also uses just 2x MSAA. Then again, he is running at 4K, so as I said, I am only guessing.
Here is a good review of the R9 295X2 in quadfire with a Crysis 3 benchmark at 4K on very high settings: http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review/5#.VCij5vmSwjw And, yeah, these guys got an average of 85 fps with 4x MSAA.

Correct me if I am wrong...


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> I just checked what i have at 1080p with everything maxed out with 8x MSAA and i only get 80 fps minimum, with about 100 fps average. My guess is, that vid is legit. That guy doesn't have everything maxed out, only on very high settings, and he also has just 2x MSAA. Yet, he is running it at 4K, so, as i said, i am only guessing.
> Here is a good review about R9 295x2 in Quadfire with a Crysis 3 benchmark at 4K in very high settings - http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review/5#.VCij5vmSwjw And, yeah, these guys got an average of 85 fps with 4x MSAA
> 
> Correct me, if i am wrong...


There is no way he is running at 4K and getting 120fps; he is more than likely using 1080p or 1440p. 4K displays can only do 60Hz at the moment.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, if you are a casual gamer you won't see a difference, because the increase may be somewhere about 5 fps maximum.
> 
> BUT, if you are asking about your Asus Z87-PRO, then you are running your setup at x8/x8. It means that you won't get as much fps as you would get with x16/x16, but you get much less stuttering (lag).
> 
> The point is, if a CPU supports 16 lanes, then one card connected to one x16 slot will run at x16 speed. If there are two cards with the same CPU, even if the mobo has x16 and x8 slots, the CPU will power both cards equally at x8 speeds. Unless, the mobo has a PLX chip, which adds more lanes (or spreads, or multiplies, or whatever it does there) so that the mobo could have two PCIe x16 and support two graphics cards at x16 speeds, BUT, since there is one more thing between graphics cards and CPU (i am talking about PLX) it increases the latency between CPU and GPUs, which in game looks like small lags, even if fps are high. That is called microstuttering, which is very annoying.
> And if a CPU supports 40 lanes, then it doesn't need a PLX chip for GPUs to run at x16 speeds, because, for example, if you have two GPUs that need x16 each, then 40 - 16 - 16 = 8, so the CPU will still have 8 lanes available to supply something else.
> 
> And, again, it is only what i know and heard from different sources, and if you want a solid fact, then you might want to search for yourself.
> Here is an interesting article about x8 vs x16 - http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/
> 
> I hope anyone can understand at least something i wrote here D: Sorry for my English :C


Hey Syceo, try running your cards with your sound card removed. Maybe the sound card is taking up lanes your cards should have, costing you performance.


----------



## dcombs108

Anyone have a tutorial for replacing the 120mm radiator with a 240mm one?


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> there is no way he is running at 4k and getting 120fps, he is more than likely using a 1080 or 1440. 4K can only do 60hz at the moment


Well, that guy can still render 120+ fps in-game while playing on a 60Hz screen, like I do at 1080p, lol.
But there is something off about that video, I just can't put my finger on it... 4K, everything on Very High with 2x MSAA, and 120+ fps? Judging by the look of the fps meter, he is probably using Fraps, which is actually not a very reliable way to monitor your fps... But if you were using Fraps as well and still don't get the same fps, then I just don't know, sorry...


----------



## yifeng3007

Can anyone please explain how to replace the stock fan and add another one on top of the radiator? Removing the stock fan seems easy enough, but how do you mount a second fan, and where do you plug it in?


----------



## Nichismo

hey guys

I noticed the XFX 295X2 is on Newegg for $999.99 - is this a really good price? Everywhere else I've looked it seems to be a lot more...

I'm interested in buying one with an EK block and the single-slot PCI bracket, but I've never owned an AMD card. I'm a little skeptical about the port connectivity and the weight of the card... does it seem to sag at all?

I'm pretty tempted to buy one, but need a little affirmation.

Thanks


----------



## stxe34

Quote:


> Originally Posted by *Syceo*
> 
> there is no way he is running at 4k and getting 120fps, he is more than likely using a 1080 or 1440. 4K can only do 60hz at the moment


That's what I thought. It's as if he altered the vid and put a 1 in front of his fps counter lol.


----------



## yifeng3007

Quote:


> Originally Posted by *stxe34*
> 
> thats what i thought its as if he has altered the vid and put a 1 in front of his fps count lol.


Then how come the resolution in the options menu shows 4K?


----------



## stxe34

Quote:


> Originally Posted by *yifeng3007*
> 
> Then how is the resolution in the options menu shows 4K res?


Don't know, but as I said I have a similar setup except for the CPU, which is a 4770K in his case (mine is a 3930K clocked at 4.8GHz), so I'd say reasonably high spec, and I'm touching 80fps at the start of Post Human. Even my intro vids are locked to 30fps... his are 140fps? The majority of vids on YouTube are between 60-80 fps; this is another 60 fps on top of that. Can't see it.


----------



## electro2u

Quote:


> Originally Posted by *stxe34*
> 
> thats what i thought its as if he has altered the vid and put a 1 in front of his fps count lol.


That HardOCP review is of Quadfire (2x 295X2), not Trifire. They do their numbers weird over there, but they don't normally flat-out lie.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> That HardOCP review is of QuadFire 2x295x2, not triFire. They do the numbers weird over there, but they don't normally flat out lie.


Erm, stxe34 says he has a similar setup to that guy, which means they both have two R9 295X2s - that's Quadfire - which is why I linked that HardOCP review. They also have a Trifire test, btw.
Why do you think they make the numbers weird?







I think, imo, that review is the best out there, since it covers almost everything and shows the performance of the two cards quite accurately...


----------



## electro2u

Quote:


> Originally Posted by *Nichismo*
> 
> hey guys
> 
> I noticed the 295x2 XFX card is on Newegg for 999.99$, is this a really good price? Everywhere else ive looked it seems to be alot more....
> 
> Im interested in buying one with an EK block and the single slot PCI, but ive never owned an AMD card. Im a little skeptical about the port connectivity, and the weight of the card... does it seem to sag at all?
> 
> im pretty tempted to buy one, but need a little affirmation.
> 
> Thanks


Hi. That $999 price should be standard on all AIO liquid-cooled 295X2s, and it is a good deal imo. I paid $1,500 for mine when they first came out and don't really mind, even though I could certainly use the $500 back. Having so much power in ONE PCIe slot is just cool, whether or not it is necessary.

I put an Aquacomputer block and an EK backplate on my 295X2, and it does sag at the end a little bit, maybe 1.5mm. The block is very heavy, heavier than the stock heatsink by a good bit, I think. I used a stop fitting wrapped in electrical tape to prop my 295X2 on the SATA ports of my Rampage IV Black Edition for support, and it is perfectly straight now. I have seen other people use this technique (with various objects, not necessarily a stop fitting; it just happened to be almost the perfect width for the spot in my case).

As for Eyefinity support, I cannot comment, but I suspect most of the kinks have been worked out by now.


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> Erm, if stxe34 says, that he has similar setup to that. guy. that means, that they both have two R9 295x2, which is Quadfire, that's why i've sent that review by HardOCP. They also have Trifire test, btw.
> Why do you think they are making the numbers weird?
> 
> 
> 
> 
> 
> 
> 
> I think, imo, that review is the best out there, since it covers up mostly everything and really shows the performance of two cards quiet accurate...


Sorry, I meant to be addressing Syceo, who is using Trifire, but I forgot to multi-quote. Didn't even see the video till now. Looks fake for sure. Dude's full of it, imo - he says there are only 2 boards on the market that support 4-way CrossFire, but any board that can do x8/x8 will fully support 2x 295X2.

I think [H]'s numbers are weird because they don't compare apples to apples. They use the highest playable configuration for each setup, so you get apples to oranges to grapes.


----------



## Nichismo

^^^ ah, ic. Thanks for the info

Having owned many mini-ITX rigs in the past, and having used many external radiators, I always wanted a powerful single-slot GPU... I've wanted quick disconnects with panel mounts on one of those PCI slot covers I've seen from Koolance and AQ, but since many mini-ITX cases only have 2 PCI slots, I never got the chance. This really is a fascinating card.

But I still wonder whether using the single-slot bracket would hinder support of the card's weight.

I was considering buying one of these (apparently, it is universal too):




But at $15, I'm not sure how well it might work, and I honestly feel like that's pretty pricey (but then again, isn't it always?)

I also saw this picture on the MainGear Facebook page:



and those two seem to be pretty straight... but perhaps it's just the picture angle. I would imagine having two cards connected through the terminal bridge would produce even more stress...

This issue may sound petty, but the little things are important to me when I'm putting in this much work.


----------



## electro2u

I suspect mine was straight and bent over time. Now I'm bending it back, I guess? I'm not sure if that Bitspower bracket would help or not; I still think fishing line from the top, or an anchor below, is the only way to be absolutely sure the PCB won't bend.


----------



## stxe34

Quote:


> Originally Posted by *electro2u*
> 
> Sorry meant to be addressing Syceo who is using TriFire, but I forgot to multi quote. Didn't even see the video til now. Looks fake for sure. Dude's full of it, imo. says theres only 2 boards on the market that support 4way crossfire but any board that can do 8x/8x will fully support 2x295x2.
> 
> I think H's numbers are weird because they don't compare apples to apples. They use the highest playable configuration for each setup. So you get apples to oranges to grapes.


That's what I thought, thanks.


----------



## doctakedooty

Quote:


> Originally Posted by *Nichismo*
> 
> hey guys
> 
> I noticed the 295x2 XFX card is on Newegg for 999.99$, is this a really good price? Everywhere else ive looked it seems to be alot more....
> 
> Im interested in buying one with an EK block and the single slot PCI, but ive never owned an AMD card. Im a little skeptical about the port connectivity, and the weight of the card... does it seem to sag at all?
> 
> im pretty tempted to buy one, but need a little affirmation.
> 
> Thanks


I'll give you my experience, as I just got mine about a month ago. In my 4930K rig I have 3 780 Tis, and in my new mITX build I wanted a GPU that could handle 1440p and still do 120 fps, so I was pretty much limited to the 295X2 or the Titan Z, and I was not about to spend $3,000 on a GPU. One of the main games I play is BF4, but I play a lot of other games too. I actually like the 295X2 more than my tri-SLI 780 Tis: in most games I get equal or slightly lower fps, but it feels slightly smoother, with a lot fewer dips in framerate. Jumping from Nvidia to AMD on this build, I was kind of skeptical and debated it for a while before I finally did it, but in the end I was definitely happy I made the switch and love the card overall.


----------



## ljreyl

Quote:


> Originally Posted by *stxe34*
> 
> thats what i thought thanks


After reading the comments on whether PCIe 3.0 at x16 or x8 is enough, I went and found this review:
http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/

Since it was at 1440p and they used basically the same stuff (a 4960X/M5E vs my 4790K/M7H), I could do a comparison.

At the same resolution and the same settings, the scaling they got on their system was identical to mine, if not SLOWER.

The 3 things their system had over mine: the M5E has a built-in PLX chip, the 4960X has more PCIe lanes (40 vs 16), and their CPU was clocked higher (4.6GHz vs my 4.4GHz).
The only thing I had over their system was faster GPU clocks (they ran 1018/1250 and 1000/1250 vs my 1069/1375 for all 3 GPUs).

So, what can we conclude?
-PCIe x8 is enough bandwidth for the 295X2 and 290X
-Motherboards with no PLX chip will be fine up to dual graphics at x8 each
-Trifire (specifically 295X2 + 290X) on a motherboard with no PLX chip will work fine, because the 295X2 has a PLX on its own PCB

When I get home, I can post screenshots if anyone wants.
The only one I can remember is that my Firestrike Extreme got a score of 12501 when I ran it last night.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> I suspect mine was straight and bent over time. Now I'm bending it back I guess? I'm not sure if that Bitspower bracket would help or not, I still think fishing line from top or an anchor below is the only way to be absolutely sure the PCB won't bend.


Quote:


> Originally Posted by *ljreyl*
> 
> After reading comments on if PCIE 3.0 @ 16x or 8x is enough, etc... I went and found this review:
> http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/
> 
> Since it was 1440P and they were basically using nearly the same stuff (They had a 4960X/M5E vs my 4790K/M7H) I could do a comparison.
> 
> At the same resolution and the same settings, the scaling they had on their system was identical if not SLOWER than mine.
> 
> The 3 things their system had over mine was that the M5E has a built in PLX chip, the 4960X has more PCIE lanes (40 vs 16) and the CPU is clocked higher than me (4.6GHZ vs my 4.4GHz)
> The only thing I had over their system was faster GPU clocks (they ran 1018/1250 and 1000/1250 vs my 1069/1375 for all 3 GPUs)
> 
> So, what can we conclude?
> -PCIE 8X for the 295X2 and 290X is enough bandwidth
> -Motherboards with no PLX chip will be fine up to dual graphics at 8x each
> -Trifire (specifically 295X2/290X) on a motherboard with no PLX chip will work fine because the 295X2 has a PLX on the PCB itself
> 
> When I get home, I can post screenshots if anyone wants.
> The only one I can remember is that my Firestrike Extreme got a score of 12501 when I ran it last night.


Fantastic, that really cleared things up for me







However, I think 90% of my "need" to upgrade my mobo was just that I cannot resist the urge to spend needless amounts of money on upgrades that add no significant performance benefit. The money I had intended to use for a new mobo and CPU will now go towards feeding my children *****


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Fantastic, that really cleared things up for me
> 
> 
> 
> 
> 
> 
> 
> however I think 90% of my need to upgrade my mobo was due to the fact that i simply cannot resist the urge to spend needless amounts of money on upgrades that add no significant performance benefits to the system. The money i had intended to use to purchase a new mobo and cpu will now go towards feeding my children *****


LOL. The computer is more important than a human being. Upgrade that thing!


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> LOL. The computer is more important than a human being. Upgrade that thing!


I have calculated that if I don't treat the kids this week, or take the wife out, I could probably get away with upgrading this motherboard. Those benchmarks from Digital Storm were on an X79 board, and I'm still convinced that an upgrade to X79 and Ivy Bridge-E may be beneficial for performance. I'm currently getting 11735 in Fire Strike Extreme, whereas their result on the X79 setup is 12222. I want to make use of this trifire setup I have; I want to squeeze out every last drop of available performance.

So what do you think? X79 and a 6-core Ivy, or just call it a day?
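For reference, the gap between my score and theirs works out to about 4% (quick arithmetic on the two numbers above):

```python
def pct_gain(mine, theirs):
    """Percent improvement of score 'theirs' over score 'mine'."""
    return (theirs - mine) / mine * 100

# my Fire Strike Extreme 11735 vs Digital Storm's X79 run of 12222
print(round(pct_gain(11735, 12222), 1))  # 4.1
```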


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> I have calculated that if i dont treat the kids this week, and take the wife out i could probably get away with updating this motherboard that i have. Those benchmarks from digi storm where on a x79 board, I am still convinced that an upgrade to the x79 and ivybridge e may still be beneficial for performance. Im currently getting 11735 on fire-strike extreme whereas there results on the x79 setup are 12222. I want to make use of this trifire setup i have , i want to squeeze out every last drop of available performance,
> 
> So what do you think? X79 & 6 core IVY or just call it a day?


Lulz. How about removing your sound card and trying again? That extra lane of bandwidth could up your score... just sayin...


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Lulz, How about removing your sound card and trying the system again? That extra lane of bandwidth could up your score... just sayin...


Oh yes, you did mention that before. I'll give that a shot now and see if it makes any difference.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Oh yes you did mention that before , il give that a shot now and see if it makes any difference.












And BTW, getting a 4790K would probably be better than an X79 and 4960X. I already get higher scores than their system, for cheaper.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And BTW, getting a 4790K would probably be better than an X79 and 4960X. I already get higher scores than their system, for cheaper.


So that would mean a Z97 mobo.

Just did a run of Fire Strike without the sound card and I gained 30 points.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> After reading comments on if PCIE 3.0 @ 16x or 8x is enough, etc... I went and found this review:
> http://www.digitalstormonline.com/unlocked/amd-radeon-r9-295x2-and-r9-290x-hybrid-crossfire-review-multi-gpu-scaling-idnum251/
> 
> Since it was 1440P and they were basically using nearly the same stuff (They had a 4960X/M5E vs my 4790K/M7H) I could do a comparison.
> 
> At the same resolution and the same settings, the scaling they had on their system was identical, if not SLOWER than mine.
> 
> The 3 things their system had over mine was that the M5E has a built in PLX chip, the 4960X has more PCIE lanes (40 vs 16) and the CPU is clocked higher (4.6GHZ vs my 4.4GHz)
> The only thing I had over their system was faster GPU clocks (they ran 1018/1250 and 1000/1250 vs my 1069/1375 for all 3 GPUs)
> 
> So, what can we conclude?
> -PCIE 8X for the 295X2 and 290X is enough bandwidth
> -Motherboards with no PLX chip will be fine up to dual graphics at 8x each
> -Trifire (specifically 295X2/290X) on a motherboard with no PLX chip will work fine because the 295X2 has a PLX on the PCB itself
> 
> When I get home, I can post screenshots if anyone wants.
> The only one I can remember is that my Firestrike Extreme got a score of 12501 when I ran it last night.


That is a great post, and it clearly shows what I was talking about earlier: x8 speed is absolutely fine, but you are still losing 2-3 frames per second compared to full-speed x16 without a PLX chip.
For example, as I said before, I just hit a 15495 score in FS Extreme (http://www.3dmark.com/fs/2853329, number 81 in the world :O), and I have an i7-4790K and a mobo with a PLX chip. Now, if I just swapped my CPU for, say, a 4930K, and left the rest of the GPU settings untouched, I would probably get a 16000+ score.


----------



## yifeng3007

BTW, guys, do you know the proper voltages for 1100 MHz core and 1600 MHz memory? I can get it to run at +55 in MSI Afterburner and complete FS Extreme, but when I jump into BF4, it crashes immediately.
Funny thing is, I got that 15495 score with lower voltages and lower core and mem speeds, lol.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> That is a great post, clearly represents what i was talking about earlier: 8x speed is absolutely fine, however, you are still loosing 2-3 frames per second, comparing to full speed 16x without the PLX chip.
> For example, as i said before, i just hit 15495 score in FS Extreme (http://www.3dmark.com/fs/2853329, number 81 in the world :O), and i have an i7-4790K and a mobo with a PLX chip. Now, if i just swap my cpu, to say, 4930K, and leave the rest of the GPU settings untouched, i would probably get about 16000+ score.


You could be right. But even then, the difference is within a small margin of error.
http://www.digitalstormonline.com/unlocked/4-way-quad-crossfire-amd-r9-295x2-benchmarks-at-4k-idnum228/

They also did a quadfire test, presumably with the same setup as the trifire bench (they were done 1 month apart): an M5E and a 4960X.
They BARELY got a lower score than you.

There have been various tests showing that no card can fully saturate a PCIe x16 slot, whether it's a Titan Z or a 295X2. I really think the differences come down to the various specs/setups everyone has.


----------



## Nichismo

Funny we're suddenly talking about X79 and IVB-E.

After my impulsive upgrade to X99, I now have a practically brand-new ASUS Rampage IV Black Edition and a 4930K...

I could bundle them and sell them really cheap if anyone is interested.
Quote:


> Originally Posted by *doctakedooty*
> 
> I will give you my experience as I just got mine about a month ago and in my 4930k rig I have 3 780Ti's and in my new mitx build I wanted a gpu that could handle 1440p and still do 120 fps so I was pretty much limitied to the 295x2 or titan z and I was not about to spend $3000 on a gpu. One of the main games I play is BF4 but I do play alot of other games too. The 295x2 I actually like more then my tri sli 780Ti in most games I get equal or a little bit less then my tri sli 780Ti and seems to be slightly smoother and alot less dips in framrate. So to me jumping from nvidia to amd on this build I was kind of skeptical and debated it for awhile before I finally did it but in the end I was definetly happy I made the switch and love the card overall.


Dang, really? Have you done any tests or anything to produce some numbers? I find that almost too good to be true, especially since you are running X79 with 40 PCIe lanes - you could really take advantage of 3-way SLI.

I'd kill for 3 780 Tis, and I have been considering that as well, due to the big price drops recently.

However, I'm also a little worried about the power consumption. What kind of PSU are you running?


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> BTW, guys, do you know the proper voltages for 1100 MHz core and 1600 MHz memory? I can get it run at +55 in MSI Afterburner and complete FS Extreme, but when i jump into BF4, it crashes immediately.
> Funny thing is, i got the score of 15495 with lower voltages and core and mem speeds, lol


Try upping it to +75 and the Power Limit to +50.
If that passes, step back down (AB moves in steps of 6, so the next step down would be +69).

Also, I would drop your memory to 1500 max. From what I saw with my single 290X, memory overclocks give extremely small returns. The best settings I had when I overclocked my 290X before trifire were 1120/1400 with +81mV and a +50 power limit. I kept a 1:1.25 core-to-memory ratio.

I got that ratio by dividing the stock clocks (1000/1250), giving 1.25. From what I've seen, keeping that ratio let me push the same clocks at lower voltages; I've read articles and user experiences on ratio overclocking, and some people have had the same stability and voltage results as me.
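In code form, the ratio trick looks like this (a rough sketch assuming the 290X reference clocks of 1000/1250 MHz; the helper name is just for illustration):

```python
STOCK_CORE_MHZ, STOCK_MEM_MHZ = 1000, 1250   # 290X reference clocks
RATIO = STOCK_MEM_MHZ / STOCK_CORE_MHZ       # 1.25

def mem_clock_for(core_mhz, ratio=RATIO):
    """Pick the memory clock that keeps the stock core:mem ratio,
    per the ratio-overclocking approach described above."""
    return round(core_mhz * ratio)

print(mem_clock_for(1120))  # 1400, matching the 1120/1400 example
print(mem_clock_for(1000))  # 1250, i.e. stock stays stock
```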


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> You could be right. But even then, the difference is with a small % of error.
> http://www.digitalstormonline.com/unlocked/4-way-quad-crossfire-amd-r9-295x2-benchmarks-at-4k-idnum228/
> 
> They also did a quadfire test, assuming they used the same setup last the trifire bench (they were done 1 month apart) with a M5E and a 4960X.
> They BARELY got a lower score than you.
> 
> There have been various test that no card can fully saturate a PCIE16X slot, whether it's a Titan Z or a 295X2. I really think the differences are between the various different specs/setups everyone has.


Going to have to absolutely agree with you on that. I think I will simply have to hold out; the performance gains just don't warrant the mobo upgrade. Will wait for the next gen of CPUs, then decide.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Going to have to absolutely agree with you on that. I think i will simply have to hold out. the performance gains just dont warrant the mobo upgrade . Will wait for the next gen of cpu's then decide.


Yup. I actually upgraded from an AMD 8350 @ 4.7GHz on a Crosshair V to what I have now.
I was debating getting a new X99 system but settled on Z97 instead. The reviews, power consumption, and overall gaming speed just weren't good enough to warrant the 1.5 grand I'd have to spend on a Rampage V, a 5930K, and new DDR4 modules. My focus was to keep what I have now and use it effectively (mainly my water loop and power supply).

Spent $600 on the Z97 system and couldn't be happier. Massive FPS boost, total system load was 950-1005 watts, and my cooling setup didn't need to be upgraded... SUPER FLIPPIN HAPPY.









Never going to use AMD again UNLESS they pull another Athlon 64 in the future and slaughter Intel again lol. Jim Keller has a lot to clean up if he's gonna pull that off again...


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> You could be right. But even then, the difference is with a small % of error.


This is exactly what I am trying to say!
If you are a casual gamer, or your work at the PC can wait a couple of seconds, then you will not see any difference between x8 and x16!
However, if you are gaming, a PLX chip MIGHT - I am not saying it MUST, just that it MIGHT - cause some microstuttering. In fact, all this stuttering is kind of subjective: if you can live with it, good for you, but if you hate even a little lag, then you are probably better off avoiding a PLX chip. On the other hand, a PLX chip gives you maximum bandwidth for your working applications; still, the benefit is maybe a 2-4% increase, which might not be too important.
Quote:


> Originally Posted by *ljreyl*
> 
> There have been various test that no card can fully saturate a PCIE16X slot, whether it's a Titan Z or a 295X2. I really think the differences are between the various different specs/setups everyone has.


Yeah, it's true; a lot of the time your overall performance depends on some magic happening inside your components, because even people with ABSOLUTELY identical setups get different scores and fps for no particular reason.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Try upping it to +75 and Power Limiter to +50.
> If that passes, go down to the next step (AB goes +/-6 so that would be 69)
> 
> Also, I would drop your memory to 1500 Max. From what i've done when I had a single 290X, memory overclocks are extremely small in return. The best ratio I had when I overclocked my 290X before trifire was 1120/1400 with +81mv/+50 Power limit. I kept a 1/1.25 ratio to core and memory.
> 
> I got that ratio by taking the stock clocks (1000/1250) and dividing them, creating the 1.25 ratio. From what i've seen too, keeping that ratio allowed me to push the same clocks at lower voltages. i read articles and user experiences on ratio overclocking and some have had the same results as me when it comes to stability and voltage.


I did that and I am about to check it out, thanks for the advice!
Why are you recommending moving in steps of 6? I am just curious, because I was moving the slider in steps of 5... Does this make any difference?
Btw, doesn't increasing the voltage to such a high value have any drawbacks? I thought it was bad for the card...


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> This is exactly what i am trying to say!
> If you are a casual gamer or your work at the PC can wait for a couple of seconds, then you will not see any difference between 8x and 16x!
> However, if you are gaming, microstuttering with a PLX chip MIGHT, i am not saying it MUST, i am just saying it MIGHT, cause some microstuttering. In fact, all this stuttering is kind of subjective, if you can leave with it, then good for you, but if you hate even a little amount of lags, than you probably, better to avoid PLX chip, BUT, PLX chip gives you maximum performance for your working applications, still, you would benefit 2, maximum 4%, increase, which might not be to important.
> Yeah, it is true, most of the times your overall performance depends on some magic stuff, happening inside your components, because, even people with ABSOLUTELY equal setups get different scores, fps, whatever for no particular reason.


Not trying to start a war or anything, just having an educated, experience-based discussion









I do agree that there will be benefits to an x16 slot in particular workloads. It's just hard to give a definitive YES or NO when so many factors come into play.


----------



## axiumone

For anyone using these cards in quadfire and Eyefinity: keep your eyes on AMD's website. There should be a new driver version posted today that may solve some of the issues we've been experiencing - Catalyst 14.9.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Yup. I actually upgraded from a AMD 8350 @ 4.7GHz with a Crosshair V to what I have now.
> I was debating on getting the new X99 system but settled with the Z97 instead. The reviews, power consumption and overall speed for gaming just wasn't good enough to warrant the 1.5 grand I'd have to spend on the Rampage V, 5930K, and new DDR4 modules. My focus was to keep what I have now, and use them effectively (mainly my water loop and power supply).
> 
> Spend 600 on the Z97 system and couldn't be happier. Massive FPS boost, Total system load was 950-1005 watts, my cooling setup didn't need to be upgraded... SUPER FLIPPIN HAPPY.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Never going to use AMD again UNLESS they pull an Athlon 64 in the future and slaughter Intel again lol. Jim Keller has a lot to clean up if he's gonna pull that again...


Don't forget that you'll be able to swap your 4790K for, say, a 5770K when it comes out, since Z97 is compatible with Broadwell!


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> I did that and i am about to check it out, thanks for the advice!
> Why are you recommending to go +/-6? I am just curious, because i was moving the slider +/-5... Does this makes any difference?
> Btw, increasing the voltages to such a high value doesn't have any drawbacks whatsoever? I thought it is bad for the card...


Afterburner always rounded up or down to the nearest 6th tick for me. Maybe they changed that. For example, if you put +50, then try to set it to 55 and hit apply, it SHOULD bump to 56 instead. That's what I experienced anyway.
The absolute highest I would increase the voltage is +150mV, but +100mV is safest (in my opinion).

Are you on water or air? Those VRMs get HOT when you go above +75
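Just to put numbers on that tick rounding: a minimal sketch, assuming the regulator steps in 6.25 mV increments (that assumption is mine; it matches the "+55 bumps to 56" behavior described above, but I haven't confirmed it for every card).

```python
# Hypothetical illustration of the slider rounding described above.
# TICK_MV = 6.25 is an assumed regulator step size, not a confirmed spec.
TICK_MV = 6.25

def snap_offset(requested_mv: float) -> float:
    """Round a requested voltage offset (mV) to the nearest regulator tick."""
    return round(requested_mv / TICK_MV) * TICK_MV

print(snap_offset(55))  # 56.25 -> displayed as +56 in the slider
print(snap_offset(50))  # 50.0, already a multiple of 6.25, stays put
```

So any value you type just snaps to the nearest multiple of the tick, which is why +55 shows up as +56.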


----------



## ljreyl

Quote:


> Originally Posted by *axiumone*
> 
> For anyone using these cards in quad-fire and Eyefinity: keep your eyes on AMD's website. There should be a new driver version posted today that may solve some of the issues we've been experiencing. Catalyst 14.9.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Not trying to start a war or anything, just having an educated/experience based discussion
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I do agree that there will be benefits to a 16X slot in particular loads. It's just hard to really say YES or NO when there are so many factors that come into play.


Damn, I am sorry if you thought I was saying something bad and that's why you mentioned starting a fight :C
I was just trying to explain my way of thinking and what I heard...
P.S.: I apologize for my English again :C


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Damn, I am sorry if you thought I was saying something bad and that's why you mentioned starting a fight :C
> I was just trying to explain my way of thinking and what I heard...
> P.S.: I apologize for my English again :C


No No No, just stating I'm not fighting or anything lol. No worries, I love heated discussion. It's fun and educational.


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> For anyone using these cards in quad-fire and Eyefinity: keep your eyes on AMD's website. There should be a new driver version posted today that may solve some of the issues we've been experiencing. Catalyst 14.9.


This is some great news, thank you for sharing it with us!
In fact, I have something to share with you guys as well. I don't know if it is true or not, maybe it's just a joke or a prank, but last night when I was playing BF4, there were a couple of folks from the OCUK platoon (which is a BF4 platoon for any forum member, maybe including admins), and what they said is that AMD will announce their new flagship model next month. Again, I don't know how much truth there is in that statement, so take it with a grain of salt and please don't laugh too hard at me


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> This is exactly what I am trying to say!
> If you are a casual gamer or your work at the PC can wait a couple of seconds, then you will not see any difference between x8 and x16!
> However, if you are gaming, a PLX chip MIGHT, I am not saying it MUST, I am just saying it MIGHT, cause some microstuttering. In fact, all this stuttering is kind of subjective; if you can live with it, then good for you, but if you hate even a little lag, then you are probably better off avoiding a PLX chip. BUT, a PLX chip gives you maximum performance for your working applications; still, you would only see a 2%, maximum 4%, increase, which might not be that important.
> Yeah, it is true, most of the time your overall performance depends on some magic stuff happening inside your components, because even people with ABSOLUTELY identical setups get different scores, fps, whatever, for no particular reason.


Hmm, I think PLX microstutter issues on Gen 3 are a myth, mainly propagated by X79 board owners lol; we are talking about nanoseconds of latency when using a board with a PLX chip


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Afterburner always rounded up or down to the nearest 6th tick for me. Maybe they changed that. For example, if you put +50, then try to set it to 55 and hit apply, it SHOULD bump to 56 instead. That's what I experienced anyway.
> The absolute highest I would increase the voltage is +150mV, but +100mV is safest (in my opinion).
> 
> Are you on water or air? Those VRMs get HOT when you go above +75


Hmm, that's weird, every time I changed my voltages, it just stayed where I put it, and it was all like +5, +10, +15 and so on...
I don't know what the voltages are at stock, but if I put it up to +75, I guess it would be something like 125? Because the voltage slider only goes up to +100 for me...
I still run the stock Asetek coolers, so I guess I am on air? I am planning to get a custom water loop, but it is very hard to get all the needed parts in Russia, even in Moscow.


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> Hmm, I think PLX microstutter issues on Gen 3 are a myth, mainly propagated by X79 board owners lol; we are talking about nanoseconds of latency when using a board with a PLX chip


MSI Afterburner usually shows a latency of 25+ ms for me, sometimes even more.
I will check it out later today.


----------



## doctakedooty

Quote:


> Originally Posted by *Nichismo*
> 
> funny we are suddenly talking about x79 and IVB-E.
> 
> After my impulsive upgrade to x99, I now have a practically brand new ASUS Rampage 4 Black Edition, and a 4930k...
> 
> I could bundle it and sell them for really cheap if anyone is interested.
> dang, really? Have you done any tests or anything to produce some numbers? I find that almost too good to be true, especially since you are running x79 with 40 pci lanes, you could really take advantage in 3 way SLI
> 
> Id kill for 3 780ti's, and I have been considering that as well, due to the huge price drops recently.
> 
> However, im also a little worried as to the power consumption. What kind of PSU are you running?


Both of my systems run a 1300W PSU. Does the 295x2 need a 1300W PSU? No, it doesn't, but I had an extra one, so I thought I would use it instead of buying another PSU. As far as the numbers go, I don't have any; it's just from my own gameplay and experience with the two. I wish I had time to get some real numbers and do actual testing, but my days are packed as it is. The 295x2 runs at x16 PCIe 3.0, while on X79, as you know, running 3-way SLI I only get x16/x8/x16 PCIe 3.0 speeds. To be honest, unlike with AMD, scaling beyond 2 GPUs on the Nvidia side is very poor and minimal.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Hmm, that's weird, every time i was changing my voltages, it just stood where i putted them, and it was all like +5, +10, +15 and so on...
> I don't know what are the voltages at stock, but if i put it up to +75, i guess it would be something like 125? Because the voltage slider can only go up to +100 for me...
> I still run on stock Asetek coolers, so i guess i am on air? I am planning to get custom waterloop, but it is very hard to get all the needed parts in Russia, even in Moscow.


Stock voltage on the 290X was 1.215 volts for me, so I assume the 295 should be about the same.
Adding +75 would make it around 1.275-1.285-ish.
The big thing with staying on air is watching those VRM temps. They can throttle your cards if the voltage is set too high and they get too hot (even while still well below the threshold, which should be 125C).
Also, a +50 power limit SHOULD help eliminate throttling, especially when upping the voltage.
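The offset math above is easy to sanity-check. A quick sketch; note the 1.215 V stock figure is what I measured on a 290X and is only an assumption for the 295x2, not a datasheet value:

```python
# Rough core-voltage math for the offsets discussed above.
# STOCK_V = 1.215 is an assumed stock voltage (measured on a 290X),
# not an official spec for the 295x2.
STOCK_V = 1.215

def core_voltage(offset_mv: float) -> float:
    """Estimated core voltage for a given Afterburner offset in mV."""
    return STOCK_V + offset_mv / 1000.0

print(core_voltage(75))   # ~1.290 V
print(core_voltage(100))  # ~1.315 V
```

Which is why +75 lands you in that 1.28-1.29 V neighborhood, right where VRM temps start to matter on air.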


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Humm i think PLX microstutter issues on GEN 3 is a myth, mainly propagated by x79 board owners lol , we are talking about latency of nano seconds when using a board with a PLX chip


Well, I just don't know what else could explain this "wow" effect I'm getting. I've been using 1150 boards from different vendors, Z87 & Z97, with onboard PLX and without, and they all acted the same. Then I got this X79 board because Axiumone says they have FFXIV working correctly on Win 8.1 (still doesn't work for me, but I'll just use Win7) and I get this whoa... smooth effect. At 120Hz, just moving stuff around on the desktop looks brilliant. I haven't even gotten into playing anything yet.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Stock voltage on the 290X was 1.215 volts for me, so I assume the 295 should be about the same.
> Adding +75 would make it around 1.275-1.285-ish.
> The big thing with staying on air is watching those VRM temps. They can throttle your cards if the voltage is set too high and they get too hot (even while still well below the threshold, which should be 125C).
> Also, a +50 power limit SHOULD help eliminate throttling, especially when upping the voltage.


You know, this is probably why I get throttling or microstuttering or whatever it is I am getting when I am gaming. After 15-20 minutes it is literally impossible to touch the backplates, they get so freaking hot I can't touch them D: I've read somewhere that this is kind of normal, but now I am really concerned... Have you ever tried to touch the oven after cooking a pizza? That kind of hot :O


----------



## axiumone

Quote:


> Originally Posted by *ljreyl*
> 
> Stock voltage on the 290X was 1.215 volts for me, so I assume the 295 should be about the same.
> Adding +75 would make it around 1.275-1.285-ish.
> The big thing with staying on air is watching those VRM temps. They can throttle your cards if the voltage is set too high and they get too hot (even while still well below the threshold, which should be 125C).
> Also, a +50 power limit SHOULD help eliminate throttling, especially when upping the voltage.


I actually have a fix for this incoming as well. AMD in their infinite wisdom decided to limit the VRM fan to a max of 35% and they refuse to give us control over it.

I found a mod that requires you to take off the shroud and change the fan wire, letting you control the VRM fan to 100% from any motherboard fan header. No soldering or anything too complicated required.

Hopefully the guys that are still using the stock cooler will find this useful. I should have the video finished over the next few weeks.


----------



## yifeng3007

Hmm, that's funny, I just checked the global stats for FS Extreme, and somehow my setup is fourth after two 4930Ks and a 5960X, and first among the regular i7-series 4790Ks. lol, I don't understand, there are so many people here with water-cooled builds that would beat the hell out of my rig, why aren't they on the list?


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> I actually have a fix for this incoming as well. AMD in their infinite wisdom decided to limit the VRM fan to a max of 35% and they refuse to give us control over it.
> 
> I found a mod that requires you to take off the shroud and change the fan wire, letting you control the VRM fan to 100% from any motherboard fan header. No soldering or anything too complicated required.
> 
> Hopefully the guys that are still using the stock cooler will find this useful. I should have the video finished over the next few weeks.


Is there any way for me to subscribe to you so I won't miss the vid? Thanks in advance!


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> Well I just don't know what else could explain this "wow" effect I'm getting. I've been using 1150 boards from different vendors, z87 & z97 with onboard Plx and without and they all acted the same. Then I get this x79 board because Axiumone says they have FFxiv working correctly on win 8.1 (still doesn't work for me but I'll just use win7) and I get this woah... Smooth effect. At 120hz just moving stuff around on the desktop looks brilliant. I haven't even gotten into playing anything yet.


lol ok enough Electro stop teasing and get some benchmarks up on here







I need some real-world performance to look at before I abandon this X79 venture. I was just about to pull the trigger on it, then I swayed back to the Z97. C'mon mate...


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> Is there anyway for me to subscribe to you, so i won't miss the vid? Thanks in advance!


I found his video channel on Youtube last night. You should see his setup........... Epic.

@Axiumone
Do you use Google Chrome? I think that's the only thing I didn't remember not to do, I have a habit of immediately setting Chrome as default web browser after installing Windows 8.1 OEM.
Do you install Windows 8.1 from iso? Or do you upgrade from Windows 8.0?
Quote:


> Originally Posted by *Syceo*
> 
> lol ok enough Electro stop teasing and get some benchmarks up on here
> 
> 
> 
> 
> 
> 
> 
> I need some real-world performance to look at before I abandon this X79 venture. I was just about to pull the trigger on it, then I swayed back to the Z97. C'mon mate...


Yeah, fair enough, I've been stalling because I only have the 295x2 in and I'm not actually running Tri-Fire right now; maybe that's why it's smooth? The dang slots are spaced differently on this board than on the M7H board, so my triple rotary fitting isn't going to work by itself for the bridge. Need a 20mm extension I think? I guess I'll try to rig it up with tubing, but I fractured a bone in my hand 2 weeks ago and it really hurts to tube up







I'll be honest with you I was considering selling the 290x because I like this setup so much. But I definitely need to give triFire a go again. The main disadvantage is all the added heat.

Above, about the backplates getting so hot... mine get hot enough to melt plastic.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Stock voltage on the 290X was 1.215volts for me, so I assume that the 295 should be near the same.
> Adding +75 would make is around 1.275-1.285-ish.
> The big thing with staying on air for VRM cooling is to watch for those VRM temps. They could throttle your cards if set too high and they get too hot (even if they are way below the threshold which should be 125C)
> Also, +50 Power Limiter SHOULD help eliminate throttling, especially when upping the voltage.


So, I just ran FS Extreme and I got a score of 15338.
What was weird while running the test:
1. GPU core voltages were uneven; they were jumping around, going from 1.191 V (on some cores) up to 1.297 V (on some cores)... Is that alright? Are they supposed to jump like that?
2. When I got to the combined test, I was getting an average of 40 ms latency! I saw it drop down to 27 ms only 3 times, and those 3 times only lasted moments, and I saw the latency jump up to 60, 70 pretty frequently! I even saw 80+ ms twice! I am absolutely sure this is not normal! I was getting around 30 fps without dropping down to 24, but because of this latency everything was lagging as if I had decided to run Crysis 3 on a calculator...
And the thing is, in the beginning of the test the latency was usually around 8 or so, but sometimes jumping up to around 25, which is when I saw the image kind of freeze for a moment, and it was jumping around like that during that test as well!
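Worth noting: the "latency" these tools report is the per-frame time in milliseconds, so it converts directly to FPS. Quick arithmetic, nothing card-specific assumed:

```python
# Frame time (ms) and FPS are just reciprocals of each other.
def frametime_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def fps_from_frametime(ms: float) -> float:
    """Frame rate implied by a per-frame time in milliseconds."""
    return 1000.0 / ms

print(frametime_ms(30))        # ~33.3 ms per frame at 30 fps
print(fps_from_frametime(40))  # 40 ms frames == 25 fps
```

So a 40 ms average means the card is effectively delivering ~25 fps in those moments, which lines up with why ~30 fps with spiky frame times feels so laggy.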


----------



## axiumone

Quote:


> Originally Posted by *yifeng3007*
> 
> Is there anyway for me to subscribe to you, so i won't miss the vid? Thanks in advance!


Sure. I'm on my phone right now, but this should be the link to my channel..

https://www.youtube.com/playlist?list=UUIH27NuH4jkSHvUczTaHg-Q
Quote:


> Originally Posted by *electro2u*
> 
> I found his video channel on Youtube last night. You should see his setup........... Epic.
> 
> @Axiumone
> Do you use Google Chrome? I think that's the only thing I didn't remember not to do, I have a habit of immediately setting Chrome as default web browser after installing Windows 8.1 OEM.
> Do you install Windows 8.1 from iso? Or do you upgrade from Windows 8.0?


Thanks for the compliments.









I have a Win 8.1 ISO that I use for all of my installations. I use Chrome as default as well. I'm just too used to it, plus it's very convenient to be able to pull up all of your open tabs and bookmarks on any device.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> I found his video channel on Youtube last night. You should see his setup........... Epic.


Oh, yeah, apparently I saw this channel long before I visited the forum, lol. I remember even commenting on it once, asking about something, but I don't remember what anymore








His setup is just wicked sick, it makes my PC cry cold closed-loop liquid :C


----------



## electro2u

Quote:


> Originally Posted by *axiumone*
> 
> Sure. I'm on my phone right now, but this should be the link to my channel..
> 
> https://www.youtube.com/playlist?list=UUIH27NuH4jkSHvUczTaHg-Q
> Thanks for the compliments.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a win 8.1 ISO that I use for all of installations. I use chrome as default as well. I'm just too used to it, plus it's very convenient to be able to pull up all of your open tabs and bookmarks on any device.


Well, I'm not going to bug you about it anymore. I have no idea why my system doesn't want to play nice with FFXIV on Win 8.1, but there is some sort of weird incompatibility. I suppose it could be the VBIOS I'm using combined with installing Win 8.1 in UEFI mode... some small detail I'm missing. I've got Win7 64 up and running and everything just works for FFXIV. For that matter, the DPI scaling is nicer on Win 7, imo. Softer somehow.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> So, I just ran FS Extreme and I got a score of 15338.
> What was weird while running the test:
> 1. GPU core voltages were uneven; they were jumping around, going from 1.191 V (on some cores) up to 1.297 V (on some cores)... Is that alright? Are they supposed to jump like that?
> 2. When I got to the combined test, I was getting an average of 40 ms latency! I saw it drop down to 27 ms only 3 times, and those 3 times only lasted moments, and I saw the latency jump up to 60, 70 pretty frequently! I even saw 80+ ms twice! I am absolutely sure this is not normal! I was getting around 30 fps without dropping down to 24, but because of this latency everything was lagging as if I had decided to run Crysis 3 on a calculator...
> And the thing is, in the beginning of the test the latency was usually around 8 or so, but sometimes jumping up to around 25, which is when I saw the image kind of freeze for a moment, and it was jumping around like that during that test as well!


What driver are you using?
The voltage spikes make sense; they depend on GPU utilization.
Also, go into CCC, turn on CrossFireX even for games with no profile, and also turn on frame pacing. Give that a try.
What CPU do you have again? And what other PCIe slots are in use besides the GPU's?


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> What driver are you using?
> For the voltage spikes, makes sense. It depends on GPU utilization.
> Also, go into CCC and turn on crossfirex even for games with no Profile and also turn on frame pacing. Give that a try.
> What cpu do you have again? And also, what other PCI slots are being used besides the GPU?


Well, I already had CFX on for games with no profiles too, but how do you change frame pacing? Let me google it really quick...
My CPU is an i7-4790K and the mobo is a Gigabyte GA-Z97X-Gaming G1 WiFi BK, and, yes, I have the additional power connector for the PCIe slots connected.
I don't have anything else in the PCIe slots, since the mobo itself already has a quad-core sound processor. And since I don't record videos, I don't have any capture hardware either.
Update: I forgot that I have a Wi-Fi card in a PCIe x1 slot, lol. I will take it out a little later; right now I'll try to run the test with frame pacing on!


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, I already had CFX on for games with no profiles too, but how do you change frame pacing? Let me google it really quick...
> My CPU is an i7-4790K and the mobo is a Gigabyte GA-Z97X-Gaming G1 WiFi BK, and, yes, I have the additional power connector for the PCIe slots connected.
> I don't have anything else in the PCIe slots, since the mobo itself already has a quad-core sound processor. And since I don't record videos, I don't have any capture hardware either.


Does the motherboard have built in wifi? Does that wifi use a PCI lane? If so, try disabling it.
I had to remove my wifi pcie card because it was fudging up my computer. Dual cards at 8x each took up all the lanes and the pcie wifi card was trying to take bandwidth which messed me up.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Does the motherboard have built in wifi? Does that wifi use a PCI lane? If so, try disabling it.
> I had to remove my wifi pcie card because it was fudging up my computer. Dual cards at 8x each took up all the lanes and the pcie wifi card was trying to take bandwidth which messed me up.


Yeah, I guess this is what it's doing right now, lol
So, I ran the test and oh my god, everything was almost butter smooth! Thank you SO MUCH! Even though I scored more than 100 points less, it was so smooth I was actually enjoying it!
Do you think I should enable OpenGL triple buffering?


----------



## electro2u

Heh. You guys are smart!


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Yeah, I guess this is what it's doing right now, lol
> So, I ran the test and oh my god, everything was almost butter smooth! Thank you SO MUCH! Even though I scored more than 100 points less, it was so smooth I was actually enjoying it!
> Do you think I should enable OpenGL triple buffering?


No need, you're not going to use vsync so there's no need for triple buffering.
And no problem. Wish that framepacing was ticked on from the beginning.


----------



## electro2u

So, if we are running 8x/8x on an 1150 board with no PLX chip... we can't use any other PCIE cards at all? I mean... I get it. I feel stupid, but I've always been using a soundcard as well. Would setting a slot to 1x instead of 4x make any difference?

Meh, that's not supposed to be an issue, the M7H I was using has a PCIE 2.0 x16 slot at the bottom which I was using for my soundcard, it shouldn't be limiting the 2 PCIE 3.0 slots to use a soundcard there... should it?


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> No need, you're not going to use vsync so there's no need for triple buffering.
> And no problem. Wish that framepacing was ticked on from the beginning.


So, I removed the WiFi card, turned off frame pacing, set texture filtering quality to performance, voltage to +85, core speed to 1121, memory speed to 1521 and, of course, turned the power limit up to +50%, and I got this - http://www.3dmark.com/fs/2858172 !!! Which got me to number 76 on the global leaderboard, and all thanks to you!!! I love you man, thank you so much!!!


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Heh. You guys are smart!


I can say 100% that everyone in here is smart except for me, lol. If it wasn't for you guys, I wouldn't have gotten all this awesome advice!


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> So, if we are running 8x/8x on an 1150 board with no PLX chip... we can't use any other PCIE cards at all? I mean... I get it. I feel stupid, but I've always been using a soundcard as well. Would setting a slot to 1x instead of 4x make any difference?
> 
> Meh, that's not supposed to be an issue, the M7H I was using has a PCIE 2.0 x16 slot at the bottom which I was using for my soundcard, it shouldn't be limiting the 2 PCIE 3.0 slots to use a soundcard there... should it?


Well, you can use other PCIe cards, but your speeds can significantly decrease depending on how many cards you add. I recommend looking up the PCIe configuration for your motherboard on the official website - http://www.asus.com/Motherboards/MAXIMUS_VII_HERO/
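For anyone weighing the x8-vs-x16 debate, the theoretical ceilings are easy to compute from the published link rates (PCIe 3.0: 8 GT/s per lane with 128b/130b encoding; PCIe 2.0: 5 GT/s with 8b/10b). Real throughput is lower due to protocol overhead, so treat this as back-of-the-envelope math only:

```python
# Approximate one-direction PCIe bandwidth from lane count and generation.
def pcie_gbps(lanes: int, gen: int) -> float:
    """Theoretical bandwidth in GB/s (GT/s * encoding efficiency / 8 bits)."""
    if gen == 3:
        return lanes * 8.0 * (128 / 130) / 8   # 8 GT/s, 128b/130b encoding
    if gen == 2:
        return lanes * 5.0 * (8 / 10) / 8      # 5 GT/s, 8b/10b encoding
    raise ValueError("only gen 2 and 3 sketched here")

print(round(pcie_gbps(16, 3), 2))  # ~15.75 GB/s
print(round(pcie_gbps(8, 3), 2))   # ~7.88 GB/s, half, but rarely saturated
```

Dropping from x16 to x8 halves the ceiling on paper, which is why the argument ends up being about frame times rather than raw fps: games rarely saturate even the x8 link.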


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> I can say 100% that everyone in here is smart except for me, lol. If it wasn't for you guys, I wouldn't have gotten all this awesome advice!


No sir! You *asked the right questions*.









I'm still confused about why everything seems so smooth on an older platform with a slightly slower processor. I have a 4.8GHz OCCT-stable 4790k at low voltage, but it looks like I'm going to have to get rid of it








Quote:


> Originally Posted by *yifeng3007*
> 
> Well, you can use other PCIe cards, but your speeds would significantly decrease, depending on how many cards you add. I recommend looking up for your PCIe configuration for your motherboard on the official website - http://www.asus.com/Motherboards/MAXIMUS_VII_HERO/


Yeah, the 16 PCIe 3.0 lanes come from the processor, but the x4 2.0 lanes come from the Z97 chipset. It appears to me there may be a significant difference between running a 295x2 at x16 and x8, but it won't show up in the framerate; it will show up in the latency or frametime/delay.


----------



## Syceo

Gotta love overclock.net... everyone here is so darn friendly... ok, enough of the niceties, who's going to help me choose a Z97 board? I'm sooooo confused.


----------



## stxe34

new drivers out!


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> So, if we are running 8x/8x on an 1150 board with no PLX chip... we can't use any other PCIE cards at all? I mean... I get it. I feel stupid, but I've always been using a soundcard as well. Would setting a slot to 1x instead of 4x make any difference?


Well, I don't know.
I assume we can't, from experience.
The PLX chips on motherboards are usually for CPUs with 40+ PCIe lanes.
From what I understand, the PLX chip only helps when there are not enough lanes for all the GPUs used.
In my case, the PLX allows the 295x2 to effectively utilize the x8 bandwidth of its slot with the 290x in the other slot. Someone correct me if I'm wrong =)
Quote:


> Originally Posted by *electro2u*
> 
> No sir! You *asked the right questions*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm still confused on why everything seems so smooth on an older platform with a slightly slower processor. I have a 4.8Ghz OCCT stable 4790k at low voltage, but it looks like I'm going to have to get rid of it
> 
> 
> 
> 
> 
> 
> 
> 
> Yah the 16 PCIE 3.0 lanes come from the processor, but the 4x 2.0 lanes come from the z97 chipset. It appears to me there may be a significant difference between running a 295x2 at 16x and 8x but it won't show up in the framerate, it will show up in the latency or frametime/delay.


What's wrong with your setup? It's slower?


----------



## ljreyl

Quote:


> Originally Posted by *stxe34*
> 
> new drivers out!


http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

Added download link


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Gotta love overclock.net... everyone here is so darn friendly... ok enough of the niceties whos going to help me choose a Z97 board, im sooooo confused.


If you're gonna go to the Haswell refresh CPUs, you can just use the Z87 and update the BIOS.
Really, you're clocked at 4.3 GHz. There's no need for extra speed.
I'm at stock clocks, man, 100 MHz above you. There's so little difference that the cost isn't worth it... IMO


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> No sir! You *asked the right questions*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm still confused on why everything seems so smooth on an older platform with a slightly slower processor. I have a 4.8Ghz OCCT stable 4790k at low voltage, but it looks like I'm going to have to get rid of it
> 
> 
> 
> 
> 
> 
> 
> 
> Yah the 16 PCIE 3.0 lanes come from the processor, but the 4x 2.0 lanes come from the z97 chipset. It appears to me there may be a significant difference between running a 295x2 at 16x and 8x but it won't show up in the framerate, it will show up in the latency or frametime/delay.


If I am correct, the whole x16 vs x8 thing is all about bandwidth, so you will most likely just get sliiiiiiiightly less performance, and you're less likely to get more latency.


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> Gotta love overclock.net... everyone here is so darn friendly... ok enough of the niceties whos going to help me choose a Z97 board, im sooooo confused.




I'll just copy-paste what I wrote before, if you don't mind









Have you checked out the Gigabyte GA-Z97X-Gaming G1 Wi-Fi BK? This thing has a 4-core sound processor, swappable op-amps, a PLX chip, 168 hours of factory stability testing under load, tons of fan-control options, water-cooling-ready radiators, DAC-USB ports and more, lol. The best thing about it is that you don't have to think too hard about whether to use x16/x16 (PLX chip = slightly increased performance, as well as latency) or the regular PCIe x8/x8 slots supported by the CPU (almost unnoticeable decrease in performance, but much less latency)

But personally, out of these two I would choose the ASUS Z97-WS.
I already own an MSI Z97 MPower MAX AC, which is, imho, a very, very good motherboard, and I assume the XPower is even better, but I just think MSI's MPower/XPower series motherboards are too much for regular users: they have too many power pin connectors and all these OC'ing functions. I guess if you are not planning to overclock the heck out of your system, I wouldn't recommend this motherboard


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Well, I don't know.
> I assume we can't from experience.
> The PLX chips on motherboards are usually for the CPUs with 40+ PCIE lanes.
> From what I understand, the PLX chip only helps when there are not enough lanes to compensate for all the GPU's used.
> In my case, the PLX allows the 295x2 to utilize the 8x bandwidth of it's slot effectively with the 290x in the other slot. Someone correct me if i'm wrong =)
> What's wrong with your setup? it's slower?


Erm, if you have the Maximus VII Hero, then you don't have a PLX chip...
Your motherboard just splits 16 lanes between both of your cards equally at x8/x8. If you remove one video card and leave one in the PCIe x16 slot, the card you put there will work at x8 speed...


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> If you're gonna go to the Haswell refresh cpus, you can just use the z87 and update the bios.
> Really, you're clocked at 4.3 GHz. There's no need for extra speed.
> I'm at stock clocks man, 100 MHz above you. There's such a low difference, the cost isn't worth it... IMO


I absolutely agree with you; the only reason I would recommend buying a Z97 motherboard now is to get ready for Broadwell. But if you are not planning to buy a Broadwell chip, then Z87 is already good enough, and the performance difference between Z87 and Z97 is so insignificant you can compare it to x8 vs x16 speeds


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Erm, if you have Maximus Hero VII, then you don't have a PLX chip...
> Your motherboard just splits 16 lanes to both of your cards equally at x8 and x8. If you remove one videocard and leave one in the PCIe x16 slot, the card you put there will work at x8 speed...


I was talking about my 295x2 having a PLX chip. Since I'm going above what my motherboard can handle, which is dual GPU, the PLX chip on the 295x2 helps with the extra GPU utilization.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> I was talking about my 295x2 having a PLX chip. Since I'm going above what my motherboard can handle, which is dual gpu, the PLX chip on the 295x2 helps with the extra gpu utilization.


Oh, yes, I am sorry, I misread it :C You are right, it definitely helps a lot!


----------



## yifeng3007

Well, guys, this is just outstanding!
This is what I had with the previous drivers - http://www.3dmark.com/fs/2858172
And this is with the drivers that just got released - http://www.3dmark.com/fs/2858864


----------



## Syceo

almost makes me want to consider another 295x2


----------



## yifeng3007

Well, I just tried to run BF4 with the new drivers, buuut, it still sucks :C
I still get lower fps than people with similar setups. I tried frame pacing, overclocking the cards, resetting overclocks, overvolting and raising the power limit with no other adjustments, and I also disabled ULPS, but nothing seems to work for me :C

In Metro: LL, when I just sit in the menu staring forward, my frame latency jumps like crazy from 11.2 ms up to 22 ms and never settles on either number, it just keeps jumping around...

When I enter the game, this is what happens: for the first 3 seconds everything is veeeeeery smooth, I get around 60 fps and 11+ ms latency, then it drops to 24 fps with 45+ ms latency (!!!), and after some time (half a minute or so) it goes back up to about 60 fps with 11+ ms latency, stays there for another couple of seconds, and drops again... Just what on earth is going on with this?

After watching a non-heavy talking scene, the latency goes from 14+ to 30+ ms and starts jumping around again; the fps are quite consistent, yet I can still see these lags >:'C

A quick update: when I finally got to the shooting, fps usually stayed in the upper 50s, something like 58-59 fps with an average latency of 11 ms. Sometimes the latency jumps up to 20+ ms and then drops back down, but that is almost unnoticeable. So I'd say Metro: LL runs very smoothly at 1080p with everything maxed out, including MSAA.
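For reference, the "latency" numbers these overlays report are frame times, and fps is just their reciprocal. A minimal sketch of that relationship (the helper is hypothetical, not from any tool mentioned in this thread):

```python
def fps_from_frame_time(ms):
    """Frames per second implied by a per-frame render time in milliseconds."""
    return 1000.0 / ms

# A steady ~11 ms frame time implies roughly 90 fps, while the 45 ms spikes
# described above imply drops toward ~22 fps, which is why the spikes feel
# like hard stutter even when the average fps counter still looks decent.
print(round(fps_from_frame_time(11)))  # 91
print(round(fps_from_frame_time(45)))  # 22
```

This is also why a jittery frame time at a "consistent" fps still looks laggy: the average hides the individual slow frames.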


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, I just tried to run BF4 with the new drivers, buuut, it still sucks :C
> I still get lower fps than people with similar setups. I tried frame pacing, overclocking the cards, resetting overclocks, overvolting and raising the power limit with no other adjustments, and I also disabled ULPS, but nothing seems to work for me :C
> 
> In Metro: LL, when I just sit in the menu staring forward, my frame latency jumps like crazy from 11.2 ms up to 22 ms and never settles on either number, it just keeps jumping around...
> 
> When I enter the game, this is what happens: for the first 3 seconds everything is veeeeeery smooth, I get around 60 fps and 11+ ms latency, then it drops to 24 fps with 45+ ms latency (!!!), and after some time (half a minute or so) it goes back up to about 60 fps with 11+ ms latency, stays there for another couple of seconds, and drops again... Just what on earth is going on with this?
> 
> After watching a non-heavy talking scene, the latency goes from 14+ to 30+ ms and starts jumping around again; the fps are quite consistent, yet I can still see these lags >:'C


For Battlefield 4, don't use mantle. Also, make sure crossfire x is enabled.

For Metro, turn on frame pacing, crossfire x, etc.

Also, do you have any major services running in the background? Anti virus, etc?


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> For Battlefield 4, don't use mantle. Also, make sure crossfire x is enabled.
> 
> For Metro, turn on frame pacing, crossfire x, etc.
> 
> Also, do you have any major services running in the background? Anti virus, etc?


I always use Direct3D for BF4, and CFX is always enabled. I had frame pacing on, hoping it would smooth out my fps, but it did nothing...
For Metro: LL I actually had frame pacing off just now, and when I got into the actual gameplay (not the talking part), the fps somehow got veeery smooth; even the latency was quite good. I also had CFX on at that moment.

The only thing that always runs in the background, and it uses very little resources, is Corsair Link. No other application made any difference.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> I always use Direct3D for BF4, and CFX is always enabled. I had frame pacing on, hoping it would smooth out my fps, but it did nothing...
> For Metro: LL I actually had frame pacing off just now, and when I got into the actual gameplay (not the talking part), the fps somehow got veeery smooth; even the latency was quite good. I also had CFX on at that moment.
> 
> The only thing that always runs in the background, and it uses very little resources, is Corsair Link. No other application made any difference.


Weird... I have nearly the same settings as you. Could be a quadfire issue.


----------



## ljreyl

So, I took off my CPU block, cleaned both the CPU and the block, reapplied Gelid GC Extreme 3, reseated it, and BOOM, TEMPS ARE DOWN 10C. Full load is 55C or less on all cores at stock clocks and voltages (AIDA64)

I.
Am.
FLIPPING.
HAPPY.

Looks like there's no need to delid anymore


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Weird... I have nearly the same settings as you. Could be a quadfire issue.


what results are you achieving in firestrike?


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> what results are you achieving in firestrike?


http://www.3dmark.com/3dm/4206615?

Just ran it right now with the old beta driver.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> http://www.3dmark.com/3dm/4206615?
> 
> Just ran it right now with the old beta driver.


ok nice


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> ok nice


What do you get?


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> What do you get?


at stock im getting

http://www.3dmark.com/fs/2859510


----------



## xer0h0ur

My firestrike score went down 347 points and it shows as not being an approved driver. Seems legit. God AMD is pathetic with drivers.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> My firestrike score went down 347 points and it shows as not being an approved driver. Seems legit. God AMD is pathetic with drivers.


lol tell me about it...... was pretty close to going over to 980's in sli about an hour ago


----------



## xer0h0ur

I am ballin on a budget so to speak so I can't afford to jump from graphics card to graphics card. I made my bed so I sleep in it. I love the hardware but wish they would get their act together with driver releases.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> So, I took off my cpu block, cleaned both the cpu and block, reapply Gelid GC Extreme 3, reseated, and BOOM TEMPS ARE DOWN 10C. Full load is 55C or less on all cores on stock clocks and voltages (Aida64)
> 
> I.
> Am.
> FLIPPING.
> HAPPY.
> 
> Looks like there's no need to delid anymore


Great, congrats! Do you know why you were overheating in the first place?

As for me, everything is not that great: I jump into the BF4 test range with CFX and frame pacing enabled. At the beginning I get veeeery smooth framerates, somewhere around 140 fps (when I just stare in front of me, fps jumps between 136 and 141, which is not a big deal, you can barely see it) and latency from 6.x to 11.x ms, mostly staying at about 6.6 ms (which causes the little fps drops that are hard to notice). But then, after 2-3 minutes, everything goes down to hell: fps drops to 70-80+ and latency starts jumping around from 8.x to 22.x ms, and this time it doesn't stick to any average, it literally jumps around... When this starts, it is impossible to get any gameplay going, it is just sooo laggy... I will probably take a video of it later this evening, so you guys can see what I'm talking about


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> Great, congrats! Do you know why you were overheating in the first place?
> 
> As for me, everything is not that great: I jump into the BF4 test range with CFX and frame pacing enabled. At the beginning I get veeeery smooth framerates, somewhere around 140 fps (when I just stare in front of me, fps jumps between 136 and 141, which is not a big deal, you can barely see it) and latency from 6.x to 11.x ms, mostly staying at about 6.6 ms (which causes the little fps drops that are hard to notice). But then, after 2-3 minutes, everything goes down to hell: fps drops to 70-80+ and latency starts jumping around from 8.x to 22.x ms, and this time it doesn't stick to any average, it literally jumps around... When this starts, it is impossible to get any gameplay going, it is just sooo laggy... I will probably take a video of it later this evening, so you guys can see what I'm talking about


Gonna pick up battlefield 4 so I can have some sort of frame of reference.

I'm still thinking there's overheating going on, but I remembered you saying your GPU temps were OK?


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Gonna pick up battlefield 4 so I can have some sort of frame of reference.
> 
> I'm still thinking there's overheating going on, but I remembered you saying your GPU temps were OK?


That would be great! Finally I would have some real-life comparison, thank you! Well, I am running the stock cooler, and my temps are OK; according to GPU-Z and MSI Afterburner, the temps never exceeded 70C, even when I overclocked the cards a little bit...


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> That would be great! Finally I would have some real-life comparison, thank you! Well, I am running the stock cooler, and my temps are OK; according to GPU-Z and MSI Afterburner, the temps never exceeded 70C, even when I overclocked the cards a little bit...


And you get through Firestrike Extreme OK. Try changing the Windows Power Options plan to High Performance.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> And you get through Firestrike Extreme OK. Try changing the Windows Power Options plan to High Performance.


Thanks for the advice, but i already did that as soon as i got this build up and running


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> My firestrike score went down 347 points and it shows as not being an approved driver. Seems legit. God AMD is pathetic with drivers.


I got a 600-700 point boost to my Graphics score with 14.9, and the reason it isn't approved yet is that Futuremark needs to review it first


----------



## xer0h0ur

My mistake then. Let's see how long it takes for 14.9 to get approved.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> My mistake then. Lets see how long it takes for the 14.9 to get approved.


Usually only takes a couple of days to be approved


----------



## yifeng3007

Quote:


> Originally Posted by *xer0h0ur*
> 
> My mistake then. Lets see how long it takes for the 14.9 to get approved.


Yeah, you just need to wait a little bit. My score went from 15666 (http://www.3dmark.com/fs/2858172) up to 16002 (http://www.3dmark.com/fs/2858864)


----------



## Nichismo

Well guys, there's actually 1 Diamond 295x2 left at my local Fry's for $1,049. I'm extremely tempted to take the plunge and snatch it while I can, bringing a Newegg $999 price match with me as well....

If I do grab it, I'm going to immediately order an EK nickel/plexi waterblock and backplate for it.

I could easily get 2x 780 Tis with that money though.

Decisions, decisions.

I'm also going to need a new power supply for sure.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Yeah, you just need to wait a little bit. My score went from 15666 (http://www.3dmark.com/fs/2858172) up to 16002 (http://www.3dmark.com/fs/2858864)


it already got approved.


----------



## electro2u

Quote:


> Originally Posted by *Nichismo*
> 
> well guys, theres actually 1 Diamond 295x2 left at my local Frys, for 1,049$, Im extremely tempted to take the plunge and snatch it while I can, and bringing a Newegg 999$ price match with me as well....
> 
> If I do grab it, im going to immediately order an EK Nickel Plexi waterblock and backplate for it.
> 
> I could easily get 2x 780ti s with that money though.
> 
> Decisions decisions.
> 
> Im also going to need a new power supply for sure.


It's a question of space, really.

I wanted to do trifire on an 1150 board and have plenty of room for a soundcard, so I thought it made sense. I'm firmly in the AMD camp right now because of temporal dithering in Windows, which Nvidia does not do (I may be one of about 2 people in the world that care about that sort of thing). If space weren't an issue (or I didn't already have one of these) I would go 2x290x instead.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> at stock im getting
> 
> http://www.3dmark.com/fs/2859510


So, you have your D5 pump at setting 2?

I'm running mine at 5 and if i lower it, temps rocket up.


----------



## Nichismo

Quote:


> Originally Posted by *electro2u*
> 
> It's a question of space, really.
> 
> I wanted to do trifire on an 1150 board and have plenty of room for a soundcard, so I thought it made sense. I'm firmly in the AMD camp right now because of temporal dithering in Windows, which Nvidia does not do (I may be one of about 2 people in the world that care about that sort of thing). If space weren't an issue (or I didn't already have one of these) I would go 2x290x instead.


Interesting, and I have no idea what temporal dithering is.... But hey, if I did, maybe I would care?!

lol, in all seriousness, I just love the look of multiple cards equipped with full-cover blocks and backplates. My two GTX 770s are still plenty right now, but I'm starting to feel like there's a good chunk to be desired in relation to the rest of my hardware, and I also feel like I need to take advantage of all the recent price drops.

I'm pretty interested in the idea of driving to my local electronics store and being able to pick up what's the fastest GPU in the world... an even grand sounds like a really good price. I'd love to get home and pop open that briefcase....


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> So, you have your D5 pump at setting 2?
> 
> I'm running mine at 5 and if i lower it, temps rocket up.


Hmm, yeah I run a single D5 at setting 3 right now and could easily get away with 2. Do you have a flow meter? Mine runs about 130lph at 2 and 170 at 3, and my meter is at the far end of my loop, the return line to my bay/res+pump housing.


----------



## electro2u

Quote:


> Originally Posted by *Nichismo*
> 
> Interesting, and I have no idea what temporal dithering is.... But hey, if I did, maybe I would care?!
> 
> lol, in all seriousness, I just love the look of multiple cards equipped with full-cover blocks and backplates. My two GTX 770s are still plenty right now, but I'm starting to feel like there's a good chunk to be desired in relation to the rest of my hardware, and I also feel like I need to take advantage of all the recent price drops.
> 
> I'm pretty interested in the idea of driving to my local electronics store and being able to pick up what's the fastest GPU in the world... an even grand sounds like a really good price. I'd love to get home and pop open that briefcase....


It's certainly the fastest single card, but the GPUs themselves are actually not as fast as single 290Xs. Because of the onboard PLX chip (internal crossfire bridge) 2x290s will beat a 295x2 every time. That's why 2x295x2 is not at the top of the charts, 4x290s is (well, actually the Nvidia cards win lol).

Temporal dithering is for profiled displays; it keeps color banding at bay. On Windows, Nvidia doesn't have this feature. It doesn't matter if you don't use ICC profiles (you won't get color banding anyway), but it basically looks like:


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Hmm, yeah I run a single D5 at setting 3 right now and could easily get away with 2. Do you have a flow meter? Mine runs about 130lph at 2 and 170 at 3, and my meter is at the far end of my loop, the return line to my bay/res+pump housing.


I do not have a flow meter. What's your setup like?

Mine is XRes > 360mm radiator > EK Supremacy > 120mm radiator > 295x2 > 290x > XRes

Also, I couldn't find any information on pressure drop for a dual-GPU block like the 295x2's. They say a single-GPU block is about a 0.9 psi drop, so I wonder if my 295x2 block is 0.9 or 1.8.
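For what it's worth, restrictions plumbed one after another simply add their pressure drops. A rough sketch, under the purely hypothetical assumption that the two GPU sections of a dual-GPU block are in series and each restricts about as much as a single-GPU block:

```python
# Hypothetical: each GPU section of the dual-GPU block drops ~0.9 psi,
# the same as a typical single-GPU block.
SINGLE_BLOCK_DROP_PSI = 0.9

def series_pressure_drop(drops_psi):
    """Total pressure drop of restrictions plumbed in series (drops add)."""
    return sum(drops_psi)

# Two sections in series would total 1.8 psi. If the block routes the two
# sections in parallel internally instead, the total drop stays much closer
# to that of a single block, since each branch only sees half the flow.
print(series_pressure_drop([SINGLE_BLOCK_DROP_PSI] * 2))
```

So the 0.9-vs-1.8 question really comes down to how the block routes coolant internally, which the manufacturer's specs would have to confirm.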


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> I do not have a flow meter. What's your setup like?
> 
> Mine is XRes > 360mm radiator > EK Supremacy > 120mm radiator > 295x2 > 290x > XRes
> 
> Also, I couldn't find any information on pressure drop for a dual-GPU block like the 295x2's. They say a single-GPU block is about a 0.9 psi drop, so I wonder if my 295x2 block is 0.9 or 1.8.


Res>XSPC CPU block>360mm rad>295x2>290x>280mm rad>AC Flow meter block>Res


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Because of the onboard PLX chip (internal crossfire bridge) 2x290s will beat a 295x2 every time. That's why 2x295x2 is not at the top of the charts, 4x290s is (well, actually the Nvidia cards win lol).


I am sorry, but I will have to strongly disagree with you on this one: the PLX chip on the video card IS NOT a CrossFire bridge. There is a PLX chip in EVERY dual-GPU video card, because what it basically does is feed both GPUs at x16 speed; otherwise, the x16 PCIe slot the card is connected to would split its 16 lanes into x8 for each GPU.
And then, how would any other R7 or R9 series video card work in CFX without a PLX chip?

Here is an explanation of CFX in the new AMD video cards: http://www.fudzilla.com/home/item/32674-lack-of-crossfire-connector-on-r9-290x-explained
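A toy sketch of the lane argument above (the helper and numbers are illustrative only; a PCIe switch presents its full upstream link to each downstream device, while plain slot bifurcation divides the lanes):

```python
def lanes_per_gpu(slot_lanes, gpu_count, onboard_switch):
    """Link width each GPU negotiates: behind a PCIe switch (e.g. a PLX chip)
    every GPU sees the full slot width, though they still share the one
    upstream link's bandwidth; a bifurcated slot divides its lanes instead."""
    return slot_lanes if onboard_switch else slot_lanes // gpu_count

# Dual-GPU card with an onboard PLX switch in an x16 slot:
assert lanes_per_gpu(16, 2, onboard_switch=True) == 16
# Two separate cards splitting a CPU's 16 lanes between two slots:
assert lanes_per_gpu(16, 2, onboard_switch=False) == 8
```

The switch doesn't create bandwidth out of thin air; it just lets each GPU negotiate a full-width link and arbitrates traffic over the shared upstream connection.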


----------



## electro2u

Whatever you want to call the PLX chip is fine. The point was a 295x2 is slower than 2x290x's because of it.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Whatever you want to call the PLX chip is fine. The point was a 295x2 is slower than 2x290x's because of it.


Actually, I would have to disagree with you on this one as well







You see, because of the PLX chip, the 295x2 will get more latency than two 290x's without a PLX chip at x8, or sometimes even more than two 290x's behind a PLX chip.
It looks something like this:
1. 295x2 = two 290x behind a PLX chip at x16
2. 295x2 < two 290x without a PLX chip at x16 (which requires a CPU that supports at least 32 PCIe lanes)
3. 295x2 > two 290x without a PLX chip at x8 (the 295x2 will get slightly more fps, 5 fps more at most, but also more latency; two 290x's will run slightly slower but with less latency, which means less actual lag)
But there is one more thing to consider: the 295x2 has a higher clock rate than a reference 290x, but if you choose non-reference 290x's that run at higher clocks, they will outperform the 295x2 in the first case.


----------



## electro2u

What would you need a PLX chip for with 2 290x's?

The 295x2 doesn't (necessarily) have a higher clock rate than a 290x, can't OC as high, and has other issues the 290x doesn't. Putting 2 GPUs on 1 PCB requires some compromises.

Show me some benches to support your arguments, since you want to strongly disagree. A PLX chip is, by definition, a bridge. Guess who makes CrossFire bridges? A company called PLX. Getting rid of the external bridges improved scaling; getting rid of the PLX bridge improves latency. You won't find the 295x2 breaking any records.


----------



## ljreyl




----------



## electro2u

Oh jeez...............

Well the Final Fantasy XIV/win8.1 saga is over.

I just installed Win 8.1 again and decided to do everything the opposite of the way I normally do.

So I let MS use my picture/name for advertising (usually I disallow it) and signed up for a Microsoft account, and the problem was solved...

It's something about Windows Explorer security settings (Windows uses Explorer to emulate 32-bit games on a 64-bit OS, apparently; it's called Windows on Windows, or WOW64).

I'm pretty sure it was one of these settings:


that, or signing up for the MS account/email.

Can't believe I worked on that for so long. Biggest fail of these 272 pages.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Oh jeez...............
> 
> Well the Final Fantasy XIV/win8.1 saga is over.
> 
> I just installed win 8.1 again and decided to do everything opposite of the way I normally do.
> 
> So I let MS use my picture/name for advertising (usually I disallow it) and I sign up for a Microsoft account, problem solved...
> 
> It's something about Windows Explorer security settings (Windows uses Explorer to emulate 32 bit games in a 64bit OS, apparently--its called Windows on Windows: or WOW64).


How long did it take for you to finally get the game to work?


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Res>XSPC CPU block>360mm rad>295x2>290x>280mm rad>AC Flow meter block>Res


You have more overall cooling than I do, but... I still don't understand why my CPU temps go up to 80C at stock settings... I feel like that's too high for stock...
My GPUs stay under 62C during heavy load. That's with GC Extreme on each GPU and Fujipoly Ultra on all the VRMs, a LOT of heat transfer, yet they still stay super cool, and they are cooled AFTER the CPU.
I really feel like the 4790K temps should be lower at completely stock settings. I even did a re-seat, and temps seemed better, but it was from a cold boot, so it wasn't accurate.


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> How long did it take for you to finally get the game to work?


Uh... {long time}









I got the game as a gift with my 4770k more than a year ago. Oh... but I didn't actually notice the issue until I got my 295x2 in June iirc. Feelin' pretty dumb. At least I fixed it myself.
Quote:


> Originally Posted by *ljreyl*
> 
> You have more overall cooling than I do, but... I still don't understand why my CPU temps go up to 80C at stock settings... I feel like that's too high for stock...
> My GPUs stay under 62C during heavy load. That's with GC Extreme on each GPU and Fujipoly Ultra on all the VRMs, a LOT of heat transfer, yet they still stay super cool, and they are cooled AFTER the CPU.
> I really feel like the 4790K temps should be lower at completely stock settings. I even did a re-seat, and temps seemed better, but it was from a cold boot, so it wasn't accurate.


80C stock 4790k?

Well... they are really badly designed from a thermal standpoint. Intel wanted to give people another reason to go x79/x99. And it's a good one.

I would suggest delidding, but I know that's not what most people want to hear.

It's also possible your CPU block is full of schmutz. I only had my loop running for a couple of weeks, and when I decided to crack my XSPC Raystorm open while making some adjustments to the loop, it was full of threading debris, loose acetal, and crud that may have come from my radiators. The GPU blocks weren't nearly as bad.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> What would you need a Plx chip for 2 290x's for?
> 
> 295x2 doesn't have a higher clock rate than 290x (necessarily), can't oc as high and has other issues that 290x doesn't. Putting 2 GPUs on 1 PCB requires some compromises.
> 
> Show me some benches to support your arguments since you want to strongly disagree. A Plx chip is, by definition, a bridge. Guess who makes crossfire bridges? Company called PLX. Getting rid of them improved scaling, getting rid of the Plx bridge improves latency. You won't find 295x2 breaking any records.


I don't know exactly who would need this PLX chip or why, but there are people who might.

I agree with the second paragraph.

All I am trying to say is that there are a lot of different things to consider. Yes, the 295x2 won't break any records for sure, but you don't need to pit two 290x's against a 295x2 if you just want raw gaming performance. In fact, the 295x2 is probably a better choice for gaming, simply because it doesn't heat up as much as two 290x's on air most likely will; it has slightly (so slightly you might not notice) more fps and more latency. Two 290x's at x8/x8 will have next to no added latency, but their fps will decrease (again, so slightly you won't even notice). So why take up two PCIe slots, if you can put in one 295x2 now, get very, very similar performance, and after some time maybe add another 295x2!


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> You have more overall cooling than I do but... I still don't understand why my temps go up to 80c for my cpu though at stock settings... I feel like that's too high for stock...
> My GPU's stay under 62C during heavy load. That's with GC Extreme on each GPU and Fujipoly Ultra on all VRMs, a LOT of heat ransfer yet, they still stay super cool and they are cooled AFTER the CPU.
> I really feel like the 4790k temps should be lower for completely stock settings. I even did a re-seat, temps seemed better, but it was from a cold boot so it wasn't accurate.


Wow, if you are just gaming, those temps are nuts D: My 4790K only heats up to 50C max (I will have to double-check that later), and I am just running it with an H80i!


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Wow, if you are just gaming, that temps are nuts D: My 4790k only heats up to 50C max, but i will have to check that later, and i am just running it with H80i!


When I play Crysis 3, I get up to 80C. Then I read how people with an H80i get 15C+ lower temps than me. I understand that the H80i isolates the CPU from the rest of the heat, but it still doesn't make sense. I'm starting to think the TIM on the CPU has an air bubble, since the core variance is usually 5C or more.
Thinking of just delidding today with GC Extreme (I don't want to use CLU/CLP, too afraid of messing it up)


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> When I play Crysis 3, I get up to 80C. Then I read how people with an H80i get 15C+ lower temps than me. I understand that the H80i isolates the CPU from the rest of the heat, but it still doesn't make sense. I'm starting to think the TIM on the CPU has an air bubble, since the core variance is usually 5C or more.
> Thinking of just delidding today with GC Extreme (I don't want to use CLU/CLP, too afraid of messing it up)


I don't know why, but I am really afraid to delid anything, and I wouldn't recommend it to anyone unless they know what they are doing.
What I would recommend is running RealTemp, downloading Prime95 as well, and starting Prime95 from RealTemp. When the test is over, it will give you an idea of your temps under different stages of CPU load.


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> All I am trying to say is that there are a lot of different things to consider. Yes, the 295x2 won't break any records for sure, but you don't need to pit two 290x's against a 295x2 if you just want raw gaming performance. In fact, the 295x2 is probably a better choice for gaming, simply because it doesn't heat up as much as two 290x's on air most likely will; it has slightly (so slightly you might not notice) more fps and more latency. Two 290x's at x8/x8 will have next to no added latency, but their fps will decrease (again, so slightly you won't even notice). So why take up two PCIe slots, if you can put in one 295x2 now, get very, very similar performance, and after some time maybe add another 295x2!


This is just... no. If you are trying to say there isn't much difference between 2-way 290x CrossFire and a 295x2, let's take a quick look, apples to apples:

Top Firestrike Extreme score by 2-way 290x is about 11,500
My best Firestrike Extreme score with a 295x2 on a Rampage IV Black is 9,200
This review shows a 295x2 scoring 8,500 in the same benchmark

Huge difference. There is no 295x2 anywhere on the charts, not because people aren't running the benches, but because the PLX chip and the cooling solution aren't helping the 290x GPUs on the 295x2.

290x crossfire curbstomps a 295x2.
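To put that gap in percentage terms, using only the scores quoted above:

```python
def percent_ahead(a, b):
    """How far score a is ahead of score b, in percent."""
    return (a - b) / b * 100.0

print(round(percent_ahead(11500, 9200)))  # 25: top 2-way 290x vs. this 295x2
print(round(percent_ahead(11500, 8500)))  # 35: vs. the reviewed 295x2
```

A 25-35% lead in the same benchmark is well beyond run-to-run variance, which supports the "not just a clocking difference" argument.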


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> I don't know why, but i am really afraid to delid anything and i wouldn't recommend doing it to anyone, unless they know what they are doing.
> What i would recommend is to run RealTemp, download Prime95 as well and start Prime95 in RealTemp. When the test is over, it will give the idea about your temps under different stages of CPU load.


No way. If Crysis 3 makes my temps hit 80C while gaming, which is what I primarily do anyway, there's no way in hell I will run Prime95 and possibly fry my stuff.
I used it in the past when I overclocked my 8350, and it's just not realistic as a tester. AIDA64, IETU, and heavy gaming workloads that stress both the CPU and GPU are what I use now, and they're how I validate my overclocks, since that is my focus for performance anyway.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> This is just... no. If you are trying to say there isn't much difference between 2-way 290x CrossFire and a 295x2, let's take a quick look, apples to apples:
> 
> Top Firestrike Extreme score by 2-way 290x is about 11,500
> My best Firestrike Extreme score with a 295x2 on a Rampage IV Black is 9,200
> This review shows a 295x2 scoring 8,500 in the same benchmark
> 
> Huge difference. There is no 295x2 anywhere on the charts, not because people aren't running the benches, but because the PLX chip and the cooling solution aren't helping the 290x GPUs on the 295x2.
> 
> 290x crossfire curbstomps a 295x2.


Well, you never know how those people got that score: did they use stock fans, water cooling, maybe they were even using fio, lol, you never know.
But, yeah, I see your point. For overclocking and getting maximum performance, it's certainly two 290x's. BUT how many people really push their systems to the limit? Most likely, people will just buy the cards, maybe do some overclocking, and that's it. And in reality they perform very similarly.

Let me just check the score I get with one card running, I am very curious


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> No way. If Crysis 3 makes my temps hit 80 while gaming, which is what I primarily do anyway, no way in hell will I run prime95 and possibly fry my stuff.
> I used it in the past when I overclocked my 8350 and it's just not realistic to use as a tester. AIDA64, IETU, and heavy gaming workloads that stress both the CPU and GPU is what I do now, and it's how I base my overclocks since that is my focus for performance anyway.


Well, you seem to be much more experienced in this than I am, so you know better, lol.
I just ran Crysis 3 with everything maxed out and 8x MSAA at 1080p, with my 4790K overclocked to 4.6GHz.
I tested the beginning of the first mission for two or three minutes, after the main character gets the handgun, simply running around that dock, and my CPU temps never exceeded 62C; in fact, they usually sat around 57-58 degrees Celsius.
I was monitoring these stats with MSI Afterburner.


----------



## xer0h0ur

I have noted that Afterburner and Aida64 never agree on CPU temps.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have noted that Afterburner and Aida64 never agree on CPU temps.


True Story. I use HWINFO64 and AI Suite 3 to monitor real core voltages and temps.


----------



## yifeng3007

Well, here is what i got in FS Extreme with just one R9 295x2 running:
1. 4790K overclocked to 4.6GHz, 295x2 @ stock - http://www.3dmark.com/fs/2868460 - score 8710
2. 4790K overclocked to 4.6GHz, 295x2 overclocked - http://www.3dmark.com/fs/2868484 - score 9447
And, yes, it just proves that two R9 290x's overclock better, but I've never argued with that.
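For reference, the overclock in those two runs is worth roughly 8.5%; a quick Python sanity check using the scores from the links above:

```python
# FS Extreme scores from the two runs linked above
stock_score = 8710   # 295x2 at stock
oc_score = 9447      # 295x2 overclocked

gain = (oc_score - stock_score) / stock_score
print(f"overclock gain: {gain:.1%}")  # about 8.5%
```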


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> True Story. I use HWINFO64 and AI Suite 3 to monitor real core voltages and temps.


If you want, i can run the same "test" i did in Crysis 3 with HWINFO64?


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> If you want, i can run the same "test" i did in Crysis 3 with HWINFO64?


No, there's no need. Thanks though. I'm about to go buy another 4790k to see if mine is a dud.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> No, there's no need. Thanks though. I'm about to go buy another 4790k to see if mine is a dud.


Well, I did it anyway, and it showed me something very similar to what Afterburner did, but much more informative and accurate.
Overall, the average temp never exceeded 65C and never went below 62C.

Before buying another 4790K, don't you want to reattach the CPU cooler, change the thermal paste, maybe tighten or loosen the screws a little bit? In fact, I was getting very high temps under load a couple of weeks ago, but after I loosened the screws on my H80i a little bit, my temps went down like 20C!

But, as i said, you seem to be much more confident in this type of thing, so if you think changing the CPU is better for you, there is no one to stop you, lol


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> No, there's no need. Thanks though. I'm about to go buy another 4790k to see if mine is a dud.


you should take the lid off it first and get some Coollaboratory Liquid Pro on the CPU die before you purchase another one, then you will know for sure if you have a dud chip


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> Oh jeez...............
> 
> Well the Final Fantasy XIV/win8.1 saga is over.
> 
> I just installed win 8.1 again and decided to do everything opposite of the way I normally do.
> 
> So I let MS use my picture/name for advertising (usually I disallow it) and I sign up for a Microsoft account, problem solved...
> 
> It's something about Windows Explorer security settings (Windows uses Explorer to emulate 32 bit games in a 64bit OS, apparently--its called Windows on Windows: or WOW64).
> 
> I'm pretty sure it was one of these settings:
> 
> 
> that or signing up for MS account/email.
> 
> Cant believe I worked on that for so long. Biggest fail of these 272 pages.


lol epic fail... but glad you finally sorted it out, unbelievable that something as insignificant as an MS profile would have such a knock-on effect


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, here is what i got in FS Extreme with just one R9 295x2 running:
> 1. 4790K overclocked to 4.6GHz, 295x2 @ stock - http://www.3dmark.com/fs/2868460 - score 8710
> 2. 4790K overclocked to 4.6GHz, 295x2 overclocked - http://www.3dmark.com/fs/2868484 - score 9447
> And, yes, it just proves that two R9 290x's overclock better, but I've never argued with that.


2000points higher in Firestrike extreme is not due to simply better overclock.

If you go research this, you'll see 290x crossfire is simply more powerful than a single 295x2, whether they are both bone stock or both overclocked balls to the wall

I have a 290x, so I know what it can do. Since they scale nearly perfectly you can just x2 a 290x scores to get a ballpark idea of crossfire performance.
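To make that ballpark explicit, here's a tiny sketch; the 0.95 efficiency figure is just an assumption for illustration, not a measured scaling number:

```python
def estimate_crossfire(single_gpu_score, efficiency=0.95):
    """Ballpark a two-GPU crossfire score from a single card's score.

    `efficiency` is an assumed scaling factor for the second GPU
    (crossfire scaling is near-perfect but rarely a full 2x).
    """
    return single_gpu_score * (1 + efficiency)

# hypothetical single-290x graphics score
print(estimate_crossfire(10000))  # ~19500
```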
Quote:


> Originally Posted by *Syceo*
> 
> lol epic fail... but glad you finally sorted it out, unbelievable that something as insignificant as an MS profile would have such a knock-on effect


Thanks mate! It really is pretty epic, but I did offer Hardforum.com a $64 reward for a fix and they went on with me for 6 very long pages and no one even touched this kind of solution. There's a guy around here somewhere that had the same exact problem with SLI.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> No, there's no need. Thanks though. I'm about to go buy another 4790k to see if mine is a dud.


oh and 1 more thing, as dumb as it may sound ... have you checked your rads for dust build up (especially if they are exhausting with the fans pulling air through them) just a suggestion before you pull the trigger on a new chip


----------



## xer0h0ur

What syceo says seems far less drastic than replacing the processor.


----------



## ljreyl

So, I drained my loop, took apart my CPU block, changed the jetplate (I forgot to change the plate when I moved from my AMD system), cleaned off the gunk on the copper block on the inside, put it back together, added thermal paste, reseated, and temps went down 3C.
After a lot of thinking, I came to the conclusion that it's my gpus that are causing my CPU to not cool as much.
when I run AIDA64 with the gpus idle, my temps hit 62 max on all cores. I'm ok with that. When playing an intensive game, I hit up to 80.
I then thought about what I did differently, which was adding fujipoly ultra to ALL VRMs on my gpu. Basically, I'm dumping in more heat from the gpus and it's affecting my components. If I used a lesser thermal pad for the VRMs, I would have gotten higher temps but the other components would be slightly cooler.
I'm not gonna get a new cpu. I believe that my thinking makes sense and it's reasonable.
And this CPU is great too. I'm at 4GHZ using only .975 volts lol. Amazing.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> 2000points higher in Firestrike extreme is not due to simply better overclock.
> 
> If you go research this, you'll see 290x crossfire is simply more powerful than a single 295x2, whether they are both bone stock or both overclocked balls to the wall
> 
> I have a 290x, so I know what it can do. Since they scale nearly perfectly you can just x2 a 290x scores to get a ballpark idea of crossfire performance.


Well, i did research on R9 295x2 and R9 290x performance comparison, and here is what i've found:
http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,17.html
http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799.html
http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review
http://www.hardocp.com/article/2014/04/08/amd_radeon_r9_295x2_video_card_review/1#.VCupKPmSwjw
According to these websites, these videocards perform really, really similarly; there is almost no difference. Sure, sometimes the 290x does better, sometimes the 295x2 does better, but we are talking mostly about a 2 fps difference.
I am almost convinced that two 290x's in CFX perform better, judging by the FS Extreme scores, and it really makes sense that two separate videocards would perform better than two GPUs on one PCB (better power delivery, better heat dissipation, maybe something else I don't know of), but I just can't see it in the gaming performance...


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> So, I drained my loop, took apart my CPU block, changed the jetplate (I forgot to change the plate when I moved from my AMD system), cleaned off the gunk on the copper block on the inside, put it back together, added thermal paste, reseated, and temps went down 3C.
> After a lot of thinking, I came to the conclusion that it's my gpus that are causing my CPU to not cool as much.
> when I run AIDA64 with the gpus idle, my temps hit 62 max on all cores. I'm ok with that. When playing an intensive game, I hit up to 80.
> I then thought about what I did differently, which was adding fujipoly ultra to ALL VRMs on my gpu. Basically, I'm dumping in more heat from the gpus and it's affecting my components. If I used a lesser thermal pad for the VRMs, I would have gotten higher temps but the other components would be slightly cooler.
> I'm not gonna get a new cpu. I believe that my thinking makes sense and it's reasonable.
> And this CPU is great too. I'm at 4GHZ using only .975 volts lol. Amazing.


Well, I'm glad everything worked out for you! I am planning to build a custom water loop, but I don't know what I should go for: better CPU temps or GPU temps. What would you recommend? Is there any way to keep the GPUs cooler without the CPU suffering from overheating?


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, i did research on R9 295x2 and R9 290x performance comparison, and here is what i've found:
> http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,17.html
> http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799.html
> http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review
> http://www.hardocp.com/article/2014/04/08/amd_radeon_r9_295x2_video_card_review/1#.VCupKPmSwjw
> According to these websites, these videocards perform really, really similarly; there is almost no difference. Sure, sometimes the 290x does better, sometimes the 295x2 does better, but we are talking mostly about a 2 fps difference.
> I am almost convinced that two 290x's in CFX perform better, judging by the FS Extreme scores, and it really makes sense that two separate videocards would perform better than two GPUs on one PCB (better power delivery, better heat dissipation, maybe something else I don't know of), but I just can't see it in the gaming performance...


2 people sprinting down the road holding hands is always going to be less efficient than a single person sprinting down the road with their hands free...


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, i'm glad everything worked out for you! I am planning to build a custom water loop, but i don't know what should i go for: better CPU temps or GPU temps, what would you recommend? Is there anyway to make GPU cooler but not suffer from CPU overheating?


Well, I built this system with the intent to do heavy overclocking, which is why I used the best thermal pads I could find for VRM temps. I then changed my mind LOL
My advice to you, research twice, buy once. Read every freaking thing you can find about water cooling. Find flow calculation, pressure drop estimations, think about what your overall goal is, etc.
When I had my AMD system with a single 290X, I went for the highest overclocks I could get. Now that I have my 4790k and trifire system, My goal is a quiet, efficient, and cool computer in a decently sized case. (Corsair 750D)
So far, I'm running less than 25dbA when on full load (all noctua fans, either under 25dbA or undervolted to 7V to be under 25dbA), I'm drawing 1005 watts max load on this system (almost 100 watts less than everyone else). I just need to get my temps right with more undervolting on the CPU side (4GHz is stable @ .977V full load, 4.2GHz is stable @ 1.045V, working on 4.4GHz right now then 4.5GHz after)

Again, research twice, buy once. Take your time. Don't rush.
Oh yea, EK fittings are crap. XSPC is a lot better. And don't use colored dyes, get color tubing. (Dyes can alter the thermal conductivity of water, resulting in lower performance.)


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, i did research on R9 295x2 and R9 290x performance comparison, and here is what i've found:
> http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,17.html
> http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799.html
> http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review
> http://www.hardocp.com/article/2014/04/08/amd_radeon_r9_295x2_video_card_review/1#.VCupKPmSwjw
> According to these websites, these videocards perform really, really similarly; there is almost no difference. Sure, sometimes the 290x does better, sometimes the 295x2 does better, but we are talking mostly about a 2 fps difference.
> I am almost convinced that two 290x's in CFX perform better, judging by the FS Extreme scores, and it really makes sense that two separate videocards would perform better than two GPUs on one PCB (better power delivery, better heat dissipation, maybe something else I don't know of), but I just can't see it in the gaming performance...


OK *now* we're having a meaningful exchange, and I appreciate you taking the time to link those reviews.

I _think_ I know what happened here because these reviews do tend to *claim* the 295x2 will outrun crossfire 290Xs, but it is somewhat misleading. Why would these people do that? Money. To help sell people like you and me 295x2s.

Then you go and ask the enthusiasts who actually *use* these products--my first stop would be Overclockers UK--http://forums.overclockers.co.uk/showthread.php?t=18607061 and they spend 30 seconds wrapping up the basic question of which is faster, and then start raging at the OP for using an AMD platform for enthusiast graphics cards. Even the guy posting the question knows the answer. The thread is: " Easy question really...two 290X's or one 295 x2?"
Quote:


> Originally Posted by *Kaapstad*
> Bit of a non contest, two watercooled 290Xs will thrash a 295X2.


AMD tried really really hard to make the 295x2 desirable. I suspect they went so far as to nudge these reviewers to help them do it. The hardOCP review of TriFire sold me on the 295x2. I'm still happy with it and I stand by my conviction that H is, for the most part on the up and up. I also suspect from an economies of scale POV, selling one card for $1500 by squeezing 2 unbinned 290x GPUs onto a single PCB and slapping on a very inexpensive Asetek cooler (very effective but not ideal) is much more cost effective than selling single GPU cards for $500. AMD and Nvidia are in the dual GPU graphics card business because they are able to make it work for them (or they think they can).

You'll notice many of these reviews do not give you much detail on their testing methodology, they gloss over it.

Take a look at the tomshardware review. They say they are testing a 295x2 and 2 x AMD Radeon R9 290X 4 GB (CrossFire)... OK. But, which 290X's? What clocks? What cooler? We know what cooler was on the 295x2... it's watercooled with an AIO. Presumably all of these reviews are testing air-cooled 290X's in crossfire that are going to be struggling to maintain stock clocks (especially the top card). The Tom's review states they warmed the cards for 5 minutes before benchmarking. That's even more unfair considering the 295x2 is on an Asetek AIO. They should even the playing field and put Asetek coolers on the 290X's to make it apples to apples. But reviewers have a job to do, and it's testing stock configurations for the masses, I suppose.

So I concede that 2 stock reference 290X's on blower coolers, or even 2 aftermarket 290X's on air, are going to trade blows with the watercooled 295x2 (particularly in short runs before the VRMs on the 295x2 get scorching); they are the same exact GPUs, after all. But an enthusiast, and pretty much anyone who's going to drop $1,000 on 2 290X's, is going to do everything they can to keep them from throttling, and then will try to overclock them as high as he/she can--which is why those Kraken brackets and Corsair H55s are/were *flying* off the shelves.

Here's what I think:
If you watercool the 295x2 AND the 290X's (using NZXT Kraken brackets and the same Asetek coolers that the 295x2s have) and leave them at the same exact GPU and memory clocks (you'll have to underclock the 290X mem chips to make it fair...) the XF 290X's will still win on ANY platform. Why? The PLX chip on the 295x2. How big of a difference does the PLX chip on the 295x2 make? I would LOVE to know. I'm not sure. But it is not a free PCIE 3.0 x8/x16 slot. There is no such thing as a free lunch.

Sorry. I was bored and couldn't sleep because of Ebola. So I wrote a PLX manifesto.


----------



## KyadCK

Quote:


> Originally Posted by *electro2u*
> 
> Here's what I think:
> If you watercool the 295x2 AND the 290X's (using NZXT Kraken brackets and the same Asetek coolers that the 295x2s have) and leave them at the same exact GPU and memory clocks (you'll have to underclock the 290X mem chips to make it fair...) the XF 290X's will still win on ANY platform. *Why? The PLX chip on the 295x2. How big of a difference does the PLX chip on the 295x2 make? I would LOVE to know. I'm not sure. But it is not a free PCIE 3.0 x8/x16 slot. There is no such thing as a free lunch.*
> 
> Sorry. I was bored and couldn't sleep because of Ebola. So I wrote a PLX manifesto.


No.

The PLX chip used on the 295X2 works exactly as advertised; it is a PCI-e switch. The 295X2 will use XDMA over it, reducing the time it takes for the frame data from the 2nd GPU to get to the first GPU to be exported to the display. GPU1 to GPU2 gets a full PCI-e 3.0 x16 bridge, always, even if you put the card in an x4 slot. The CPU can transfer to the two GPUs at a max combined rate of the slot width. If the card is in an x16 slot, then the CPU can speak to GPU1 at x12 worth of time and GPU2 at x4 worth of time, or any other combination.

Also, PLX time is measured in nanoseconds; 100 at most to be exact, or 0.1µs. The dual-GPU cards use a PEX8747: http://www.plxtech.com/download/file/1824

Page 1 2nd paragraph and Page 3, right side.

GPUs do not magically talk to one another point-to-point just because they get their own slot. Whether you use a 295X2 or two 290Xs, they will always have one point in common.

290X(1) -> NorthBridge -> 290X(2)
295X2(1) -> PLX Chip -> 295X2(2)

Same clocks, a 295X2 and two 290Xs will perform the same barring outside factors.
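To put that latency figure in perspective, a rough back-of-envelope using the ~100 ns worst-case number from the datasheet:

```python
# Worst-case PLX hop latency vs. one frame at 60 fps
plx_latency = 100e-9    # 100 nanoseconds, in seconds
frame_time = 1 / 60     # ~16.7 ms per frame at 60 fps

fraction = plx_latency / frame_time
print(f"PLX hop is {fraction:.1e} of one frame")  # ~6.0e-06
```

In other words, the switch adds on the order of millionths of a frame, which is why it shouldn't show up as a performance difference on its own.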


----------



## electro2u

Quote:


> Originally Posted by *KyadCK*
> 
> No.
> 
> The PLX chip used on the 295X2 works exactly as advertised; it is a PCI-e switch. The 295X2 will use XDMA over it, reducing the time it takes for the frame data from the 2nd GPU to get to the first GPU to be exported to the display. GPU1 to GPU2 gets a full PCI-e 3.0 x16 bridge, always. Even if you put them in an x4 slot. CPU can transfer to the two GPUs at a max combined of the socket width. If it is in a x16, then the CPU can speak to GPU 1 at x12 worth of time and GPU2 at x4 worth of time, or any other combination.
> 
> Also, PLX time is measured in nanoseconds; 100 at most to be exact, or 0.1µs. The dual-GPU cards use a PEX8747: http://www.plxtech.com/download/file/1824
> 
> Page 1 2nd paragraph and Page 3, right side.
> 
> GPUs do not magically talk to one another point-to-point just because they get their own slot. Whether you use a 295X2 or two 290Xs, they will always have one point in common.
> 
> 290X(1) -> NorthBridge -> 290X(2)
> 295X2(1) -> PLX Chip -> 295X2(2)
> 
> Same clocks, a 295X2 and two 290Xs will perform the same barring outside factors.


No. Otherwise 2x295x2 would be right up there in benchmarks with 4x290x,
and one 295x2 would be right up there with 2x290x in benches. But they aren't even close.
How do you explain that?


----------



## KyadCK

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> No.
> 
> The PLX chip used on the 295X2 works exactly as advertised; it is a PCI-e switch. The 295X2 will use XDMA over it, reducing the time it takes for the frame data from the 2nd GPU to get to the first GPU to be exported to the display. GPU1 to GPU2 gets a full PCI-e 3.0 x16 bridge, always. Even if you put them in an x4 slot. CPU can transfer to the two GPUs at a max combined of the socket width. If it is in a x16, then the CPU can speak to GPU 1 at x12 worth of time and GPU2 at x4 worth of time, or any other combination.
> 
> Also, PLX time is measured in nanoseconds; 100 at most to be exact, or 0.1µs. The dual-GPU cards use a PEX8747: http://www.plxtech.com/download/file/1824
> 
> Page 1 2nd paragraph and Page 3, right side.
> 
> GPUs do not magically talk to one another point-to-point just because they get their own slot. Whether you use a 295X2 or two 290Xs, they will always have one point in common.
> 
> 290X(1) -> NorthBridge -> 290X(2)
> 295X2(1) -> PLX Chip -> 295X2(2)
> 
> Same clocks, a 295X2 and two 290Xs will perform the same barring outside factors.
> 
> 
> 
> No. Otherwise 2x295x2 would be right up there in benchmarks with 4x290x,
> and one 295x2 would be right up there with 2x290x in benches. But they aren't even close.
> How do you explain that?
Click to expand...

Lightnings.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Well, I built this system with the intent to do heavy overclocking, which is why I used the best thermal pads I could find for VRM temps. I then changed my mind LOL
> My advice to you, research twice, buy once. Read every freaking thing you can find about water cooling. Find flow calculation, pressure drop estimations, think about what your overall goal is, etc.
> When I had my AMD system with a single 290X, I went for the highest overclocks I could get. Now that I have my 4790k and trifire system, My goal is a quiet, efficient, and cool computer in a decently sized case. (Corsair 750D)
> So far, I'm running less than 25dbA when on full load (all noctua fans, either under 25dbA or undervolted to 7V to be under 25dbA), I'm drawing 1005 watts max load on this system (almost 100 watts less than everyone else). I just need to get my temps right with more undervolting on the CPU side (4GHz is stable @ .977V full load, 4.2GHz is stable @ 1.045V, working on 4.4GHz right now then 4.5GHz after)
> 
> Again, research twice, buy once. Take your time. Don't rush.
> Oh yea, EK fittings are crap. XSPC is a lot better. And don't use colored dyes, get color tubing. (Dyes can alter the thermal conductivity of water, resulting in lower performance.)


Wow, thanks! I was hoping that when I do watercooling I could get away with a bit less research than I'd planned, but now you've convinced me I shouldn't, lol. I will definitely look for as much info as possible now. Thank you, I didn't even think this was that complicated!


----------



## electro2u

Quote:


> Originally Posted by *KyadCK*
> 
> Lightnings.


I have a 290x and I have a 295x2.
I can put them at the same GPU/mem clocks and test.
You're saying the 295x2 should score pretty much exactly double the 290x? I haven't done this but it's easy enough.
Your one-word argument is surprisingly persuasive (no sarcasm intended or implied).
Perhaps I've been schooled. Wouldn't be the first time today.


----------



## yifeng3007

electro2u and KyadCK, after reading your posts, guys, I am now less and less sure that I built a good system :C I was thinking about going 4-way CFX with R9 290x's, but I was so afraid of the heat that I decided to buy two R9 295x2's instead.. Well, we all make mistakes, don't we? I am more concerned now that I made a somewhat huge mistake, because I don't get the performance I was expecting AT ALL! I get so, so, so much stuttering at times, it's unbearable...


----------



## KyadCK

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> Lightnings.
> 
> 
> 
> I have a 290x and I have a 295x2
> i can put them at the same GPU/mem clocks and test.
> You're saying the 295x2 should score pretty much exactly double the 290x? I haven't done this but it's easy enough.
> Your one-word argument is surprisingly persuasive (no sarcasm intended or implied).
> Perhaps I've been schooled. Wouldn't be the first time today.
Click to expand...

Assuming all other hardware/software is identical, the clocks are the same, and neither throttles, they should perform within margin of error (~3%).

290Xs win overall because it's easier to keep them cool and they have more/better power delivery, allowing them to clock much higher. Other than that, they use the same everything. I'd like to see an ARES III get involved though.
Quote:


> Originally Posted by *yifeng3007*
> 
> electro2u and KyadCK, after reading your posts, guys, I am now less and less sure that I built a good system :C I was thinking about going 4-way CFX with R9 290x's, but I was so afraid of the heat that I decided to buy two R9 295x2's instead.. Well, we all make mistakes, don't we? I am more concerned now that I made a somewhat huge mistake, because I don't get the performance I was expecting AT ALL! I get so, so, so much stuttering at times, it's unbearable...


You're going to have to provide far more information than that man. For example, two 295X2s for a single 1080 monitor? Do you like to just burn money, or do you have something more powerful sitting behind those cards?


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> electro2u and KyadCK, after reading your posts, guys, I am now less and less sure that I built a good system :C I was thinking about going 4-way CFX with R9 290x's, but I was so afraid of the heat that I decided to buy two R9 295x2's instead.. Well, we all make mistakes, don't we? I am more concerned now that I made a somewhat huge mistake, because I don't get the performance I was expecting AT ALL! I get so, so, so much stuttering at times, it's unbearable...


I know it's very *very* frustrating. I'm not here to make it worse, either. I was happy when your score at FS improved. Your GPU temps seem ok... I can't explain the problems with bf4 and I still mean to buy a copy when I catch up. I think a lot of people ended up feeling similarly about their 295x2s and it's pretty upsetting because of the expense involved. I'm putting my 290x back in late tomorrow I've been waiting on a part I need. I will run some tests and check out bf4 and report back of course. Axiumone and lots of others have 2x295x2 working fine so I'm sure you can get the problems worked out too. I'd probably start by looking at your vrm thermal pads and just go from there.


----------



## yifeng3007

Quote:


> Originally Posted by *KyadCK*
> 
> You're going to have to provide far more information than that man. For example, two 295X2s for a single 1080 monitor? Do you like to just burn money, or do you have something more powerful sitting behind those cards?


Well, I really do run this setup with only one 1080p monitor, yes. But I am planning to buy some more monitors. I run BF4 at 4K via resolution scaling with everything on ultra, but I still don't think I get the performance this setup is supposed to get.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> I know it's very *very* frustrating. I'm not here to make it worse, either. I was happy when your score at FS improved. Your GPU temps seem ok... I can't explain the problems with bf4 and I still mean to buy a copy when I catch up. I think a lot of people ended up feeling similarly about their 295x2s and it's pretty upsetting because of the expense involved. I'm putting my 290x back in late tomorrow I've been waiting on a part I need. I will run some tests and check out bf4 and report back of course. Axiumone and lots of others have 2x295x2 working fine so I'm sure you can get the problems worked out too. I'd probably start by looking at your vrm thermal pads and just go from there.


Thank you very much! I am looking forward to your results!
How would I look at the VRM temps? What should I use?


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> Thank you very much! I am looking forward for your results!
> How would i look at the vrm temps? What should i use?


Strangely, there are no temperature diodes on the 295x2 VRMs (maybe they didn't want us to know?), but everyone pretty much uses Fujipoly for these (I used the 11 W/mK, which is cheaper, but they have even stronger thermal pads rated at 17 W/mK or something).

What's 4K scaling? That might be part of the issue. Do you have problems with other games too?


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Strangely, there are no temperature diodes on the 295x2 VRMs (maybe they didn't want us to know?), but everyone pretty much uses Fujipoly for these (I used the 11 W/mK, which is cheaper, but they have even stronger thermal pads rated at 17 W/mK or something).
> 
> What's 4K scaling? That might be part of the issue. Do you have problems with other games too?


Actually, you CAN see the VRM temps. I've done it once but I can no longer look at them.
When I get home, I'll see what I can do.
As a benchmark, with Fujipoly Ultra, all VRM temps stayed under 55C with all 3 cards overclocked on stock volts. Before I switched to Fujipoly Ultra, I used a pad 2 tiers below it (6 W/mK) and got temps up to 65C on stock volts, 75C with +100mV. The stuff is amazing, but that heat has to go somewhere... shoulda gone with the 6 W/mK again so my cooling isn't as loaded


----------



## xer0h0ur

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, I really do run this setup with only one 1080p monitor, yes. But I am planning to buy some more monitors. I run BF4 at 4K via resolution scaling with everything on ultra, but I still don't think I get the performance this setup is supposed to get.


4K scaling on a 1080p monitor? What does that even mean?


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> 4K scaling on a 1080p monitor? What does that even mean?


probably means the resolution scaling option in BF4, I forget the name of it. At 200% it's 4K equivalent?
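If that's the setting, the numbers do line up with 4K; a quick sanity check:

```python
native = (1920, 1080)   # 1080p monitor
scale = 2.0             # BF4's 200% resolution scale

rendered = (int(native[0] * scale), int(native[1] * scale))
pixel_ratio = (rendered[0] * rendered[1]) / (native[0] * native[1])
print(rendered, pixel_ratio)  # (3840, 2160) 4.0 -- 4K UHD, 4x the pixels
```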


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> Actually, you CAN see the VRM temps. I've done it once but I can no longer look at them.
> When I get home, I'll see what I can do.
> As a benchmark, with Fujipoly Ultra, all VRM temps stayed under 55C with all 3 cards overclocked on stock volts. Before I switched to Fujipoly Ultra, I used a pad 2 tiers below it (6 W/mK) and got temps up to 65C on stock volts, 75C with +100mV. The stuff is amazing, but that heat has to go somewhere... shoulda gone with the 6 W/mK again so my cooling isn't as loaded


This would be welcome news to me. You can check 2 different VRM temps on the 290x, but the question is: does the 295x2 throttle because of VRM temps? Hopefully... and that's probably what's happening to yifeng3007. Also, the PLX chip gets hot, and maybe that's what throttles and causes stuttering if it overheats.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> This would be welcome news to me. Can check 2 different vrm temps on the 290x but the question is does the 295x2 throttle because of vrm temps? Hopefully... And that's probably what's happening to yifeng3007. Also the Plx chip gets hot and maybe that's what throttles and causes stuttering if it overheats.


I'm pretty sure it throttles due to the VRMs, not really the PLX chip. You can't control the center fan on the 295x2. If you have bad airflow, your VRMs will get hot and will probably throttle the card.
I'm too lazy to search, but what were his core temps? I always hit the 75C limit, even at stock...


----------



## xer0h0ur

The point I am making is: why the hell would you want to load your hardware with the added workload of resolution scaling? As far as I know, that would certainly have an adverse effect on performance. I don't play BF4, so I guess I just don't understand. Also, doesn't that game have a memory leak? Resolution scaling plus a memory leak sounds like you're gonna have a bad time.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Strangely there are no temperature diodes on the 296x2 vrms (maybe they didn't want us to know?) but everyone pretty much uses fujipoly for these (I used the 11w which is cheaper but they have even stronger thermal pads that are rated at 17w/ something).


Quote:


> Originally Posted by *ljreyl*
> 
> As a benchmark, with Fujipoly Ultra, all VRM temps stayed under 55c on with all 3 cards overclocked on stock volts. Before I switched to Fujipoly ultra, I used 2 tiers below it (6 W/mk) and got temps up to 65c on stock volts, 75c with +100mV. The stuff is amazing, but that heat has to go somewhere... shoulda went with the 6 W/mk again so my cooling isn't as loaded


I wish I knew what in the world you guys are talking about








Quote:


> Originally Posted by *ljreyl*
> 
> Actually, you CAN see the VRM temps. I've done it once but I can no longer look at them.
> When I get home, I'll see what I can do.


I really hope we can check these VRM temps, so I can find out if this is the case...
Quote:


> Originally Posted by *electro2u*
> 
> whats 4k scaling?


Quote:


> Originally Posted by *xer0h0ur*
> 
> 4K scaling on a 1080p monitor? What does that even mean?


Quote:


> Originally Posted by *DividebyZERO*
> 
> probably means the resolution scaling in BF4 in the options forgot the name of it. At 200% its 4k equivalent?


Yes, I am talking exactly about that. My monitor is 1080p, so when I turn the resolution scale in BF4's settings menu up to 200%, the image becomes 4K, I guess.
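For reference, BF4's resolution scale multiplies each screen axis, so 200% on a 1080p display does render internally at 4K. A quick sketch of the pixel math (the function name is just for illustration):

```python
# BF4-style resolution scaling: the percentage multiplies each axis,
# so the total pixel count grows with the square of the scale factor.
def scaled_resolution(width, height, scale_percent):
    factor = scale_percent / 100
    return int(width * factor), int(height * factor)

w, h = scaled_resolution(1920, 1080, 200)
print(w, h)                      # 3840 2160 -- i.e. 4K UHD
print((w * h) / (1920 * 1080))   # 4.0 -- four times the pixels per frame
```

That 4x pixel load is why 200% scaling on a 1080p panel stresses the cards like a native 4K monitor would.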
Quote:


> Originally Posted by *electro2u*
> 
> This would be welcome news to me. Can check 2 different vrm temps on the 290x but the question is does the 295x2 throttle because of vrm temps? Hopefully... And that's probably what's happening to yifeng3007. Also the Plx chip gets hot and maybe that's what throttles and causes stuttering if it overheats.


Honestly, I don't even know how to fix a VRM overheating issue, if that is the case, lol...
Quote:


> Originally Posted by *xer0h0ur*
> 
> The point I am making is that why the hell would you want to load your hardware with an added workload of resolution scaling? Far as I know that would certainly have an adverse effect on performance. I don't play BF4 so I guess I just don't understand. Also doesn't that game have a memory leak? Resolution scaling plus a memory leak sounds like you're gonna have a bad time.


Well, when I set the resolution to native 1080p I get many more frames, but I still get lag spikes...


----------



## electro2u

If you French fry when you're supposed to pizza you're gonna have a bad time XD


----------



## electro2u

To improve heat dissipation on the vrm (voltage regulation modules) you need to take the coolers and backplates off and apply fujipoly thermal pads instead of the crappy stock ones. Works well. Yes the heat still comes out though.

If you are at 1080p just remove one of the cards and see how the game plays.


----------



## xer0h0ur

Really you don't even have to remove the card. Just disconnect the 8-pin power cables to the secondary video card and try native 1080p again. I don't know if it would prompt a re-install of the driver though.


----------



## Syceo

Guys, I need some help here lol. Just installed a new mobo, an MSI Z97 XPower AC. Damn MSI, I can't install the wifi driver from the disk; nothing happens. Additionally, when the PC starts up it boots into Windows but takes at least 5 mins for all the icons and desktop to load; the wallpaper loads but the icons take time. I also have no mouse function at all. Any ideas, guys?


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> guys need some help here lol, just installed a new mobo MSI Z97 Xpower AC. damn Msi , i cant install the wifi driver from the disk, nothing happens, additionally when the pc starts up, it boots into windows but takes at least 5 mins for all the icons and desktop to load, wallpaper loads but the icons take time. I also have no mouse function at all. Any ideas guys


Did you try using a ethernet cable and downloading the latest stuff from the MSI website?
Also, do you have a lot of items loading up during start up?


----------



## Syceo

Can't use ethernet, as my PC is located too far from the modem


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> guys need some help here lol, just installed a new mobo MSI Z97 Xpower AC. damn Msi , i cant install the wifi driver from the disk, nothing happens, additionally when the pc starts up, it boots into windows but takes at least 5 mins for all the icons and desktop to load, wallpaper loads but the icons take time. I also have no mouse function at all. Any ideas guys


Well, I can give you an idea about the WiFi card, but I don't know what is going on with the rest of the system.
When I got my MSI MPower MAX AC, I couldn't get my WiFi working either. I tried everything: reinstalling drivers, removing the WiFi card... until my brother came along (luckily he was visiting me at the time). All he did was take the card out and reinstall it in the slot, pushing harder than I did until there was a "click", like when you seat a RAM or PCIe module.
I don't know if that's your case, but this is what was wrong with my card, lol


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> To improve heat dissipation on the vrm (voltage regulation modules) you need to take the coolers and backplates off and apply fujipoly thermal pads instead of the crappy stock ones. Works well. Yes the heat still comes out though.
> 
> If you are at 1080p just remove one of the cards and see how the game plays.


Well, I already tried playing with only one card; even at 200% scaling in BF4 I was getting much lower fps, but it was soooo smooth, especially at 1080p.
Fujipoly has so much stuff on their website, would you please be so kind as to send me a link to the items you are talking about?


----------



## yifeng3007

Quote:


> Originally Posted by *xer0h0ur*
> 
> Really you don't even have to remove the card. Just disconnect the 8-pin power cables to the secondary video card and try native 1080p again. I don't know if it would prompt a re-install of the driver though.


All I need to do is disable crossfire







There is no need to remove the card or the wires.


----------



## xer0h0ur

Quote:


> Originally Posted by *yifeng3007*
> 
> All i need to do is just disable crossfire
> 
> 
> 
> 
> 
> 
> 
> There is no need to remove the card or the wires.


Far as I know that is not the same thing. Disabling crossfire means you're only using one of your four GPUs. I was referring to still using a single 295X2 with crossfire active to see if you get the same result as quadfire.


----------



## yifeng3007

Quote:


> Originally Posted by *xer0h0ur*
> 
> Far as I know that is not the same thing. Disabling crossfire means you're only using one of your four GPUs. I was referring to still using a single 295X2 with crossfire active to see if you get the same result as quadfire.


Well, this is what I got yesterday when I checked how a single 295x2 performs (at that moment I had two R9 295x2s installed in the system and disabled crossfire):
1. 4790K overclocked to 4.6GHz, 295x2 @ stock - http://www.3dmark.com/fs/2868460 - score 8710 (I forgot to mention that my RAM was at XMP 1600)
2. 4790K overclocked to 4.6GHz, 295x2 overclocked - http://www.3dmark.com/fs/2868484 - score 9447 (RAM at XMP 1600 again)
And this is what I got when I took one 295x2 out, and I think this is actually pretty funny:
1. 4790K at stock, RAM at stock (1333), 295x2 @ stock - http://www.3dmark.com/fs/2875874 - score 8658
2. 4790K at stock, RAM at stock (1333), 295x2 overclocked - http://www.3dmark.com/fs/2875833 - score 7778 (!!!)
It shows that if you have two R9 295x2s and turn off crossfire, one 295x2 is left running with both of its GPUs active.
But why, when I overclock the single 295x2 to the same values I used when I had two R9 295x2s with one of them disabled, do I get a lower score than at stock?







What the hell is going on?








Oh, and the reason I put my i7 back to stock settings is because I watched this video and hoped it would help me out:

He was talking about why his 3-way SLI was getting less performance than it was supposed to. He explains it at 2:40.


----------



## xer0h0ur

I was not talking about how your system scores in benchmarks. You had said before that you ran into lag issues while using two 295X2's in crossfire in BF4. I was asking if your system presented the same problem(s) while using a single crossfired 295X2 and the easiest way to test that is just disconnecting the 8-pins from the secondary card.

These 295X2's are weird with overclocks. You need to find the right combination of GPU and vRAM OC otherwise your score drops. The exact same thing happened to me when I pushed my vRAM to 1700MHz.


----------



## yifeng3007

Quote:


> Originally Posted by *xer0h0ur*
> 
> I was not talking about how your system scores in benchmarks. You had said before that you ran into lag issues while using two 295X2's in crossfire in BF4. I was asking if your system presented the same problem(s) while using a single crossfired 295X2 and the easiest way to test that is just disconnecting the 8-pins from the secondary card.
> 
> These 295X2's are weird with overclocks. You need to find the right combination of GPU and vRAM OC otherwise your score drops. The exact same thing happened to me when I pushed my vRAM to 1700MHz.


Well, I don't see any, and I mean ANY, lag spikes at all when I use a single R9 295x2; there are just zero sudden fps drops. Sure, at 4K resolution the system is very slow compared to two of those cards, but the lag I get with low fps is actually bearable, while the latency is something I can't deal with; it feels like I am playing Portal 4...


----------



## Syceo

Finally got it sorted..... what the ...... is MS doing with their silly indexing? Anyway, the dual x16 lanes on this new board, courtesy of the PLX chip, seem to have added some additional performance, so no complaints aside from the 13 hrs it took me to get everything up and running (what a waste of a day off work lol)



Direct to die .....


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Finally got it sorted..... what the ...... is MS doing with their silly indexing. Anyway the dual 16 lanes on this new board , with the PLX chip seems to have added some additional performance, so no complaints aside from the 13 hrs it took me to get everything up and running (what a waste of a day off work lol)


Run a Firestrike Extreme benchmark and let me know what your cpu clock speeds/gpu clocks / mem clocks are. I'll mimic all the settings you have to see how much more performance you get with an extra PLX chip.


----------



## electro2u

In Afterburner lately, if I overclock my VRAM at all on my 295x2, the voltage of GPU 1 goes to the full ~1.2V... maybe it has to do with the pixel clock patcher for my 120Hz monitor?


----------



## cennis

Quote:


> Originally Posted by *ljreyl*
> 
> Run a Firestrike Extreme benchmark and let me know what your cpu clock speeds/gpu clocks / mem clocks are. I'll mimic all the settings you have to see how much more performance you get with an extra PLX chip.


Did you find out how to monitor VRM temps on the 295x2?

Also, as a general question, what temps do you guys get?

I'm currently using the AquaComputer block with 7x120mm of rads and I see the low 60s C at stock playing Mordor;
overclocked, it can hit 70c.
Only a 4.7GHz 4770k and the 295x2 are in the loop.

My fans are at 1500rpm, which is their max speed.


----------



## electro2u

My card is definitely dying. Can't be overclocked at all anymore. Get driver crashes and serious artifacts unless it is stock or very near stock.


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> My card is definitely dying. Can't be overclocked at all anymore. Get driver crashes and serious artifacts unless it is stock or very near stock.


I know this is probably a stupid question, but how do you overclock, and do you also change your voltages?


----------



## StillClock1

Hey guys,

I'm not overclocking the card at all, and after the latest Catalyst driver update I'm getting temperatures in the low 70s C within a couple of minutes of starting a game (Arma III / Rome 2 Total War).

I've read that temps in the low 70s are okay, but I just don't see why it should be running this hot after only a few minutes, and I don't think it used to do that.

Any ideas how I can get back to lower temps?

Thanks


----------



## ljreyl

Quote:


> Originally Posted by *cennis*
> 
> Did you find out how to monitor vrm temps on 295x2?
> 
> Also as a general question, what are the temps you guys get?
> 
> im currently using the aquacomputer block with 7x120mm rads and i experience low 60c on stock playing mordor.
> overclocked it can hit 70c
> only 4.7ghz 4770k and 295x2 in the loop
> 
> my fans are at 1500rpm which are their max speeds


Try this...
Uninstall the latest driver, run a driver sweeper (or whatever program you use to wipe out a driver) and reinstall.
Download HWiNFO64 and run it. Scroll down to where the GPUs are. If you don't see VRM temps for each individual GPU...
Download MSI Afterburner and use it to turn off ULPS.
Restart the computer, rerun HWiNFO64 and scroll down again.

That was my method.

Also, you're getting the same temps as me, but with far less radiator on my side. I only have 360+120mm worth of radiators (using Noctua NF-F12s undervolted to 7V), yet I'm getting the exact same temps as you... with 3 GPUs and a 4790k! (Cards are on stock volts but overclocked to 1058/1325 for all 3 GPUs.)
Maybe you need to reseat? I think you should be running cooler than that.


----------



## ljreyl

Quote:


> Originally Posted by *Hoff248*
> 
> Hey guys,
> 
> I'm not overclocking the card at all and after the latest catalyst driver update, I'm getting temperatures in the low 70sC within a couple minutes of starting a game (Arma III / Rome 2 Total War).
> 
> I read that temps in the low 70s are okay, but I just don't see a reason it should be running this hot after only a few minutes and I don't think it used to do that.
> 
> Do you have any ideas how I can get back to lower temps?
> 
> Thanks


If it's on the stock cooler, I can believe it. What's your case airflow like? It could also be the fan on the radiator; I unplugged that thing and stuck it on a molex so it runs at 100% all the time.


----------



## ljreyl

http://www.3dmark.com/fs/2877736

I'm 200 points away from getting into the top 100 for 3 GPUs lol. Too lazy to overclock more though


----------



## StillClock1

Thank you, very good ideas - I'll try them when I get home tonight.

A few questions:

1. What is ULPS?

2. If I uninstall Catalyst Control Center, doesn't that uninstall the driver with it? Is there some 3rd party driver uninstaller I need to get?

3. (answer) Case airflow is decent, and I just bought a couple more fans. I have 3 front 120s for intake, the CPU has an H105i with two fans exhausting out the top, and the stock R9 295X2 radiator exhausts out the back. With the two new fans, I think I'll exhaust one out the top at the far back and use the other to put the R9 295X2 radiator in push/pull.

4. It's interesting you mention seeing temps on both GPUs, because when I first downloaded the 14.7 beta driver (a week ago or so), the temp on GPU2 went missing in MSI Afterburner. It's back now, but it doesn't seem like the cooling is working like it should anymore.

5. Won't the card throttle at 75C and downgrade performance to protect temps?

Thanks again.


----------



## electro2u

Quote:


> Originally Posted by *yifeng3007*
> 
> I know, this is probably a stupid question, but how do you overclock and do you also change your voltages?


Hmm. Good point. I guess the new version of Afterburner is handling things a little differently. I was using +35mV before, and that isn't enough for my overclock now. But raising it a bit more cleaned everything up! Thanx


----------



## ljreyl

Quote:


> Originally Posted by *Hoff248*
> 
> Thanks you, very good ideas - I'll try them when I get home tonight.
> 
> A few questions:
> 
> 1. What is ULPS?
> 
> 2. If i uninstall Catalyst Control center, does that not uninstall the driver with it? There is some 3rd party driver uninstaller I need to get?
> 
> 3. (answer) Case airflow is decent, I just bought a couple more fans now. I have 3 front 120s for intake, the CPU has a h105i with two fans exhausting out the top, and then the stock r9295x2 radiator exhausting out the back. I ordered two new fans, I think I'll exhaust 1 out the top in the far back, and then the other I'll try to make the r9 295x2 radiator into a push/pull.
> 
> 4. Its interesting you mention the part about seeing temps on both GPUs because when I first downloaded the 14.7 beta driver (a week ago or so), the temp on GPU2 went missing on MSI afterburner. Its back now, but it doesn't seem like the cooling is happening like it should anymore.
> 
> 5. Won't the card throttle at 75C and downgrade performance to protect temps?
> 
> Thanks again.


1 - Ultra Low Power State. Basically, your unused GPU(s) shut down to save power.
2 - It will uninstall the driver, but remnants will still be left in your system. A program such as DDU (http://www.guru3d.com/files-details/display-driver-uninstaller-download.html) will fully get rid of the driver.
3 - I would look for fans with high static pressure for your radiators (the Corsair SP series and Noctua NF-F12, for example). Static pressure matters more than raw airflow on a radiator: the fan has to force air through the radiator fins in order to carry heat away.
4 - Yeah, it's weird like that.
5 - The cards will throttle around the 73-74c mark. It also depends on your VRM temps.


----------



## Syceo

Hey Ljreyl... what voltage are you running the 295x2 at? and are you overclocking the 290x ?


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Hey Ljreyl... what voltage are you running the 295x2 at? and are you overclocking the 290x ?


Stock voltages for all cards. The max I've pushed them on those volts is 1069/1500. I put them back down to 1059/1325 to maintain a 1:1.25 ratio between core and memory clocks.

I believe stock voltages are 1.219V for the 290x and 1.230V for the 295x2.
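The 1:1.25 core-to-memory ratio mentioned above is simple arithmetic; a trivial sketch (the helper name is just for illustration, and the clocks are the ones quoted in the post):

```python
# Keep memory clock at 1.25x the core clock, per the ratio described above.
def mem_for_core(core_mhz, ratio=1.25):
    return core_mhz * ratio

print(mem_for_core(1059))  # 1323.75 -> rounds to the 1325 MHz setting used above
```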


----------



## ljreyl

So... was I supposed to add a thermal pad to the PLX chip on the water block? I didn't see anything on my instruction sheet but the water block milling suggests that I should have...


----------



## xer0h0ur

Which water block are you using on your 295X2? When I attempted to use a thermal pad on the PLX chip I noted PCB warping so I decided instead to use the same TIM I had applied to the GPUs on the PLX chip. The only pad I would even attempt to use on the EK block would be the 0.5mm pad if at all. Anything more than that will likely cause warping.


----------



## ljreyl

I'm using an EK and I have spare fujipoly Extreme .5mm
Was there a performance benefit?
Man... I have to take it apart =( Stupid instructions showed no visuals that required thermal pads or TIM on the PLX.


----------



## xer0h0ur

The instruction sheet says in text to apply TIM to the PLX chip. It doesn't show a picture of applying it.


----------



## ljreyl

Quote:


> Originally Posted by *ljreyl*
> 
> I'm using an EK and I have spare fujipoly Extreme .5mm
> Was there a performance benefit?
> Man... I have to take it apart =( Stupid instructions showed no visuals that required thermal pads or TIM on the PLX.


Quote:


> Originally Posted by *xer0h0ur*
> 
> The instruction sheet says in text to apply TIM to the PLX chip. It doesn't show a picture of applying it.


sigh... did you see any benefits when you did it?


----------



## xer0h0ur

Well, without TIM or a thermal pad it's nearly certain that your PLX chip is overheating, and I am sure that would cause performance loss.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well without TIM or a thermal pad its nearly certain that your PLX chip is overheating and I am sure that would cause performance loss.


That's probably what's been causing the weird throttling and GPU utilization (GPU1 on the 295x2 never hits 100%).
I'll add TIM when I get home and report back tonight.


----------



## xer0h0ur

Crossfire scaling changes from game to game. If you were experiencing throttling while running Firestrike or Firestrike Extreme then it would be a sign something isn't right.

For instance, last night I was toying around with Thief. When using Cat 14.9 and running the game in either Mantle Crossfire or DX11 Crossfire, my usage was fairly even across all three GPUs, but it was rarely if ever high. Then I decided to manually create a crossfire profile in CCC and try the different crossfire settings. AFR Friendly turned out to consistently cause high usage (in the high 90s or maxed) on GPU1 with near zero usage on GPU2 or GPU3.

The result was higher framerates using a manual crossfire profile running basically only one GPU at high load than using all three GPUs at lower usage. I was stunned and disappointed.


----------



## yifeng3007

Guys, how can I add a second fan to the stock radiator? Where do I connect it, and how? I just want to put two Corsair SPs on each radiator, but I am not sure where to connect the second fan...
Btw, I don't know what I did, but today I was just trying to run one card alone in the system, and other than that I did nothing, yet almost all the stuttering I had either decreased significantly or went away entirely! I really don't know what I did, but I will do some more testing later...
And, oh, somehow I still have ULPS turned off, and even MSI Afterburner says it is off, but for some reason I cannot adjust my voltage setting anymore... I can live with that, as long as the microstuttering never comes back, lol
Quote:


> Originally Posted by *ljreyl*
> 
> http://www.3dmark.com/fs/2877736
> I'm 200 points away from getting into the top 100 for 3 GPUs lol. Too lazy to overclock more though


Congratulations, m8, that's awesome!
Quote:


> Originally Posted by *electro2u*
> 
> Hmm. Good point. I guess new version of afterburner is working things a little differently. I was using +35mV before and that isn't enough now for my overclock. But raising it a bit more cleaned everything up! Thanx


Glad i was able to help!


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Crossfire scaling changes from game to game. If you were experiencing throttling while running Firestrike or Firestrike Extreme then it would be a sign something isn't right.
> 
> For instance last night I was toying around with Thief. When using the Cat 14.9 and running the game in either Mantle Crossfire or DX11 Crossfire my usage was fairly even across all three GPUs but it was rarely if ever high GPU usage. Then I decided to manually create a crossfire profile in the CCC and try the different crossfire settings. AFR Friendly turned out to consistently cause high usage (in the high 90's or max) on GPU1 with near no usage on GPU2 or GPU3.
> 
> The result was higher framerates using a manual crossfire profile utilizing basically only one GPU at high load than using all three GPU's at lower usage. I was stunned and disappointed.


What I experienced during FSE was crazy utilization across all 3 GPUs. Sometimes 2 would be at 100% utilization while 1 is at 0-20%; then one of the 100% utilized GPUs drops to zero while the underutilized one hits 80, etc. I get massive drops in performance sometimes, going from 12500 to 11200 with the same settings, clocks, cpu speeds, etc.

BTW, core clocks stayed exactly the same; at times there was throttling, but rarely.

Hopefully it's the PLX, and adding the TIM (Gelid GC Extreme) will stabilize the cards.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> Guys, how can i add a second fan to the stock radiator? Where do i connect it to and how? I just want to put two Corsair SP on each radiator, but i am not sure where do i connect the second fan to...


Do Push/Pull

Fan pulling air away from radiator > Radiator > Fan pushing air into radiator


----------




## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Do Push/Pull
> 
> Fan pulling air away from radiator > Radiator > Fan pushing air into radiator


Thank you for the advice, but I already have it figured out, lol; I wouldn't want it running "pull-pull" or "push-push":








I just don't know which connectors to use, since there is only one connector going from the video card to the radiator...


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> What I experienced during FSE was crazy utilization across all 3 GPUs. Sometimes, 2 would be at 100% utilization while 1 is 0-20%. then, 1 of the 100% utilized drops to zero while the underutilized one hits 80, etc etc. I get massive drops in performance sometimes, going from 12500 - 11200 with the same settings, clocks, cpu speeds, etc.
> 
> BTW, core clocks stayed exactly the same, at times there was throttling but rarely.
> 
> Hopefully, it's the PLX and adding the TIM (Gelid GC Extreme) will stabilize the cards.


Wow, this is almost exactly what I am getting! The only difference is that I was getting a LOT of microstuttering (even when I had 100+ fps in the first test, it looked like I only had 20-30...)


----------



## xer0h0ur

Quote:


> Originally Posted by *yifeng3007*
> 
> Guys, how can i add a second fan to the stock radiator? Where do i connect it to and how? I just want to put two Corsair SP on each radiator, but i am not sure where do i connect the second fan to...


Just get yourself a Y cable so you can run both fans in sync off a single fan header on your motherboard, assuming you have one available. I was using 2000RPM NF-F12 industrialPPC fans with the standard NF-F12's Y cable. Don't worry about disconnecting the original fan from the video card; it won't affect performance at all unless you reach throttling temps.


----------



## yifeng3007

For those who are interested: a user joined the thread I started a month ago on the JonnyGURU forums. He was getting problems similar to mine, yet his second card was faulty, while mine works fine by itself (even though I was getting weird FSE results: http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/2760#post_22938272)
So, here is the link: http://www.jonnyguru.com/forums/showthread.php?t=11785&page=5 ; his nickname is "xringx"


----------



## yifeng3007

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just get yourself a Y cable so you can run both fans in sync off a single fan header on your motherboard. Assuming you have an available fan header. I was using 2000RPM NF-F12 Industrial PPC fans with the standard NF-F12's Y cable. Don't worry about disconnecting the original fan from the video card. It won't affect the performance at all unless you reach throttling temps.


If I use a Y cable, why can't I connect it directly to the video card's connector? And btw, if I connect the fans to the motherboard, they won't know when to quiet down and when to speed up, right?


----------



## xer0h0ur

It's all about the connectors themselves. Are both of the fans you're using on the radiator using the same connector type as what is connected to the video card? If so, just get a Y cable for that connector type. I used what I had readily available without having to buy anything, and I couldn't connect 4-pin PWM fan connectors to that connector on the video card anyway.


----------



## yifeng3007

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its all about the connectors themselves. Are both of your fans you're using on the radiator using the same connector type as what is connected to the video card? If so then just get a Y cable for that connector type. I used what I had readily available without having to buy anything and I couldn't connect 4-pin PWM fan connectors to that connector on the video card anyways.


Well, as far as I can see, the fan on the stock radiator uses a 3-pin connector with only two wires attached...
Can I use a Y cable on it and connect this: Corsair SP120 High Performance Edition Twin Pack?


----------



## Syceo

Quote:


> Originally Posted by *yifeng3007*
> 
> Well, as far as i can see, the fan on the stock radiator is a 3-pin connector with only two wires connected to it...
> Can i use a Y cable on it and connect this: Corsair SP120 High Performance Edition Twin Pack to it?


Yup, you can (I personally used the same Noctua ones as xer0h0ur for a tad better cooling)


----------



## xer0h0ur

As long as you're not buying the PWM version of those fans, you'll be fine using two of them with a 3-pin Y cable.

This is the type you need: http://www.newegg.com/Product/Product.aspx?Item=N82E16835181027

This is the type of 3-pin Y cable you would need if I remember correctly the connector setup: http://www.newegg.com/Product/Product.aspx?Item=N82E16812189063&cm_re=3_pin_y_splitter-_-12-189-063-_-Product


----------



## yifeng3007

Quote:


> Originally Posted by *Syceo*
> 
> Yup you can ( personally used the same notcua ones as xer0h0ur ) for a tad better cooling


Quote:


> Originally Posted by *xer0h0ur*
> 
> As long as you're not buying the PWM version of those fans you will be fine using two of those and a 3-pin Y cable.
> 
> This is the type you need: http://www.newegg.com/Product/Product.aspx?Item=N82E16835181027
> 
> This is the type of 3-pin Y cable you would need if I remember correctly the connector setup: http://www.newegg.com/Product/Product.aspx?Item=N82E16812189063&cm_re=3_pin_y_splitter-_-12-189-063-_-Product


Thank you, guys!
Can you please check this out: http://www.quietpc.com/na-syc2 ? Am I right that this is what I need?


----------



## xer0h0ur

Yeah it appears to be the same type of Y cable, just sleeved.


----------



## Mega Man

Quote:


> Originally Posted by *yifeng3007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syceo*
> 
> Yup you can ( personally used the same notcua ones as xer0h0ur ) for a tad better cooling
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> As long as you're not buying the PWM version of those fans you will be fine using two of those and a 3-pin Y cable.
> 
> This is the type you need: http://www.newegg.com/Product/Product.aspx?Item=N82E16835181027
> 
> This is the type of 3-pin Y cable you would need if I remember correctly the connector setup: http://www.newegg.com/Product/Product.aspx?Item=N82E16812189063&cm_re=3_pin_y_splitter-_-12-189-063-_-Product
> 
> Click to expand...
> 
> Thank you, guys!
> Can you please check this out: http://www.quietpc.com/na-syc2 ? Am i right that this is what i need?
Click to expand...

Unfortunately, the patent troll does not list the max amps for that line, so I would be careful and buy low-power fans; you could burn it out. As for me, I just ditched them and run my fans off my Aquaero.


----------



## xer0h0ur

A combined 0.36 Amps burning it out? Seems pretty ridiculous to me.


----------



## Mega Man

Actually more like 0.72A; you need to assume about 2x the rated draw for starting current. But again, I said low power; I never said to use or not use those fans.

The fans I use can pull as much as 1.5A each rated, so up to 3A starting current.

Also, undervolting causes more amp draw and more heat, unless this uses PWM (not the 4-pin fan standard, just pulsing 12V on and off), which could also play a factor.
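Mega Man's rule of thumb above (rated draw times two at spin-up, checked against the header rating) is easy to sanity-check with a few lines. This is a rough sketch, not an electrical spec: the 1A header rating and the 2x inrush factor are assumptions taken from the posts, and `header_ok` is a made-up helper name.

```python
# Rough headroom check for fans sharing one motherboard header.
# The 2x spin-up multiplier is the thread's rule of thumb, not a standard.
HEADER_RATING_A = 1.0   # common 3-pin header rating mentioned above
INRUSH_FACTOR = 2.0     # assume ~2x rated current at spin-up

def header_ok(fan_rated_amps, fan_count, rating=HEADER_RATING_A):
    """True if the fans' estimated spin-up draw fits the header rating."""
    startup_draw = fan_rated_amps * fan_count * INRUSH_FACTOR
    return startup_draw <= rating

# Two 0.18 A fans on a Y-cable: 0.72 A at spin-up, fine on a 1 A header.
print(header_ok(0.18, 2))   # True
# Two 1.5 A high-power fans: 6 A at spin-up, far over the rating.
print(header_ok(1.5, 2))    # False
```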


----------



## xer0h0ur

Oh I had no idea that the fans draw double the current on startup. Even still that seems like a negligible amount of power draw.


----------



## Mega Man

Let me say this again: all I said is to look at low-power (usage) fans. The fans I use pull 1.5A, and even most mobo headers are only rated at 1A.


----------



## xer0h0ur

Okay, it's fine to warn him, but he already said what he planned on using and you already know its power draw, so why do you keep mentioning your setup instead of what he plans on using? Are you implying those aren't low-power fans, or not low enough? Speak plainly instead of repeating yourself.


----------



## Mega Man

i do not know the power draw, and i do not care to look it up. i provided information, nothing else; maybe not useful to him, maybe useful to others in the future


----------



## xer0h0ur

Well, I checked the specs on the fan that my 295X2 came with. It's a Power Logic PLA12025S12M, 12V 0.2A.


----------



## ljreyl

I took apart my 295x2, cleaned off the paste, reapplied thermal paste to the GPUs AND the PLX, reinstalled and BOOM: still crazy utilization, but more consistent. Scores ended up slightly higher after 3 runs, and all scores were consistent as well, which suggests the PLX is now being cooled adequately.
To kick it up a notch, I also flashed the 295x2 with the master and slave BIOSes from the Sapphire 295x2 OC. Downclocked it back to reference 295x2 clocks and ran another bench: slightly higher scores!


----------



## ljreyl

Finally finished tweaking my system to perfection!
4790k - 4.5GHz @ 1.148V
295x2 (Sapphire OC Bios) - 1071 Core / 1450 Memory @ Stock Volts (1.231V)
290x (Sapphire Stock Bios) - 1071 Core / 1450 Memory @ Stock Volts (1.219V)

The Sapphire OC bios on the 295x2 chip seems like it smoothed out a lot of GPU utilization (could also be the TIM on the PLX chip)

Temps (After playing Crysis 3 @ Very High Textures/Graphics, 2x SMGPU, 1440p)
4790k - 71c, 73c, 70c, 67c
295x2 - 59c (both GPUs)
290x - 61c


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> Finally finished tweaking my system to perfection!
> 4790k - 4.5GHz @ 1.148V
> 295x2 (Sapphire OC Bios) - 1071 Core / 1450 Memory @ Stock Volts (1.231V)
> 290x (Sapphire Stock Bios) - 1071 Core / 1450 Memory @ Stock Volts (1.219V)
> 
> The Sapphire OC bios on the 295x2 chip seems like it smoothed out a lot of GPU utilization (could also be the TIM on the PLX chip)
> 
> Temps (After playing Crysis 3 @ Very High Textures/Graphics, 2x SMGPU, 1440p)
> 4790k - 71c, 73c, 70c, 67c
> 295x2 - 59c (both GPUs)
> 290x - 61c


Those are basically exactly the kind of temps I get. I'm happy with them anyway. Was expensive for 10-15C but it's the only way to do it without compromising form and/or function


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Finally got it sorted..... what the ...... is MS doing with their silly indexing. Anyway the dual 16 lanes on this new board , with the PLX chip seems to have added some additional performance, so no complaints aside from the 13 hrs it took me to get everything up and running (what a waste of a day off work lol)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Direct to die .....


Looks really nice, Syceo. I liked that board. I thought the white light-up msi logo on the PCH heatsink was excellent in terms of looks. I couldn't get my BIOS sorted out when I had it.

I'm hunting a pg278q... they are very very hard to get over here if you aren't ready with cc in hand right when they come up for sale. People snatch them up asap.

Since latest drivers/AB release, I really can't overclock anymore with the AMD pixel clock patch working. I'm guessing your 295x2 doesn't get hotter running a digital signal to the Swift? Whichever GPU I hook my Catleap to has to up its voltage to work properly.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> [/SPOILER]
> 
> Looks really nice, Syceo. I liked that board. I thought the white light-up msi logo on the PCH heatsink was excellent in terms of looks. I couldn't get my BIOS sorted out when I had it.
> 
> I'm hunting a pg278q... they are very very hard to get over here if you aren't ready with cc in hand right when they come up for sale. People snatch them up asap.
> 
> Since latest drivers/AB release, I really can't overclock anymore with the AMD pixel clock patch working. I'm guessing your 295x2 doesn't get hotter running a digital signal to the Swift? Whichever GPU I hook my Catleap to has to up its voltage to work properly.


Cheers mate...

Yeah, the BIOS was a bit tricky to get going; figured it out in the end with the help of YouTube and others who had the same issues with the setup of the board. The Swift works excellently with the setup. I was a bit apprehensive prior to purchasing it because of the TN panel, but I clearly had nothing to worry about as it looks great. The 295x2 runs cooler now given that it's not pushing 4K as the main monitor. I must admit I jumped on the 4K gaming bandwagon a little too early. 4K @ 60Hz looks gorgeous, but you really take a big hit in performance when those frames drop to 40fps in demanding games. Hope you can grab one of these bad boys pretty soon.


----------



## axiumone

Here's my latest run. Still not quite as fast as my previous 4x290 watercooled setup, but we're getting there.

This is pretty far from 24/7 stable clocks, but keep in mind this is done with the stock cooling on the cards. There are two fans on each radiator in push/pull and I have full control over the VRM fan. Having control over ALL of the fans on the card helps tremendously and this still has some room before throttling.



http://www.3dmark.com/fs/2887515


----------



## StillClock1

Hey ljeryl,

I tried using DDU and it created a restore point for me and then rebooted into safe mode. I didn't know how to get out of safe mode (each reboot took me back to safe mode) so I used the restore point.

I also rolled the driver back to 14.4 but I'm still getting up to 74C within a few minutes so I believe the issue to be driver-related. I'm also watching the GPU utilization and it frequently hits 100% and then falls. No noticeable drop in FPS though, and the games are still certainly playable.

Question: Is there something I was supposed to have done from safe mode that would have allowed me to continue uninstalling the driver and get out of the safe mode - or did something go wrong?

In the event I can't get that to work, I'm thinking of doing a new installation of windows on a different disk.

I've had my eye on a pcie SSD or an mpcie SSD for a while as my SSD is not that fast (didn't get a fast one) and I was wondering if you think I would run into any problems (1) using the same cd of windows 7 to install on a different disk, (2) re-arranging the boot order to boot to that new disk, and (3) starting computer life anew on that new disk. Would I need to re-format my hard drives when I reconnect them?

Thanks for any help.


----------



## joeh4384

Quote:


> Originally Posted by *axiumone*
> 
> Here's my latest run. Still no quite as fast as my previous 4x290 watercooled set up, but we're getting there.
> 
> This is pretty far from 24/7 stable clocks, but keep in mind this is done with the stock cooling on the cards. There are two fans on each radiator in push/pull and I have full control over the VRM fan. Having control over ALL of the fans on the card helps tremendously and this still has some room before throttling.
> 
> 
> 
> http://www.3dmark.com/fs/2887515


How do you have control on the VRM fan?


----------



## axiumone

Quote:


> Originally Posted by *joeh4384*
> 
> How do you have control on the VRM fan?


http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/2600_100#post_22925485

Details coming soon.


----------



## yifeng3007

xer0h0ur and Mega Man, thank you for all the info, guys, i'll try to consider every possibility and try not to burn anything, lol
Quote:


> Originally Posted by *ljreyl*
> 
> I took apart my 295x2, cleaned off the paste, reapplied thermal paste to the GPUs AND the PLX, reinstalled and BOOM, still crazy utilization but more consistent. Scores ended up slightly high after 3 runs and all scores were consistent as well which means that the PLX is now being cooled adequately.
> To kick it up a notch, I also flashed the 295x2 with the master and slave bios of the Sapphire 295x2 OC bioses. Downclocked the 295 to a reference 295x2 and ran another bench, slightly higher scores!


Can you send me a link or any instructions on how to take apart the 295x2, please?
I just want to change the GPU thermal paste (it is thermal paste, like AC MX-4, and not thermal glue, right?).
I also really want to know if I can apply some thermal paste to the PLX chip...
And how do I apply Fujipoly Ultra (I guess everyone is talking about this one - http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html), and can I do all this on the stock cooler?


----------



## ljreyl

Quote:


> Originally Posted by *Hoff248*
> 
> Hey ljeryl,
> 
> I tried using DDU and it created a restore point for me and then rebooted into safe mode. I didn't know how to get out of safe mode (each reboot took me back to safe mode) so I used the restore point.
> 
> I also rolled the driver back to 14.4 but I'm still getting up to 74C within a few minutes so I believe the issue to be driver-related. I'm also watching the GPU utilization and it frequently hits 100% and then falls. No noticeable drop in FPS though, and the games are still certainly playable.
> 
> Question: Is there something I was supposed to have done from safe mode that would have allowed me to continue uninstalling the driver and get out of the safe mode - or did something go wrong?
> 
> In the event I can't get that to work, I'm thinking of doing a new installation of windows on a different disk.
> 
> I've had my eye on a pcie SSD or an mpcie SSD for a while as my SSD is not that fast (didn't get a fast one) and I was wondering if you think I would run into any problems (1) using the same cd of windows 7 to install on a different disk, (2) re-arranging the boot order to boot to that new disk, and (3) starting computer life anew on that new disk. Would I need to re-format my hard drives when I reconnect them?
> 
> Thanks for any help.


When you run DDU, it should ask you to boot to safe mode, which you've already done.
Then, once it boots, it should open the program automatically (wait a minute). Run the program and it'll reboot OUT of safe mode.
If it does not, go to search and type msconfig. On the Boot tab there should be a tick next to "Safe boot"; untick it (or select Normal startup on the General tab).
Then install the 14.9 WHQL driver.


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> xer0h0ur and Mega Man, thank you for all the info, guys, i'll try to consider every possibility and try not to burn anything, lol
> Can you send me a link or any instruction on how to rip the 295x2, please?
> I just want to change GPU thermal paste (there is a thermal paste, like AC MX-4, and not a thermal glue, right?)
> I also really want to know if i can apply some thermal paste on the PLX chip...
> And how do i apply Fujipoly Ultra (i guess, everyone is talking about this one - http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html) and can i do this all on the stock cooler?


Here's the guide to take it apart (it's the guide i used when fitting my water block)
http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869062.pdf

I would really just put higher static pressure fans on your radiator in push/pull. I used just one Corsair SP 120mm fan in push and my temps never went above 71c.
Also, the PLX chip already has a thermal pad on the stock heatsink, so there's no need.
You can put on new Fujipoly pads, I just don't know the sizes/thicknesses for the stock block. Do your research if you want to go that route. I doubt you'll get much performance out of it though, since the center fan is a factor. Hopefully Axiumone puts out his guide soon; that would be the route to take instead of putting new pads on.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Those are basically exactly the kind of temps I get. I'm happy with them anyway. Was expensive for 10-15C but it's the only way to do it without compromising form and/or function


I was looking at your build log. Are you overclocked? I find it strange you're getting the same temps as me when you have better cooling potential than I do...


----------



## stxe34

Hi there, I just finished my build: a 5960X, Rampage V Extreme, 16GB of 3MHz RAM and my two R9 295x2s. I have a problem: massive lagging in benchmarks and I don't know why... in Heaven it says I'm running 80fps at 4K on ultra, but it looks like 25-30fps. I have checked CrossFire, ULPS and refresh rate... I'm a bit lost... any ideas?


----------



## axiumone

Quote:


> Originally Posted by *stxe34*
> 
> hi there, i just finished my build which is a 5960x, rampage extreme v 16gb of 3mhz ram and my 2 r9 295x2. i have a problem i have massive lagging on benchmarks and i dont know why...in heaven it says im running 80fps at 4k on ultra but it seems like 25-30fps i have checked xfire, upls and refresh rate..im a bit lost..any ideas?


Yep, unfortunately that's what the latest driver brings to the table. I have the same issues with my 2x295x2 crossfire: games show they're running over 100fps, but it feels like 30fps. Since you're on one screen, try different driver versions; it may help.

Write an email to AMD and report your issues, or it will never be fixed.


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> I was looking at your build log. Are you overclocked? I find it strange you're getting the same temps as me when you have better cooling potential than I do...


Yeah my 4790k wasn't seated well or something. It was at 4.8Ghz and stable in IBT but 1 of the cores was literally 25C hotter than the coolest core, and hitting 92C. So that didn't help.

Now I'm on a Ivy-E [email protected] p95 stable. Got my offset running and it idles nice and cool and tops out at 66C in P95. Still haven't had a chance to get my 290x back in the system yet, but it's on my list for the day. Was going to try to do it without draining the loop by setting the case flat on its side


----------



## stxe34

will do thanks


----------



## ocvn

Quote:


> Originally Posted by *stxe34*
> 
> hi there, i just finished my build which is a 5960x, rampage extreme v 16gb of 3mhz ram and my 2 r9 295x2. i have a problem i have massive lagging on benchmarks and i dont know why...in heaven it says im running 80fps at 4k on ultra but it seems like 25-30fps i have checked xfire, upls and refresh rate..im a bit lost..any ideas?


A 4K display is two tile panels connected together by an MST link, so it behaves like a two-display Eyefinity setup. That's why you have the same problem some of us have when we run Eyefinity with crossfired 295x2s. The modded 13.12 driver does the trick. I haven't tested 14.9 to check whether AMD fixed it yet. Sorry for my bad English.


----------



## stxe34

Is that the case with DP 1.2?


----------



## stxe34

Also, it was fine on my X79 board.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> [/SPOILER]
> 
> Looks really nice, Syceo. I liked that board. I thought the white light-up msi logo on the PCH heatsink was excellent in terms of looks. I couldn't get my BIOS sorted out when I had it.
> 
> I'm hunting a pg278q... they are very very hard to get over here if you aren't ready with cc in hand right when they come up for sale. People snatch them up asap.
> 
> Since latest drivers/AB release, I really can't overclock anymore with the AMD pixel clock patch working. I'm guessing your 295x2 doesn't get hotter running a digital signal to the Swift? Whichever GPU I hook my Catleap to has to up its voltage to work properly.


Hey electro, I was reading another thread about the Asus ROG monitor; someone posted a link where they are apparently still available, so here you go (just in case you're still looking for one).

http://www.xoticpc.com/advanced_search_result.php?keywords=pg278q

Good luck


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Hey electro, was reading another thread about the asus rog monitor, someone posted a link where they are apparently still available so here you go (just incase your still looking for one)
> 
> http://www.xoticpc.com/advanced_search_result.php?keywords=pg278q
> 
> Good luck


Thanks man, I took a look at their return policy and didn't like what I saw. I'm waiting for one of the three etailers I use to have them in stock. It may be a while.


----------



## stxe34

Quote:


> Originally Posted by *axiumone*
> 
> Yep, unfortunately that's what the latest driver brings to the table. I have the same issues with my 2x295x2 crossfire. Games are showing that they're over 100fps, but it feels like the game is running at 30fps. Since, you're on one screen, try different driver versions, it may help.
> 
> Write an email to AMD and report your issues or it will never be fixed.


Tried all driver versions, all the same... getting fed up with these cards now, nothing but trouble... driver support is terrible... I think they will be going up for sale very soon. Maybe some Titan Blacks in quad SLI?


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> Here's the guide to take it apart (it's the guide i used when fitting my water block)
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869062.pdf
> 
> I would really just put higher static pressure fans on your radiator in push/pull. I just used (1) corsair SP 120mm fan in push and my temps never went above 71c.
> Also, the PLX chip already has a thermal pad on the stock heatsink, so there's no need.
> You can put the new fujipoly pads, I just don't know the sizes/thickness for a stock block. Do your research if you want to go that route. i doubt you'll get much performance though since the center fan is a factor. Hopefully, Axiumone puts out his guide soon. That would be the route to take instead of puting new pads.


Thanks for the info! I guess I'll just add two Corsair SP fans to each radiator and won't change any TIM or anything...
Quote:


> Originally Posted by *stxe34*
> 
> tried all driver versions all the same...getting fed up with these cards now nothing but trouble...driver support is terrible...think they will be going up for sale very soon..maybe some titan blacks quad sli?


I know that feel, bro... This is so frustrating and I just don't know what to do... I will wait until I get all the cooling I need, install it on the stock cooler, and wait for axiumone's guide on how to control the VRM fan. If that doesn't fix all the stuttering, I will probably switch to 4x 980s or wait for the R9 390X...


----------



## axiumone

Quote:


> Originally Posted by *stxe34*
> 
> tried all driver versions all the same...getting fed up with these cards now nothing but trouble...driver support is terrible...think they will be going up for sale very soon..maybe some titan blacks quad sli?


Quote:


> Originally Posted by *yifeng3007*
> 
> Thank's for the info! I guess i'll just add two Corsair SP fans on each radiator and won't change any TIM or something...
> I know that feel, bro... This is so frustrating and i just don't know what to do... I will wait until i get all the cooling i need, install it on the stock fans and will wait for axiumone's guide on how to control the VRM fan. So if that won't fix all the stuttering, i will probably switch for 4x980's or wait for R9 390x...


Guys, contact AMD. Make your voices heard! Venting about it on the forums won't solve anything; write to anyone at AMD who will listen.

We're all in the same boat and the state of AMD's drivers needs a lot of work.

Start here - https://twitter.com/amd_roy


----------



## stxe34

Quote:


> Originally Posted by *axiumone*
> 
> Guys, contact AMD. Make your voices heard! By venting about it on the forums nothing will be solved. Write to anyone AMD that will listen.
> 
> We're all in the same boat and the state of AMD's drivers needs a lot of work.
> 
> Start here - https://twitter.com/amd_roy


I did file a report on the AMD site... I don't think cooling is my issue... I think the drivers are at fault.


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> Guys, contact AMD. Make your voices heard! By venting about it on the forums nothing will be solved. Write to anyone AMD that will listen.
> 
> We're all in the same boat and the state of AMD's drivers needs a lot of work.
> 
> Start here - https://twitter.com/amd_roy


The problem is that I really don't know what to write! I don't even know what the hell is wrong with my system!
Maybe it's the cooling, maybe it's the drivers, maybe my cards are all faulty...
I can't just write something like: "Hey, my video cards microstutter for no particular reason! Fix it!" Or can I?


----------



## axiumone

Quote:


> Originally Posted by *yifeng3007*
> 
> The problem is that i really don't know what to write! I don't even know what the hell is wrong with my system!
> Maybe it's the cooling, maybe it's the drivers, maybe my cards are all faulty...
> I can't just write something like: "Hey, my videocards microstutter for no particular reasoin! Fix it!" Or can i? :0


I promise you, it's the drivers. My cards are very well cooled, with radiator fans in push/pull and the VRM fan at 100%. The GPUs are at around 65c under load and they still microstutter.

The only solution, again, is to run only one 295x2; I don't notice any stuttering like that with a single card.
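Average FPS alone can hide exactly the microstutter described here: two frametime logs with the same mean can feel completely different. A quick way to quantify it from a frametime export (MSI Afterburner and FRAPS can log per-frame times) is to look at high-percentile frame times. The numbers below are made up for illustration, not captured data:

```python
def percentile(values, p):
    """Linear-interpolated percentile (p in 0-100) of a list of numbers."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

smooth = [10.0] * 100               # steady 10 ms/frame (100 fps)
stutter = [7.0] * 90 + [37.0] * 10  # same 10 ms average, but spiky

for name, log in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 / (sum(log) / len(log))
    p99 = percentile(log, 99)
    print(f"{name}: avg {avg_fps:.0f} fps, 99th percentile {p99:.1f} ms")
# Both logs average 100 fps, but the second spends its worst frames at
# 37 ms (~27 fps), which is what "100 fps that feels like 30" looks like.
```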


----------



## stxe34

Quote:


> Originally Posted by *axiumone*
> 
> I promise you, it's the drivers. My card's are very well cooled. With radiator fans in push/pull and the vrm fan at 100%. GPU's are at around 65c on load and they still micro stutter.


I agree, but it's not like the cards are a week old. Coming from Nvidia, I can't believe how bad AMD drivers are! I saw no real difference with the 14.9 drivers on my old system, considering how long we waited!


----------



## electro2u

I still find it odd that the VRMs don't have temp sensors, or if they do, that we can't monitor them. Then the PLX chips themselves can overheat. Everyone in the thread with two 295x2s who's having trouble with them is on stock cooling. I wonder how Elmy's watercooled cards are behaving.


----------



## stxe34

Quote:


> Originally Posted by *electro2u*
> 
> I still find it odd that the VRMs don't have temp sensors, or if they do, that we can't monitor them. Then the Plx chips themselves can overheat. Everyone in the thread with 2 295x2s that's having trouble with them is on stock cooling. I wonder how Elmy's water cooled cards are behaving.


Both my cards are watercooled...temps are not an issue


----------



## Syceo

Hi guys, anyone else finding the weight of the 295x2 pulling down on the PCIe slot? Anyone know where I can get one of those support brackets? I had to use a bit of PrimoChill acrylic tubing for now, as I didn't realise until today that the card was about 4mm out.


----------



## xer0h0ur

I need to get a level to check it myself.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> I need to get a level to check it myself.


As you can see with the level on, xer0h0ur, it's still way out. What do you reckon, mate? I'm confident that if left unchecked, over time this has got to bend the PCIe slot (how can it not?).

I did a custom flow bridge, so the weight of the lower card must be adding to that strain. I know someone posted a comment on here with an image of what looked like a manufactured support bracket for heavy cards.


----------



## axiumone

Both of my cards are definitely sagging. I ordered a horizontal test bench today to replace my corsair 450D, because I honestly don't feel great about cards putting that much strain on the motherboard.


----------



## Syceo

Quote:


> Originally Posted by *axiumone*
> 
> Both of my cards are definitely sagging. I ordered a horizontal test bench today to replace my corsair 450D, because I honestly don't feel great about cards putting that much strain on the motherboard.


I totally agree with you... eventually the weight of these cards will do some permanent damage to the mobo, of this I'm sure...


----------



## xer0h0ur

Right, well I would also try to limit or eliminate the sagging if it was that bad. People had suggested using fishing line before to suspend the card. That could work as a temporary fix if nothing else. I keep forgetting to get a support made out of acrylic.


----------



## electro2u

Yeah, there's a lot of things about these cards that are not ideal, as I said when someone I shall not name (lest he appear) called me a fanboy. I don't regret buying a 295x2, but that's mostly because it got me into watercooling, which is fun.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> Yeah there's a lot of things about these cards that are not ideal. As I said when someone I shall not name lest he appear called me a fanboy. I don't regret buying a 295x2 but it's mostly because it got me into water cooling, which is fun.


Lol, I think I know who you're referring to; in fact I haven't seen nor heard from him since the 980s surfaced. I too am not a fanboy, and I must admit this whole watercooling experience with the 295 has been a blast, albeit an expensive one. If the rumours regarding the 390X are anything to go by, I totally envisage myself doing trifire with the 295x2 and the 390X... oh boy oh boy, now that's going to be interesting.


----------



## xer0h0ur

Hold the phone. 300 series is going to be crossfire compatible with the 295X2? If this is true then I have no reason at all to slap an EK block on my 290X anymore.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Lol i think i know who your referring too, in fact i haven't seen nor heard from him since the 980's surfaced.


What was amusing to me is I saw him say on the Ares 3 thread that he wanted one.


----------



## joeh4384

Quote:


> Originally Posted by *xer0h0ur*
> 
> Hold the phone. 300 series is going to be crossfire compatible with the 295X2? If this is true then I have no reason at all to slap an EK block on my 290X anymore.


I doubt that. It usually needs to be the same GPU to crossfire. It is going to be hard to resist the urge to upgrade from the Hawaiis to Bermudas.


----------



## xer0h0ur

Perhaps for you ballers out there, but I spent $1500 on this damn thing and haven't even paid it off yet. I can afford to swap out a 290X for a 390X if it's crossfire compatible, but even if they make a 395X2 I won't be able to get it.


----------



## electro2u

Yah, I like my space heater that does math. Been considering selling my 290X. Have a 4790K and a Hero VII I need to put on eBay too. Wish this place had fewer seller requirements.


----------



## ljreyl

Hey Syceo,
Do you have a fire strike extreme score now that you have a PLX chip on your mobo? I'm still curious if there's a performance benefit.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> Yah I like my space heater that does math. Been considering selling my 290x. Have a 4790k and hero VII need to put on eBay too. Wish this place had fewer seller requirements.


Why're you selling, and how much?


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> Why're you selling, and how much?


Got an X79 build and I like it. The processor I'm using (4820K) isn't quite as good, but I love the Rampage platform.


----------



## electro2u

Initial impressions testing 16x vs. 8x PCIE 3.0 on 290x and 295x2:
It DOES make a difference.

Here's what I have experienced so far after replumbing my 290x into my system on the X79 platform instead of Z97:
I wanted to use the 290x on the bottom, with my Catleap plugged into it.
Immediately upon setting the 120Hz resolution up, I could tell things were not running smoothly like they had been this past week. It's super obvious just moving the mouse cursor around; frames are being lost.
So I ran Firestrike. Nasty microstutter.

So I went into my Rampage IV BIOS. It has a PCIe lane estimator, so you can see what will happen if you put a card in each PCIe slot without actually doing it.
First off, I saw the 290x wasn't running at 16x, but I wasn't sure why until I used the lane estimator to simulate a card in the very last slot, where my soundcard was. Having a card in that last slot caused my 290x to run at 8x.

So I took the soundcard out. BAM, microstutter gone. Mouse cursor movement smoothed. Checked GPU-Z and my 290x is at 16x, and so is the 295x2.

More testing to be done.
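GPU-Z shows the negotiated link on Windows; on Linux the same information appears in `sudo lspci -vv` under `LnkSta`. As an aside for anyone checking link width that way, here is a small parser for that line. The sample string is illustrative of typical `lspci` output, not captured from a 295X2:

```python
import re

def parse_lnksta(line):
    """Extract (speed in GT/s, lane width) from an lspci LnkSta line.

    Returns None if the line doesn't contain a link status.
    """
    m = re.search(r"Speed\s+([\d.]+)GT/s,\s+Width\s+x(\d+)", line)
    if not m:
        return None
    return float(m.group(1)), int(m.group(2))

sample = "LnkSta: Speed 8GT/s, Width x16, TrErr- Train- SlotClk+"
print(parse_lnksta(sample))   # (8.0, 16)
```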


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> I promise you, it's the drivers. My card's are very well cooled. With radiator fans in push/pull and the vrm fan at 100%. GPU's are at around 65c on load and they still micro stutter.
> 
> The only solution, again, is to run only one 295x2, as I can't notice any stuttering like that.


I really hope you are right; I just can't believe it's so hard to make the drivers work properly... It's been, what, 4 months since those cards launched?
I just feel like I really lost something when one card performs better than two, lol.
Well, at least those cards will run cooler when I get my fans and when you post how to control the VRM fan.


----------



## stxe34

Well, both my cards with EK waterblocks are now going up for sale... not waiting anymore. If I could I would RMA them and get my money back, but they are over a month old now, so I doubt I will get a refund.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> Initial impressions testing 16x vs. 8x PCIE 3.0 on 290x and 295x2:
> It DOES make a difference.
> 
> Here's what I have experienced so far after replumbing my 290x into my system with x79 platform instead of z97--
> I wanted to use the 290x on the bottom and I wanted to plug my Catleap into the 290x
> Immediately upon setting the 120Hz resolution up I could tell things were not running smoothly like they have been this past week. It is super obvious just moving the mouse cursor around. Frames are being lost.
> So I ran Firestrike. Nasty microstutter.
> 
> So I went into my Rampage IV BIOS> It has a PCIE lane estimator so you can see what will happen if you put a card in each PCIE slot without actually doing it.
> First off, I see the 290x isn't running at 16x... but I'm not sure why until I check the lane estimator and set it to configure a card in the very last slot... where my soundcard is/was. Having a card in that last slot caused my 290x to run at 8x.
> 
> So I took the soundcard out. BAM microstutter gone. Mouse cursor movement smoothed. Check GPU-z and my 290x is at 16x and so is the 295x2.
> 
> More testing to be done.


I'm with you on that. As you know, I changed my board to the MSI XPower AC; the 295x2 and 290x are now running at x16, and from what I have seen it does indeed make a difference.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Im with you on that. As you know i changed my board to the MSI Xpower AC , 295x2 and 290x2 now are running at x16, it does (from what i have seen) indeed make a difference


I don't think it improves fps in benches much if at all, it's something else. I can't explain it except that things "hitch". For example in the 3dmark menus, changing from, say, custom tab to tests tab, at 8x pcie the animation of the menu scrolling over will stutter. At 16x it is smoother, this goes on pretty much everywhere, even just moving the mouse. BTW cpu settings have a significant effect on this stutter as well, using Manual vcore vs adaptive/offset.


That's about as good as my system can do; neither my 295x2 nor my 290x can overclock very high at all. This is at about 1125 MHz core / 1500 MHz memory.
I think CPU speed affects these scores more than it should, particularly the combined test.

Edit: It's less about adaptive vs. manual vcore settings than just turning SpeedStep off. SpeedStep is horrible.

P.S. Load testufo.com in Chrome and see if it's hitching at all. It should be perfectly smooth, and the test should stay VALID.
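If you log frame times (FRAPS, Afterburner, etc.), you can put a number on the hitching instead of eyeballing it. A minimal sketch, using an illustrative 1.5x-median spike threshold (a rule of thumb, not an official metric):

```python
# Minimal sketch: quantify "hitching" from a frame-time log (values in ms),
# e.g. exported from FRAPS or Afterburner. A frame counts as a stutter spike
# if it takes more than 1.5x the median frame time (illustrative threshold).
from statistics import median

def stutter_report(frame_times_ms, spike_factor=1.5):
    med = median(frame_times_ms)
    spikes = [t for t in frame_times_ms if t > spike_factor * med]
    return {
        "median_ms": med,
        "spike_count": len(spikes),
        "spike_pct": 100.0 * len(spikes) / len(frame_times_ms),
    }

# Example: a mostly smooth 120 Hz capture (~8.3 ms/frame) with a few hitches
times = [8.3] * 117 + [16.9, 25.1, 17.2]
report = stutter_report(times)
print(report["spike_count"])  # the three long frames stand out
```

Two systems that "feel" different could then be compared on spike percentage rather than benchmark score.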


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> I don't think it improves fps in benches much if at all, it's something else. I can't explain it except that things "hitch". For example in the 3dmark menus, changing from, say, custom tab to tests tab, at 8x pcie the animation of the menu scrolling over will stutter. At 16x it is smoother, this goes on pretty much everywhere, even just moving the mouse. BTW cpu settings have a significant effect on this stutter as well, using Manual vcore vs adaptive/offset.
> 
> 
> That's about as good as my system can do. Neither my 295x2 nor my 290x can overclock very high at all. This is like 1125cores 1500mem
> I think CPU speed affects these scores more than it should. Particularly the combined test.


JayzTwoCents did a video not too long ago mentioning a problem with his 3-way SLI setup: when he underclocked his CPU, he noticed that his performance increased significantly.


----------



## Syceo

Guys, your thoughts please...

Do you think it's worthwhile pairing the 295x2 with an Asus ROG Ares III (dual-GPU R9 290X), or do you think that's overkill for 1440p on an Asus Swift?


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> I don't think it improves fps in benches much if at all, it's something else. I can't explain it except that things "hitch". For example in the 3dmark menus, changing from, say, custom tab to tests tab, at 8x pcie the animation of the menu scrolling over will stutter. At 16x it is smoother, this goes on pretty much everywhere, even just moving the mouse. BTW cpu settings have a significant effect on this stutter as well, using Manual vcore vs adaptive/offset.
> 
> 
> That's about as good as my system can do. Neither my 295x2 nor my 290x can overclock very high at all. This is like 1125cores 1500mem
> I think CPU speed affects these scores more than it should. Particularly the combined test.
> 
> Edit: Well less adaptive vs manual vcore settings than just turning speedstep off. Speedstep is horrible.
> 
> P.S. Use testufo.com in chrome and see if it's hitching at all. It should be perfectly smooth and it should stay valid.


I find it strange that, at least for me, I have not seen any microstutter or any of these hitches on my system. Everything has been super smooth (after I put thermal grease on the PLX chip). There has to be a way to compare this between us. The FPS gain seems niche when comparing Z97 vs. X79, or PCIe 3.0 @ 8x vs. PCIe 3.0 @ 16x.

I really don't know how we can compare, though. It seems like this is a "feeling" type of thing... and on my side, I feel super smooth and consistent.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Guys your thoughts please.....
> 
> Do you think its worth while pairing the 295x2 with an Asus ROG Ares III AMD Radeon Dual Core R9 290X or do you think this is overkill for 1440 on Asus swifts?


Not worth it; quadfire scaling is terrible compared to trifire scaling, and the extra money isn't worth the scaling issues.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Guys your thoughts please.....
> 
> Do you think its worth while pairing the 295x2 with an Asus ROG Ares III AMD Radeon Dual Core R9 290X or do you think this is overkill for 1440 on Asus swifts?


What is this word overkill? What does it mean?


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> I don't think it improves fps in benches much if at all, it's something else. I can't explain it except that things "hitch". For example in the 3dmark menus, changing from, say, custom tab to tests tab, at 8x pcie the animation of the menu scrolling over will stutter. At 16x it is smoother, this goes on pretty much everywhere, even just moving the mouse. BTW cpu settings have a significant effect on this stutter as well, using Manual vcore vs adaptive/offset.
> 
> 
> That's about as good as my system can do. Neither my 295x2 nor my 290x can overclock very high at all. This is like 1125cores 1500mem
> I think CPU speed affects these scores more than it should. Particularly the combined test.
> 
> Edit: Well less adaptive vs manual vcore settings than just turning speedstep off. Speedstep is horrible.
> 
> P.S. Use testufo.com in chrome and see if it's hitching at all. It should be perfectly smooth and it should stay valid.


And this score still shows me you have problems with your setup.

My 4790K is at 4.5 GHz and the GPUs are at 1071/1450... LOWER CLOCKS THAN YOURS, and I STILL get a higher score.
http://www.3dmark.com/fs/2901244

One issue I did find is with manual voltage. It turns out I was in a voltage zone where my CPU was starved for volts and was kind of throttling itself to compensate. I would try raising your CPU voltage a couple of steps.


----------



## ljreyl

I ran TestUFO... got my valid result in case you're wondering.


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> And this score still shows me you have problems with your setup.
> 
> The 4790k is at 4.5 GHz and the GPUs are at 1071/1450... LOWER CLOCKS THAN YOU and I STILL get a higher score.
> http://www.3dmark.com/fs/2901244
> 
> One issue I did find is using manual voltage. Turns out I was in the voltage zone where my CPU was starved for volts and was kinda throttling itself to compensate. I would try raising voltage on your CPU a couple steps.


I'm not having any issues... I'm just not on Haswell. Your CPU is way faster.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> I'm not having any issues... I'm not using a Haswell. Your cpu is way faster.


What's your CPU clock? I want to downclock and do a comparison.

Edit: I did a check, and it looks like you turbo at 3.9 GHz. Did you overclock, and if so, to what speed?


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> What's your CPU clock? I want to downclock and do a comparison.


We're not on the same platform; your CPU is faster than mine at the SAME CLOCKS, and I'm not interested in competing with you over 200 points in Firestrike.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> We're not on the same platform, your cpu is faster than mine at the SAME CLOCKS and I'm not interested in competing with you over 200 points in Firestrike.


It's not competing; it's figuring out this whole PCIe 8x vs. 16x, PLX chips, platforms, microstutter thing.
I personally don't care if anyone gets "X" score or whatnot. I DO care to understand what each person has, and the correlation in performance with a similar GPU setup.

The past couple of pages of this thread talked heavily about PLX chips and PCIe lanes and which is smoother or faster, and I really want to investigate it.
Again, I'm not trying to compete; I'm trying to learn. And I'm sure there are other people here who want facts based on tests and evidence as well.


----------



## xer0h0ur

Well, I bumped my turbo boost overclock 100 MHz, from 4.1 to 4.2 GHz, just to see if it made any difference. The overall score dropped 5 points; the Graphics and Combined scores went down while the Physics score went up.

Before: 12744 http://www.3dmark.com/fs/2893610

After: 12739 http://www.3dmark.com/fs/2902107

Now I am curious to revert the system back to stock clocks and run a test to see if score goes up or not.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I bumped my turbo boost overclock 100MHz from 4.1 to 4.2 GHz just to see if it made any difference. Overall score dropped 5 points, Graphics and Combined scores went down with the Physics score going up.
> 
> Before: 12744 http://www.3dmark.com/fs/2893610
> 
> After: 12739 http://www.3dmark.com/fs/2902107
> 
> Now I am curious to revert the system back to stock clocks and run a test to see if score goes up or not.


I had that issue. I scored a full 1000 points lower, all because my voltage was 0.001 V too low.

Try upping the voltage. At 4.5 GHz and 1.147 V, I got this score:
http://www.3dmark.com/fs/2893903

Raising it 0.001 V to 1.148 V, I get this score:
http://www.3dmark.com/fs/2901244

I don't know what voltage you would need, though.


----------



## xer0h0ur

As is, I'm not completely certain the OC is stable. It took me several BIOS setting changes to make it through a 30-minute stress test in XTU, since I wasn't running enough additional turbo boost voltage. I had to bump it by somewhere around 100 mV before it completed the 30-minute test.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well as is I am not completely certain that the OC is stable. It took me several setting changes to make it through a 30 minute stress test in the XTU since I wasn't running enough additional turbo boost voltage. Had to bump it somewhere around 100mV before it completed the 30 min stress test.


Ahh I see. Well, when you get it stable, let me know the results please


----------



## xer0h0ur

Oh, btw, I don't know how it works out for you guys, but XTU is virtually useless for me when it comes to turbo boost overclocking, since 3DMark will only recognize my max boost clock from the BIOS settings. So I had to fumble about in my extremely limited BIOS to find that, for me, the "VID Override for Max Turbo Ratio" setting was what modified the additional turbo boost voltage.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh btw I don't know how it works out for you guys but the XTU is virtually useless for me when it comes to turbo boost overclocking since 3dmark will only recognize or use my max boost clock from BIOS settings. So I had to fumble about in my extremely limited BIOS settings to find out that for me my "VID Override for Max Turbo Ratio" setting was what modified my additional turbo boost voltage.


Dang, that sucks... at least you're able to do it.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh btw I don't know how it works out for you guys but the XTU is virtually useless for me when it comes to turbo boost overclocking since 3dmark will only recognize or use my max boost clock from BIOS settings. So I had to fumble about in my extremely limited BIOS settings to find out that for me my "VID Override for Max Turbo Ratio" setting was what modified my additional turbo boost voltage.


Since you have an Alienware mobo, is it possible for you to disable 2 cores? If so, can you do that and run FSE? I want to see if there's a performance benefit with the same core count, the same speeds (I'll downclock), and the same RAM speeds (I'll downclock again), but different bandwidth (PCIe 3.0 @ 16x/16x for you vs. PCIe 3.0 @ 8x/8x for me) for trifire.


----------



## xer0h0ur

Both of my cards are running at 16x, and as far as I know I can't disable cores in the BIOS, but I will check when I get off work tonight.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Both of my cards are running 16x and far as I know I can't disable cores in the BIOS but I will check when I get off work later tonight.


Sounds good. And that's even better that they're running at 16x each!


----------



## xer0h0ur

Turns out there was an option to limit it to 4 active cores. This was the result: http://www.3dmark.com/3dm/4272681?


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Turns out there was an option to limit it to 4 active cores. This was the result: http://www.3dmark.com/3dm/4272681?


I copied your clocks (CCC went up to 1061 for some reason, though -_-), RAM speed, and CPU speed. Here are my results:
http://www.3dmark.com/3dm/4272752?

I ran both GPUs at PCIe 3.0 8x each.

You were running at 16x for both, right?

I think one of your cards was throttling?
The CPU scores are slightly different too... hmmm...


----------



## xer0h0ur

Remember that only my 295X2 is waterblocked; the 290X is air-cooled at the moment. If you mean thermal throttling, then no, there isn't any; the clocks are running at full speed. As for the PCIe slots, yes, they are both still running at 16x.

So what are you getting at, anyway? I only see a 480-point difference between scores. My processor is not the same as yours, so the scores will never be equal, all things considered. Even with two cores turned off, it still has a larger cache and 40 PCIe lanes versus your 16.

Edit: I actually forgot the largest difference: the motherboard. My mobo is nothing special at all. I wouldn't be surprised if my scores were much higher on an Asus mobo, if I ever change it.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Remember that only my 295X2 is running waterblocked while the 290X is air cooled at the moment. If you mean thermal throttling then no there isn't any thermal throttling as the clocks are running full speed. As for the PCIe slots yes they are both still running at 16x.
> 
> So what are you getting at anyways? I only see a 480 point difference between scores. My processor is not the same as yours so the scores will never be equal all things considered. Even if I turn off two cores its still has a larger cache and 40 PCIe lanes versus your 16.
> 
> Edit: I actually forgot the largest difference. Motherboard. My mobo is nothing special at all. I wouldn't be surprised if my scores were much higher on an Asus mobo if I ever change it.


I was mainly aiming to see if your 40 lanes would open up your cards more than my 16 lanes. I understood that we have different CPU architectures and a different mobo; I just tried to eliminate as many of the programmable variables as possible.

All our scores showed me is that PCIe 3.0 @ 8x is really all you need for a trifire (and possibly quadfire) setup.

DigitalStorm already had benches for a 6-core with a Rampage IV, so I was able to compare against a faster CPU with more lanes than mine, and performance was nearly identical.
Your scores further solidify that, at the same GPU, CPU, and RAM speeds, 8x really is enough.

I know you didn't have to run a bench with specific settings, so thank you for doing that.
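A quick way to sanity-check whether two scores actually differ is to compute the percentage gap and compare it against run-to-run noise. The scores below are hypothetical stand-ins for a pair of runs (the thread saw a roughly 480-point gap), and the ~1% noise figure is a rule of thumb, not a 3DMark spec:

```python
# Sketch: is a Fire Strike score delta meaningful, or within run-to-run noise?
# Scores are hypothetical stand-ins; the ~1% threshold is a rule of thumb.
def pct_diff(a, b):
    return 100.0 * abs(a - b) / max(a, b)

score_a = 13494  # hypothetical run on one setup
score_b = 13014  # hypothetical run on the other (a ~480-point gap)

print("delta: %.2f%%" % pct_diff(score_a, score_b))
# Anything under ~1% is typical run-to-run noise; a gap of a few percent may
# still be explained by CPU/platform differences rather than PCIe bandwidth.
```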


----------



## xer0h0ur

So ultimately you just wanted to compare performance at 8x versus 16x? The difference isn't going to be very much, at least when measured through 3DMark benches.


----------



## Elmy

http://www.3dmark.com/3dm/4273717?

4770K @ 4.6 GHz
RAM @ 1600 8-8-8-24 2T
Cards @ 1049/1375 with a 10% bump in power limit and core voltage
Asus Sabertooth Z87 mobo
with 2 Club3D 295X2s

Running 14.7 RC3s... 14.9 doesn't work; only the 14.7 RC3 and 14.8 drivers work on my setup :-/


----------



## xer0h0ur

Can you elaborate on what you mean by "14.9 doesn't work"?


----------



## Elmy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Can you elaborate what you mean by 14.9 doesn't work.


For my 5-monitor setup... crazy screen tearing...


----------



## xer0h0ur

Oh riiiiiiiiiight. I completely forgot you had that setup. I have been hoping that they fix those issues in the near future so I can potentially do a multi-monitor setup too.


----------



## electro2u

Like I said, the difference between x16 and x8 isn't showing up in FPS; it's a question of smoothness. I have a switch at the moment to go back and forth between x16 and x8 on just the card I'm outputting to my monitor (maybe it's even more obvious to me because I'm overclocking the monitor, I don't know). At x8 my desktop is not smooth; at x16 it is. There's a very obvious difference just moving folders and menus around. Maybe it's less noticeable on your Swift, or maybe you just don't have a frame of reference for comparison.

At any rate, it's the same effect on the 295x2 and the 290x: they both output smoother to my Catleap @ 120Hz when they're set to x16 instead of x8. So I don't care about benchmark scores as long as they're in the ballpark of where they should be.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> So ultimately you just wanted to compare performance at 8x versus 16x? Its not going to be very much at least if measured through 3dmark benches.


That was the general idea. We minimized as many variables as possible, with the mobo, OS, and CPU (and I guess RAM size) being the main differences.
3DMark was chosen because it's the most widely used benching software, one that practically everyone on this forum has. It's also synthetic, which means consistent tests and consistent results, and it's more GPU-bound. I didn't see anyone talking about games with built-in benchmarks like Tomb Raider, BioShock Infinite, or Shadow of Mordor, so 3DMark won by default. And judging from the X79 vs. Z97 comparison we both just did, alongside my personal experience vs. the DigitalStorm/HardOCP trifire review, there is no evidence of anything to be gained from a GPU standpoint between platforms. They both give the same performance.
Quote:


> Originally Posted by *electro2u*
> 
> Like I said, the difference between x16 and x8 isn't showing up in fps. It's a question of smoothness. I have a switch at the moment to go back and forth between x16 and x8 on just the card I'm outputting to my monitor (maybe it's even more obvious to me because I'm overclocking the monitor, idk). On x8 it is not smooth on my desktop. On x16 it is. Very obvious difference just moving folders and menus around. Maybe it's less noticeable on your Swift or maybe you just don't have a frame for comparison.
> 
> At any rate it's the same effect on the 295x2 and the 290x. They both output smoother to my Catleap @120hz when they are set to x16 lanes instead of x8. So I don't care about benchmark scores as long as they are ballpark where they should be.


When it comes to smoothness, I personally haven't noticed or felt any hitching, micro-stutter, etc., when running a dual-monitor setup (BenQ XL2420T @ 120Hz 1080p in portrait, ROG Swift @ 120Hz 1440p in landscape), whether in games or on the desktop. I also run my computer extremely spartan, meaning the only things that load with the OS are drivers. That could also boost performance/smoothness, with fewer items using resources in the background.

There are too many freaking variables to make a perfect apples-to-apples comparison, so the data that was gathered can either be taken into account or left alone. My own conclusion is that I don't need to move to a different platform for 40 PCIe lanes, nor do I need a mobo with a PLX chip. My Z97 and 16 lanes are all that's needed to maximize this trifire setup.


----------



## ljreyl

Joined the top 100 club for a 3 GPU setup on FireStrike Extreme. =)
http://www.3dmark.com/fs/2909766

I'm number 92 - http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/3+gpu


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> There's too many freaking variables to make a perfect apple - apples comparison. So the data that was gathered can either be taken into account, or left alone. My own conclusion is that I don't need to move to a different platform for 40 PCIE lanes or need a mobo with a PLX chip. My Z97 and 16 lanes are all that's needed to maximize this trifire setup.


No one is trying to say you *should* change anything, and I don't care if your cards are maximized or not. I care about my experience, and on my system there is a noticeable difference running these cards at x16 vs. x8. FWIW, I wouldn't have known there was an issue if I hadn't tested. The difference is obvious to me, but running the cards at x8 doesn't present a "broken" scenario; it just isn't as smooth. I went from 8x/8x on Z97 with a very fast 4790K to a slower 4820K on X79 with just the 295x2 at x16, and I immediately noticed better response and performance on the desktop, where I spend most of my time. As for gaming performance and benchmarking, the raw FPS just isn't affected. I wish it were.

I'm actually pretty disappointed that I won't get to use my 4790K, because it is blazing fast (and it's why you're consistently on top in Firestrike Extreme; that benchmark is definitely affected by CPU power). But I ultimately value smoothness over benchmark numbers, so it's, unfortunately, an obvious choice for me.

Edit: If you want to test whether you can tell the difference in "feel", just unplug the power to one of your cards so the one your Swift is on runs at x16, and move stuff around on the desktop. It's not a HUGE, earth-shattering difference, but it's obvious.


----------



## electro2u

Quote:


> Originally Posted by *Elmy*
> 
> http://www.3dmark.com/3dm/4273717?
> 
> 4770K @ 4.6
> Ram @ 1600 8-8-8-24 2T
> Cards @ 1049/1375 10% bump in power limit and core voltage.
> Asus Sabertooth Z87 MB
> with 2 Club3D 295X2's
> 
> 14.7 RC3's ... 14.9 don't work .... only 14.7 RC3's and 14.8's work on my setup :-/


Hi Elmy, thanks for weighing in on this. I imagine you are busy with more important things than this, but have you ever considered switching the 4770K out for an X99 offering? I think it might make a significant difference in your cards' frametime output. I'm shocked that AMD still hasn't fixed the Eyefinity issues with the 295x2...


----------



## Elmy

Quote:


> Originally Posted by *electro2u*
> 
> Hi Elmy, thanks for weighing in on this. I imagine you are busy with more important things than this but do you ever consider switching the 4770k out for a X99 offering? I think it might make a significant difference in your cards frametime output. I'm shocked that AMD still has not fixed the eyefinity issues with the 295x2...


Yup... working on my new build right now, just started. One of my sponsors is working with Intel to try to get me a 5960X; if that falls through I might only get the 5930K. I'll know within the next couple of weeks.

I'm also thinking of switching my 5 monitors out for an LG 34UM95, just until the new 120Hz 21:9 panels come in... though I'm waiting to find out about the FreeSync monitors due in Q1 2015 first.


----------



## ljreyl

Not AMD cards, but here's another look at 16x vs. 8x for multi-GPU, posted today on Linus Tech Tips.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> No one is trying to say *should* change anything and I don't care if your cards are maximized or not. I care about my experience. On my system there is a noticeable difference running these cards at x16 and x8. FWIW I wouldn't have known there was an issue if I hadn't tested. The difference is obvious to me, but running the cards in 8x doesn't present a "broken" type scenario, it just isn't as smooth. I went from 8x/8x on a z97 w a very fast 4790k to a slower 4820k on x79 and only the 295x2 at 16x and immediately noticed better response and performance on the desktop, where I spend most of my time. As for gaming performance and benchmarking, the raw fps just isn't affected--I wish it were.
> 
> I'm actually pretty disappointed that I won't get to use my 4790k because it is blazing fast (and it is why you're consistently on top in firestrike extreme, because that benchmark is definitely affected by CPU power). But I value smoothness over benchmark numbers ultimately, so it's, unfortunately, an obvious choice for me.
> 
> Edit: If you actually wanted to test if you could tell the difference in "feel" just unplug the power to one of your cards so the one your swift is on is at x16 and move stuff around on the desktop. It's not a HUGE earthshattering difference I'm seeing, but it's obvious.


I'm not trying to fight with you, dude, lol. I just wanted to run the tests after reading the last couple pages of debate on PLX and 16x/8x.
I disabled Crossfire so only one GPU was active, AND I unplugged my 290x, and I didn't notice a difference; although I run both monitors at 120Hz, so I doubt I would have seen a difference anyway.

I really want to replicate this jitter just so I can learn from it and understand what's causing it. Oh well...


----------



## StillClock1

Hey ljreyl,

I tried the DDU safe-mode procedure you described above, and now I'm having issues with my computer's system time. It looks like it reset the time to about when I used the restore point, but it seems to be skipping around now. I'm thinking it could be the CMOS battery, but since it's not resetting to 1/1/1900, I'm hoping that's not the case. The CMOS battery on the Asus Formula mobo is very difficult to get to, so I'm hoping that's not it.

Have you ever had any issues with the system time in connection with using a restore point?

I'm going to reinstall the OS next weekend anyway, so hopefully that will fix it. If not, I'll know it's the CMOS (which would suck, because then I'd need to remove the mobo to address it).

Thanks for any help


----------



## ljreyl

Quote:


> Originally Posted by *Hoff248*
> 
> Hey lreyjl,
> 
> I tried the DDU, safe mode thing you described above and now I am having issues with my computer's system time. It looks like it reset the time to about when I used the restore point but seems to be skipping around now. I'm thinking it could be the CMOS battery, but since its not setting to 1/1/1900 I am hoping that is not the case. The Asus formula mobo is very difficult to get to the CMOS battery so I am hoping that's not it.
> 
> Have you ever had any issues with the system time in connection to using a restore point?
> 
> I'm going to reinstall the OS next weekend anyway, so hopefully that will fix it. If not, I'll know its the CMOS (which would suck because then I need to remove the mobo to address it).
> 
> Thanks for any help


Did you try going into your time settings and syncing it via Internet time?
It's really strange that you're experiencing that.


----------



## StillClock1

I set it manually, but I'll give that a shot. One interesting thing is that my ROG front base panel shows a different time than the clock in the bottom-right corner.


----------



## Syceo

Quote:


> Originally Posted by *Hoff248*
> 
> I set it manually, but I'll give that a shot. One part that is interesting is that my ROG front base panel gives a different time than the clock on the bottom right corner.


I experienced the exact same issue after flushing the CMOS. If you just sync with the Internet clock, it will fix it.


----------



## StillClock1

That's great to hear; glad I'm not alone, because these are very unexpected issues to have resulted just from trying an AMD beta driver for the R9 295X2.

What did you mean by "flushing via CMOS"? Did you have to replace your battery?


----------



## stxe34

I have two R9 295X2 EK water blocks for sale. They come complete with bolts to use the original backplate. Thanks.


----------



## axiumone

Getting rid of the cards already?


----------



## Syceo

Quote:


> Originally Posted by *Hoff248*
> 
> That's great to hear, glad to hear I am not alone because these are very unexpected issues to have resulted just from trying an AMD beta driver for the r9 295x2.
> 
> What do you mean when you said flushing via CMOS? Did you have to replace your battery?


No, what I meant is that the board I'm using has a button you press that resets the CMOS, so you don't have to physically pull the battery out... it "flushes" the board. That's what I meant.


----------



## stxe34

Quote:


> Originally Posted by *axiumone*
> 
> Getting rid of the cards already?


yep


----------



## StillClock1

Great advice; Internet time fixed both the front base panel clock and the computer time. Thank you!


----------



## Syceo

Sacrilege, guys, lol. I'm thinking of trying out a pair of water-cooled 980s to go with this Asus ROG 144Hz monitor. Do you think it's a worthwhile endeavour? You guys have been great with constructive recommendations, so I'd like your input on 295x2 vs. 980 SLI. I'm not a fanboy of either; it's just that now that I have this amazing monitor, I'm starting to feel like it's under-utilised. I would also like to reduce the noise of my system, which is another factor. With two 980s I could get rid of the two fans on the top rad, and I'd have a near-silent system.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Sacrilege guys lol, I'm thinking of trying out a pair of water cooled 980s to go with this Asus Rog 144hz monitor. Do you guys think it's a worthwhile endeavour, you guys have been great in terms of constructive recommendations so I would like to get your input or thoughts on 295x2 vs 980 Sli. Im not a fan boy of either, it's just now I have this amazing monitor I'm starting to feel like it's under utilised. I would also like to completely reduce the noise of my systems, which is another factor. With 2 980s I could get rid of 2 fans on the top rad, and I'd pretty much have a near silent system.


The 980 wins hands down! But dang man... buying 2 more GPUs after you just got the Swift. That's a lot of money in a short period of time.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> The 980 wins hands down! But dang man... buying 2 more GPUs after you just got the Swift. That's a lot of money in a short period of time.


What I was thinking was to get them and, worst-case scenario, sell the 295x2 and maybe the 290x.


----------



## dcombs108

Anyone have a tutorial for replacing the stock rad on the 295 with a 240mm rad?


----------



## Syceo

Quote:


> Originally Posted by *dcombs108*
> 
> Any one have a tutorial to add 240mm rad to replace the stock rad on the 295?


Can't be done. The stock cooler is a sealed closed loop; you simply can't cut the tubes and add a rad.


----------



## stxe34

I'm going for 4x Titan Black 6GB for a 4K Surround setup. In my experience, AMD has lost this one because of their shoddy driver support and duff stock cooling setup.


----------



## Santho

Ok so I just benchmarked my R9 295X2 in Thief (the in-game benchmark), and I saw that only one of the GPUs was active. GPU 2 only got up to 49%, and when it got that high it went straight back to 0 again. GPU 2 jumped between 0-19% on average, but it only stayed at 19% for a second or two. Is this because I'm not stressing the card enough? BTW, this was a 1920x1080 benchmark with everything at max. I know 1080p on this card is a waste, but I just got the card and don't have money for a better monitor yet; I'm planning on going 4K very soon though. I just hope it's not faulty hardware.

Plz help me on this!

Thx in advance
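One way to turn this into data rather than a feeling: log GPU usage during the benchmark and average it per GPU. The sketch below assumes a simple CSV with `gpu1_usage`/`gpu2_usage` columns; MSI Afterburner's native log format differs, so treat the column names and layout as illustrative:

```python
# Sketch: average per-GPU usage from a monitoring log to check whether the
# second GPU in a 295X2 is actually being used. Assumes a simple CSV with
# "gpu1_usage,gpu2_usage" columns exported from your monitoring tool;
# Afterburner's native log format differs, so this is illustrative.
import csv
import io

LOG = """gpu1_usage,gpu2_usage
98,5
97,12
99,0
96,19
"""

def avg_usage(csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return {
        col: sum(float(r[col]) for r in rows) / len(rows)
        for col in rows[0]
    }

usage = avg_usage(LOG)
# A healthy Crossfire load shows both GPUs high; GPU 2 idling near 0-20%
# while GPU 1 is pegged points at a Crossfire profile or fullscreen issue.
```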


----------



## electro2u

Quote:


> Originally Posted by *Santho*
> 
> Ok so I just benchmarked my r9 295x2 on Theif (The ingame benchmark) And i saw that only one of the gpu's was active. Gpu 2 only got up to 49% and when it got that high i went straight back to 0 again. gpu 2 jumped between 0-19% at average, but it only stayed at 19 for a second or two. Is this because im not stressing the card enough? Btw this was a 1920:1080p benchmark with everything at max. I know 1080p on that card is a waste but I just got the card and dont have money for a better monitor just yet. planning on going 4k very soon though. I just hope it's not faulty hardware


I'll take a look at Thief after it downloads; I've never played it, but I do have it. I should say that no matter what res you're at, both GPUs should be utilized in Crossfire.
#1 The game needs to be fullscreen
#2 Crossfire needs to be enabled in Catalyst Control Center


----------



## Santho

Quote:


> Originally Posted by *electro2u*
> 
> I'll take a look at Thief after it downloads; I haven't ever played it, but I do have it. I should say that no matter what res you're at, both GPUs should be utilized in Crossfire.
> #1 The game needs to be fullscreen
> #2 Crossfire needs to be enabled in Catalyst Control Center


OK, thanks dude!
I do have fullscreen on and Crossfire enabled.


----------



## joeh4384

Quote:


> Originally Posted by *Santho*
> 
> OK, so I just benchmarked my R9 295X2 in Thief (the in-game benchmark) and saw that only one of the GPUs was active. GPU 2 only got up to 49%, and when it got that high it went straight back to 0 again. GPU 2 jumped between 0 and 19% on average, but it only stayed at 19 for a second or two. Is this because I'm not stressing the card enough? BTW, this was a 1920x1080 benchmark with everything at max. I know 1080p on this card is a waste, but I just got it and don't have money for a better monitor yet; I'm planning on going 4K very soon, though. I just hope it's not faulty hardware.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please help me with this!
> 
> Thanks in advance.


I believe in Thief you can use Mantle and Crossfire even when running in windowed mode.


----------



## electro2u

Quote:


> Originally Posted by *Santho*
> 
> I do have fullscreen on and Crossfire enabled.


I think I would definitely try to disable the ULPS function in MSI Afterburner.

Is your second GPU registering a temperature?
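Since ULPS comes up constantly in this thread: besides Afterburner's checkbox, the commonly cited manual route is setting the `EnableUlps` DWORD to 0 under each AMD adapter subkey of Windows' standard display-adapter class key. Here's a minimal sketch that only *builds* the `reg add` commands rather than touching the registry; the `0000`/`0001` subkey numbers are placeholders for whatever adapter subkeys exist on your machine, so enumerate yours first:

```python
# Sketch: build the reg.exe commands that set EnableUlps=0 for each AMD
# display-adapter subkey. {4d36e968-...} is the standard Windows display
# adapter class GUID; the subkey numbers below are examples only.
DISPLAY_CLASS = (r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_disable_commands(subkeys):
    """Return one 'reg add' command per adapter subkey (run as admin)."""
    return [
        rf'reg add "{DISPLAY_CLASS}\{sub}" /v EnableUlps /t REG_DWORD /d 0 /f'
        for sub in subkeys
    ]

for cmd in ulps_disable_commands(["0000", "0001"]):
    print(cmd)
```

Run the printed commands from an elevated prompt and reboot; setting the value back to 1 re-enables ULPS.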


----------



## Santho

Quote:


> Originally Posted by *electro2u*
> 
> I think I would definitely try to disable the ULPS function in MSI Afterburner.
> 
> Is your second GPU registering a temperature?


I have disabled ULPS, and yes, my GPU 2 is registering temps.
Quote:


> Originally Posted by *joeh4384*
> 
> I believe in Thief, you can use Mantle and crossfire when running in windowed mode.


I have enabled the Mantle Crossfire feature.


----------



## Syceo

Well guys, pulled the trigger on a pair of 980s, so let's see how they stack up to this trifire setup.


----------



## DividebyZERO

Quote:


> Originally Posted by *Syceo*
> 
> Well guys, pulled the trigger on a pair of 980s, so let's see how they stack up to this trifire setup.


The trifire should be way faster unless you're at 1080p. Otherwise, give or take, 2x 980 vs. 2x 290X should yield about the same results.


----------



## ljreyl

Quote:


> Originally Posted by *DividebyZERO*
> 
> The trifire should be way faster unless you're at 1080p. Otherwise, give or take, 2x 980 vs. 2x 290X should yield about the same results.


http://www.guru3d.com/articles_pages/geforce_gtx_980_sli_review,18.html

This should give you a general idea. Of course, they don't have trifire benches, but add about 40% to what the 295X2 puts out.
E.g. if the 295X2 does 100 FPS, trifire should be around the 140 FPS mark.
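That rule of thumb is simple enough to write down. A throwaway sketch of the arithmetic; the 40% uplift is the estimate from this post, not a measured number, and real scaling varies per game:

```python
def scale_fps(base_fps, uplift=0.40):
    """Estimate trifire FPS from a 295X2 baseline.

    uplift: assumed extra scaling from the third GPU (~40% per the
    rule of thumb above; actual scaling varies wildly per title).
    """
    return base_fps * (1 + uplift)

# 295X2 at 100 FPS -> trifire around the 140 FPS mark
print(round(scale_fps(100)))  # 140
```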


----------



## Syceo

Quote:


> Originally Posted by *DividebyZERO*
> 
> The trifire should be way faster unless you're at 1080p. Otherwise, give or take, 2x 980 vs. 2x 290X should yield about the same results.


Trifire will be faster, but for what I want to achieve at 1440p (I've stepped back from 4K), 2x 980 seems like a viable option too. I'm going for stealth and quiet now, and I'm getting an annoying hum from these fans I have in push/pull. If I run 2x 980 under water, I can remove a set of fans, have room to overclock the cards, and still run max frames on this G-Sync monitor. If it doesn't work out how I intended, I'll be sticking with the current setup.


----------



## DividebyZERO

Quote:


> Originally Posted by *ljreyl*
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_980_sli_review,18.html
> 
> This should give you a general idea. Of course, they don't have trifire benches, but add about 40% to what the 295X2 puts out.
> E.g. if the 295X2 does 100 FPS, trifire should be around the 140 FPS mark.


Metro LL is an exception: my own testing found it doesn't scale past two cards for AMD. It's an Nvidia title, but it looks like there's no scaling beyond two GPUs there either. Really a shame, since it could use it.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Trifire will be faster, but for what I want to achieve at 1440p (I've stepped back from 4K), 2x 980 seems like a viable option too. I'm going for stealth and quiet now, and I'm getting an annoying hum from these fans I have in push/pull. If I run 2x 980 under water, I can remove a set of fans, have room to overclock the cards, and still run max frames on this G-Sync monitor. If it doesn't work out how I intended, I'll be sticking with the current setup.


Have you thought about undervolting the fans with a fan controller or mobo settings? Or changing the fans?
You don't really need all those fans for cooling. You have more than I do, and I'm 10C below the 75C threshold with overclocks on all cards.
Granted, I'm using different fans than you are, but yeah.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Have you thought about undervolting the fans with a fan controller/mobo settings? Or changing the fans?
> You don't need all those fans for cooling, really. You have more than I do and I'm below the 75c threshold by 10c with overclocks on all cards.
> Granted, i'm using different fans than you are but yea.


Actually, I was thinking about that earlier today. What I failed to do is put the top rad fans on a controller, as I did with the others. But I think the fan noise may be a bearing issue rather than the speed (not 100% sure). I'm also curious to see how 2x 980 will perform with a monitor they were intended for. Additionally, there are reports in other threads that on water 2x 980 can be pushed another 15%+ in performance, so given all those variables it's worth a punt in my opinion. If I discover that this is nowhere near what is claimed, they will be going back pronto. Ultimately, as long as I can max out games at 1440, have a dead quiet system, and use the monitor as intended, I'll be content.


----------



## xer0h0ur

Syceo, aren't you also using those 2000 RPM NF-F12 industrialPPC fans? Mine are now making an annoying sound at anything higher than 96% fan speed. I can't really describe it, per se, but it's somewhere between a clicking sound and a rubbing sound. I was going to remove the fans and lube them to see if it remedies the issue, but I have been too lazy to take the system apart to get the fans off the radiators. I have to install a new CPU waterblock anyway, so at least I can kill two birds with one stone.

As for whoever was having problems with Thief: I have horrendous Crossfire scaling with my tri-fire setup in that game while using the 14.9 Cat. If left alone while running maxed-out visuals, I get roughly 60-ish% usage across all three GPUs, and my framerates are nothing to show off or be particularly proud of, basically averaging in the 60s. However, if you manually create a Crossfire profile within the CCC and scroll down to the Crossfire options to select "AFR Friendly", then GPU 1 gets loaded near 100% non-stop, GPU 2/GPU 3 sit there twiddling their thumbs, and your framerates go up.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Syceo aren't you also using those 2000 RPM NF-F12 Industrial PPC fans? Mine are now making an annoying sound at anything higher than 96% fan speed. I can't really describe it per say but its like between a clicking sound and a rubbing sound. I was going to remove the fans and lube them to see if it will remedy the issue but I have been too lazy to take the system apart to take the fans off the radiators. I have to install a new CPU waterblock anyways so at least I can kill two birds with one stone.
> 
> As for whomever was having problems with Thief. I have horrendous crossfire scaling with my tri-fire setup in that game while using the 14.9 Cat. If left alone while running maxed out visuals I am getting roughly 60ish% usage across all three GPUs and my framerates are nothing to show off or be particularly proud of. Basically averaging in the 60's. However if you manually create a crossfire profile within the CCC and scroll down to the Crossfire options to select "AFR Friendly" then GPU1 will get loaded near 100% non-stop and GPU2/GPU3 will sit there twiddling their thumbs and your framerates will go up.


I switched out those industrial fans when I added the loop; I'm using high-pressure Corsair Airs now. But I can confirm I did experience that hum/tick sound with those fans, which was also a contributing factor in me switching them out for the Air series. I didn't notice it that much at first, but silly me... once I had locked onto that annoying sound, that was it for me; I pretty much couldn't ignore it after that. It's even worse when you're not gaming and just trying to get some normal PC usage done. Although I'm experiencing a similar sound with this set, it's in no way as pronounced as on the NF-F12s. I figured it can't be the pump, as that wasn't installed until after...


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> I switched out those industrial fans when i added the loop. Im using High pressure corsair airs. But i can confirm that i did experience that hummm, tick sound with those fans which was also a contributing factor in me switching them out for the air series. I didn't notice it that much at first , but silly me... once i had locked onto that annoying sound that was it for me... pretty much couldn't ignore it after that. Its even worse when your not gaming, and just trying to get some normal pc usage. Although im experiencing a similar sound on this set its no way as pronounced as the NF-F12s . I figured it cant be the pump as that wasnt installed until after...


Wow, those fans are loud! Why didn't you just go with the regular NF-F12s?
The static pressure on those is still 1.83 mm H2O when undervolted.
As a benchmark, I'm running four of those fans undervolted on the radiators, plus other Noctua fans (NF-S12/NF-A14), to keep my entire system UNDER 25 dBA.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Wow those fans are loud! Why didn't you just go with the regular NF-F12s?
> The static pressure on those are still 1.83 mm h20 when undervolted.
> As a benchmark, I'm running those fans undervolted (4 of them) on the radiators plus other Noctua fans (NF S12/NF A14) to keep my entire system UNDER 25 dBa


Those fans were only used when I was on air.

I have NF-F12s up front now (they're hidden); I'm using the Corsair ones because of the colour scheme (coloured rings).


----------



## xer0h0ur

My fans lasted a couple of months running at full speed before they began making that sound. I assume it's a problem with the bearings.


----------



## stxe34

EK blocks are now on ebay...thanks for all your help!


----------



## xer0h0ur

I was under the impression the problems with multiple monitor setups aren't ironed out for AMD or Nvidia for that matter. Hopefully your 980's work out better for you.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> I was under the impression the problems with multiple monitor setups aren't ironed out for AMD or Nvidia for that matter. Hopefully your 980's work out better for you.


They really aren't. When I'd had it with my 7970 quad-Crossfire setup's problems, I switched to tri-SLI 780s. In portrait mode, Nvidia's drivers gave me just as many issues. You'd like to adjust your bezel correction? Well, that'll make your games run at 5 FPS. You'd like to disable Surround and run the monitors separately? Well, that'll just produce a BSOD.

Here's the straw that broke the camel's back, though. Remember all that hoopla about LightBoost? I went out and purchased three Asus LightBoost monitors to go along with the three 780s. Next driver revision, Nvidia breaks LightBoost in Surround. It stayed broken for 8 months; there was an epic 100+ page thread on the Nvidia forums about it.

So yeah, the grass isn't always greener... so to speak.


----------



## shadow85

Guys, which card is better out of the

MSI GAMING GTX 980 and
Gigabyte G1 Gaming GTX 980?

They are both about the same price in my country, and I don't really plan on extra overclocking.


----------



## ljreyl

Quote:


> Originally Posted by *shadow85*
> 
> Guys, which card is better out of the
> 
> MSI GAMING GTX 980 and
> Gigabyte G1 Gaming GTX 980?
> 
> They are both about the same price in my country, and I don't really plan on extra overclocking.


You should really ask this in the 980 owners club forum.


----------



## stxe34

Quote:


> Originally Posted by *ljreyl*
> 
> You should really ask this in the 980 owners club forum.


I knew that was coming, lol.


----------



## xer0h0ur

So you knew guys who own AMD cards would refer people to Nvidia owners. How clairvoyant.


----------



## shadow85

Quote:


> Originally Posted by *ljreyl*
> 
> You should really ask this in the 980 owners club forum.


Zomg, I thought I had the GTX 980 thread open on my phone, lol. Didn't realise I was in the 295X2 thread. Haha, sorry guys, no harm intended.


----------



## dcombs108

OK folks... bought a Sapphire R9 295X2 17 days ago from Microcenter. I have my receipt; the box is kinda messed up. But anyway... heard a loud pop, the video shut off, and I can't get a signal. Anyone have a similar experience?


----------



## xer0h0ur

Oh man that sucks. I think you're one of like two maybe three people in this thread that has had that happen. Take it back for an exchange/refund.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh man that sucks. I think you're one of like two maybe three people in this thread that has had that happen. Take it back for an exchange/refund.


Maybe test another video card first; it could have been his PSU.


----------



## xer0h0ur

Well when he said no more video output I assumed he was talking about starting up the pc and not getting a video signal.


----------



## electro2u

Seems like Sapphire has had the most failures... Sorry it's happened


----------



## Mega Man

Just got my two waterblocks in for the 295X2!

Had to go EK. I wish Swiftech would be a bit quicker on the draw putting out blocks :/


----------



## xer0h0ur

You going fujipolys on those suckers?


----------



## Mega Man

I don't think it is worth it without voltage control.


----------



## xer0h0ur

People keep saying there's no voltage control, yet Afterburner and CCC both provide voltage sliders. Or what do you mean by voltage control?


----------



## yifeng3007

Quote:


> Originally Posted by *Mega Man*
> 
> I don't think it is worth it without voltage control.


I know that I have voltage control, but I had to disable ULPS; otherwise the voltage control in Afterburner wasn't adjustable.

I am still hoping, that *axiumone* will show us how to control the vrm fan


----------



## axiumone

It's coming, I promise!

Here's the reason I haven't posted it. I'm waiting for a set of very specific cables from moddiy.com. They're coming from China, so it's taking a bit of time.

In my rig at the moment it's just a bunch of naked wire leads connecting to the VRM fan on the 295x2. It's really not classy to show that off. lol


----------



## xer0h0ur

It's not rocket science. Disconnect the fan's 2-pin connector from the video card and power it yourself, wired to a rheostat or some other control method.


----------



## xer0h0ur

Just get yourself this cable: http://www.moddiy.com/products/4%252dPin-Molex-Connector-(Male)-to-2%252dPin-GPU-Mini-Fan-Connector-(Male-2.0mm).html then use a rheostat between that and the PSU.

Edit: Wrong cable, as axiumone said its this one: http://www.moddiy.com/products/4%252dPin-PWM-Fan-Connector-%28Female%29-to-4%252dPin-Mini-GPU-Fan-Connector-%28Male%29.html


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just get yourself this cable: http://www.moddiy.com/products/4%252dPin-Molex-Connector-(Male)-to-2%252dPin-GPU-Mini-Fan-Connector-(Male-2.0mm).html then use a rheostat between that and the PSU.


That's the wrong cable. You need to get this one.

http://www.moddiy.com/products/4%252dPin-PWM-Fan-Connector-%28Female%29-to-4%252dPin-Mini-GPU-Fan-Connector-%28Male%29.html

You also don't need an external fan controller. It's actually a 4-pin PWM fan, and you can plug it into the motherboard if your motherboard supports PWM control.

The 2-pin output on the 295X2 is for the Radeon LED, as it turns out.


----------



## xer0h0ur

Isn't the VRM fan a mini 2-pin connector?


----------



## xer0h0ur

Oh, no it's not. I don't know what I mixed it up with then. The cable for the radiator fan, perhaps?

Well, that makes it a hell of a lot easier then. You can even control both VRM fans (dual 295X2s) off a single PWM fan header on your motherboard using a Y cable.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> People keep saying no voltage control yet afterburner and CCC give voltage controls or what do you mean by voltage control?


Quote:


> Originally Posted by *yifeng3007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I don't think it is worth it without voltage control.
> 
> 
> 
> I know, that i have voltage control, but i had to disable ULPS otherwise my voltage control in Afterburner wasn't adjustable.
> 
> I am still hoping, that *axiumone* will show us how to control the vrm fan
Click to expand...

I have done this, and no, I don't have voltage control.


----------



## axiumone

Quote:


> Originally Posted by *Mega Man*
> 
> I have done this, and no, I don't have voltage control.


I'm not sure what you mean. The top control in your screen shot is core voltage.


----------



## xer0h0ur

His voltage sliders appear unlocked to me in that picture. What are you talking about then?


----------



## Mega Man

Quote:


> Originally Posted by *axiumone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I have done this, and no, I don't have voltage control.
> 
> 
> 
> 
> I'm not sure what you mean. The top control in your screen shot is core voltage.
Click to expand...
































I swear it was not there before upgrading to 14.9... I haven't really cared, since I wanted to wait until the cards were under blocks, but I swear I am not this crazy.


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> That's the wrong cable. You need to get this one.
> 
> http://www.moddiy.com/products/4%252dPin-PWM-Fan-Connector-%28Female%29-to-4%252dPin-Mini-GPU-Fan-Connector-%28Male%29.html
> 
> You also don't need an external fan controller. It's actually a 4-pin PWM fan, and you can plug it into the motherboard if your motherboard supports PWM control.
> 
> The 2 pin out on the 295x2 is for the radeon LED light as it turns out.


Quote:


> Originally Posted by *Mega Man*
> 
> i have done this and no i dont have voltage


So, guys, in order to control the VRM fan we would have to disassemble the card, right? Wouldn't we void the warranty in that case?


----------



## Mega Man

Technically... yes, but it really depends on the brand and their stance on this stuff. One reason I like XFX: they state it's OK to take the cooler off and install a new one (in the US). However, they have been being stupid lately (locking voltage control, removing dual-BIOS switches).


----------



## xer0h0ur

All you have to do is open the front cover that sits on top of the Asetek pumps to reach the cable and unplug it from the PCB. Nothing will show signs you tampered with the card and voided your warranty, unless you're horrible at removing painted screws and scratch them up to all hell, or strip a screw head.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> All you have to do is open the front cover that is on top of the Asetek pumps to reach the cable and unplug it from the PCB. There is nothing that will show any signs you tampered with the card and voided your warranty unless you're horrible at removing painted screws and scratch them up to all hell or strip a head of a screw.


Screws aren't painted, just plain silver. Also, there aren't any stickers on the shroud you need to remove to get to the fan wires. There wouldn't be any evidence of you tampering with the card.


----------



## xer0h0ur

Even better. You basically have to go out of your way then to leave evidence of voiding your warranty.


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> Screws aren't painted, just plain silver. Also, there aren't any stickers on the shroud you need to remove to get to the fan wires. There wouldn't be any evidence of you tampering with the card.


Well, that makes this a lot easier; guess I'll see what I can do somewhere in the near future. Do you guys think I can use this type of cable - http://www.noctua.at/main.php?show=productview&products_id=74&lng=en - to connect the two fans from both of my cards to one connector?


----------



## yifeng3007

Has anyone checked out the new 14.9.1 beta drivers yet? http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-9-1BetaWINReleaseNotes.aspx As far as I understood, it generally just fixes some random BSODs.


----------



## xer0h0ur

It took them 6 months to release a WHQL approved driver. I am not about to ditch the 14.9 that quickly.


----------



## electro2u

Interestingly, my FFXIV Crossfire is broken again. I tried reinstalling Win 8.1, but it's back to the same behavior as before: it works one time only.

The new beta driver works well for me; I'm back on Win 7.

Puzzled by the FFXIV thing. Very puzzled.


----------



## xer0h0ur

AMD


----------



## DividebyZERO

Quote:


> Originally Posted by *electro2u*
> 
> Interestingly, my FFxiv crossfire is broken again. Tried reinstalling win 8.1 but it's back to same behavior as before, works one time only.
> 
> New beta driver works well for me, back to win 7.
> 
> Puzzled by the FFxiv thing. Very puzzled.


Can you confirm you're actually in fullscreen in Win 8.1? I have had issues with fullscreen working in Windows 8.1 with Crossfire. What monitor are you using?


----------



## electro2u

Quote:


> Originally Posted by *DividebyZERO*
> 
> Can you confirm your actually fullscreen in win8.1? I have had issues with fullscreen working in windows 8.1 with crossfire. What monitor are you using?


Yeah, FFXIV actually works in Crossfire once, and the funny thing is that after that one time I can change the name of the game folder and Crossfire works again, but just once. In Windows 7 it works permanently (as long as I don't Alt-Tab or lose fullscreen). I'm using a Yamakasi Catleap.

I've been doing quite a bit of testing with it this morning... The scaling in the actual game is so bad that it's almost not worth mentioning that there are problems with Crossfire. All three GPUs work on the game in W7, but they constantly ramp GPU usage up and down, so I don't end up getting much extra performance out of the extra power. A little bit, but this game just runs awful on the 295X2.

When Crossfire works in Windows 8.1 with FFXIV, it works nicely.


----------



## DividebyZERO

Quote:


> Originally Posted by *electro2u*
> 
> Yeah, FFXIV is actually working in crossfire once, and then the funny thing is after that one time I can change the name of the game folder and crossfire works again, but just once. Then in Windows 7 it works permanently (as long as I don't Alt-Tab or lose full screen). I'm using a Yamakasi Catleap.
> 
> Been doing quite a bit of testing with it this morning... The scaling in the actual game is so bad that it's almost not worth mentioning that there are problems with Crossfire. The 3 GPUs all work on the game in W7 but they are constantly ramping up and down GPU usage so that I don't end up getting much extra performance out of the extra power. A little bit, but this game just runs awful on the 295x2.
> 
> When Crossfire works in windows 8.1 with FFXIV, it works nicely.


Are you using the AMD Pixel Clock Patch by ToastyX? What resolution and connection for the monitor?


----------



## electro2u

Quote:


> Originally Posted by *DividebyZERO*
> 
> Are you using AMD pixel Clock patch by toastyx? What resolution and connections for the monitor?


Yeah, the patch and CRU at 120 Hz, but I test this game without it. It's just DVI at 1440p.
I'll post a screenshot of the GPU activity; it looks like a lie detector test. All three GPUs go haywire and never settle at a constant usage.


----------



## DividebyZERO

Quote:


> Originally Posted by *electro2u*
> 
> Yeah, the patch and CRU to 120hz but I test this game without it. It's just dvi at 1440p.
> I'll post a screenshot of the GPU activity it looks like a lie detector test. All 3 GPUs are going haywire and never settle at a constant usage.


I know everyone's results are different, but I could not get the AMD Pixel Clock Patch and CRU to work properly with 14.x drivers and Crossfire. Everything I launched was forced to fullscreen windowed, or to 3D with only one card working. If I removed the patch and ran the default native resolution at 60 Hz, fullscreen worked perfectly. The thing was, if I tampered enough with the game (any game), sometimes I could get Crossfire working.

I have access to FFXIV through family members' accounts, but I don't have 295X2s yet, only a 290X. If you want me to try something that would help you, I can.


----------



## electro2u

Well, that's just annoying that it wasn't working in Crossfire at all for you. I'm curious: are you using a Qnix-type monitor or an Overlord/Catleap? Maybe post your timings; here are mine:

It downclocks properly and everything, but yeah, I'm looking forward to a native 120 Hz 1440p monitor if I can ever get my hands on one.

Here are the GPU usage graphs for my 295X2; the third GPU is the 290X.

The part of the graph where all three GPUs are being utilized well is the character selection screen (vsync off; usually I actually use it).
In the actual game world they aren't used well at all; it's not much better than just using one GPU.
I did just now test with only the 295X2's two GPUs running, and adding the third definitely helps, so I think this game is just not very well optimized. I get another 10-15 FPS at 1440p with trifire over Crossfire. I don't know if you'd call that CPU-limited?

Edit: if anyone is wondering... I got it working in Win 8.1 by upgrading to 8.1 from 7. If I just install Win 8 from scratch, I can't get Crossfire working with FFXIV at all since the other day. Mind blown.


----------



## electro2u

double post..


----------



## ljreyl

I think I've re-found how to show VRM temps for the 295X2 in trifire. I'll do more testing to see if my process is correct.
For now, I'd use GPU-Z.


----------



## cennis

Quote:


> Originally Posted by *ljreyl*
> 
> 
> I think I re-found how to show VRM temps for 295x2 in trifire. I'll do more testing to see if my process is correct.
> For now, I'd use GPUZ.


That's cool.
I wonder how you did it; my HWiNFO doesn't show it with the 295X2 alone.


----------



## joeh4384

Interesting, but the numbers are exactly the same as the readings below, just flip-flopped.


----------



## Syceo

Hey guys, anyone interested in these, feel free to drop me a PM.







Will be dropping them in the marketplace too.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Hey guys, anyone interested in these feel free to drop me a pm
> 
> 
> 
> 
> 
> 
> 
> will be dropping them in the marketplace too .


How're your new cards running? Since you're selling, I assume you're happy with them


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> How're your new cards running? Since you're selling, I assume you're happy with them


Hey mate, how are you doing?

Yeah, decided to stick with the 980s and the Swift setup. Works for me, as it's dead silent and everything just works. The G-Sync is on point and the room temp has dropped. Wife is pissed, though, lol, so I'm in the doghouse... fortunately for me, the doghouse happens to be the same room the PC is in... wonder how that happened, lol.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Hey mate , how are you doing?
> 
> yeah decided to stick with the 980's and the swift setup. Works for me as its dead silent and everything just works. The gsync is on point and the room temp has dropped . Wife is pissed though lol so im in the doghouse... fortunately for me the doghouse happens to be the same room as the PC is in ... wonder how that happened lol.


I'm good, dude. Experimenting with more tweaks to my system right now (debating removing my 120mm and putting a 240 in the front for 600mm of rad) and delidding the CPU. We'll see, though. I already have a quiet and decently cool system, so I'm not sure. I'm just obsessed with modding, lol. I most likely won't add stuff, though.

Ahh, the wife. I told myself to enjoy this while I can before I get married. Because once that ring is on, there goes needless spending.
Wife - I'm gonna get a haircut. It costs 150 dollars.
*looks the same afterward

Me - I'll get a haircut. It's $20.
Future/potential wife - that costs too much. I'll do it for you.

Ahh... good times. Glad you're happy with the 980s, though.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> I'm good dude. Experimenting with more tweaks to my system right now (debating removing my 120mm and putting in a 240 in the front for 600mm cooling) and deluding the cpu. We'll see though. I already have a quiet and decently cool system so I'm not sure. I'm just obsessed with modding lol. I most likely won't add stuff though.
> 
> Ahh, the wife. I told myself to enjoy this while I can before I get married. Because once that ring is on, there's goes needless spending.
> Wife - I'm gonna get a haircut. It costs 150 dollars
> *looks the same afterward
> 
> Me - I'll get a haircut. It's $20
> Future/Potential wife - that costs too much. I'll do it for you
> 
> Ahh... Good times. Glad you're happy with the 980s though.


Lmao... I hear you, mate, haha.

Yeah, a 240 + 360 and you should be good to go. You can't really go wrong with a delid unless you go at it like a maniac. I know what you mean when you say you're obsessed with modding; this has turned out to be a very expensive hobby for me, and probably for 90% of people here. Oh well, it is what it is.


----------



## joeh4384

Indeed, this hobby is expensive. Lol, why do I want to drop big bucks to make this 295X2 + 290X Crossfire setup work instead of selling the 290X for dirt cheap?


----------



## ljreyl

I wonder if anyone has this problem.
My 295X2 has been throttling at 65C: 1060 core, 1340 mem, +25% power limit.
I took off the block and put on new thermal pads, paste, etc., and the card still throttles.

The funny thing is, only one of its GPU cores throttles (from 1060 it goes down to 940). The second 295X2 core and the 290X never throttle. Anyone have any idea?
And the core only throttles after a 30-minute bench of Heaven at max settings/1440p.


----------



## axiumone

Quote:


> Originally Posted by *ljreyl*
> 
> I wonder if anyone has this problem.
> My 295X2 has been throttling at 65C: 1060 core, 1340 mem, +25% power limit.
> I took off the block and put on new thermal pads, paste, etc., and the card still throttles.
> 
> The funny thing is, only one of its GPU cores throttles (from 1060 it goes down to 940). The second 295X2 core and the 290X never throttle. Anyone have any idea?
> And the core only throttles after a 30-minute bench of Heaven at max settings/1440p.


I have similar issues here, although only with heavy overclocking. When I bench these cards at 1150/1500, +100 mV, +50% power, the cards will throttle no matter what. Temps are in the mid-60s, so it's not heat. I suspect it has something to do with power draw; probably some sort of overvoltage/overcurrent protection.


----------



## ljreyl

Strange. I'm on stock volts and everything. I'll try upping the power limit to 50. If that doesn't do it, I'll switch back to stock clocks, and if that doesn't solve it either, the next step is to switch from the Sapphire OC bios to the XFX bios.


----------



## ljreyl

So, stock clocks and the original bios later, it still throttles. Wonder if it's because of a synthetic bench, over-current protection, or a dud card.
Axiumone, would you mind going to stock and running heaven maxed out and letting me know if there's throttle for you after about 30 minutes?
My temps btw never exceed 65 for both cards. And I'm sure the VRMs are fine because of the Fujipoly Extreme pads, and the PLX has Gelid GC Extreme on it.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> So stock clocks and the original bios later, still throttles. Wonder if it's because of a synthetic bench, over current protection, or a dud card.
> Axiumone, would you mind going to stock and running heaven maxed out and letting me know if there's throttle for you after about 30 minutes?
> My temps btw never exceed 65 for both cards. And I'm sure the VRM is fine because of Fujipoly Extreme and the PLX has Gelid GC extreme.


Sometimes I feel like my cards start to throttle as well after a while, but it's only a feeling; I don't know how to check that...
Also, one thing I've noticed is that my Corsair Link won't show all 4 GPU temps, while GPU-Z shows all of them...

Does anyone know the diameter of the tubing on the cards? I would like to install spring wrap on them, but I don't know which to choose.


----------



## joeh4384

Quote:


> Originally Posted by *yifeng3007*
> 
> Sometimes i feel like my cards start to throttle as well after a while, but it is only a feeling i don't know how to check that...
> Also, one thing i've noticed, is that my Corsair Link won't show all 4 GPU temps, while GPU-Z shows all of them...
> 
> Does anyone know the diameter of the tubing on the cards? I would like to install spring wrap on them, but i don't know which to choose.


Do you see the core clocks drop in afterburner? I have only noticed troubles with the 295x2 on stock cooling when I run a 290x below it.
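Since a couple of posters here aren't sure how to catch a clock drop that only shows up 30 minutes into a run, one way is to enable GPU-Z's "Log to file" sensor logging and scan the log afterwards. A minimal sketch of the scan; the exact column name (`GPU Clock [MHz]`) and the inline sample log are assumptions about the log layout, not values from this thread — point it at your own log file instead of the sample:

```python
# Minimal sketch: scan a GPU-Z sensor log for core-clock throttle events.
# Assumption: the log is comma-separated with a "GPU Clock [MHz]" column,
# roughly like GPU-Z's "Log to file" output. SAMPLE_LOG below is made up.
import csv
import io

SAMPLE_LOG = """\
Date , GPU Clock [MHz] , GPU Temperature [C]
2014-10-20 21:00:01 , 1030.0 , 64.0
2014-10-20 21:00:02 , 1029.5 , 65.0
2014-10-20 21:00:03 , 943.0 , 65.0
"""

def throttle_events(lines, stock=1030.0, tolerance=15.0):
    """Return (sample_number, clock) for every sample below stock - tolerance."""
    reader = csv.DictReader(lines)
    # GPU-Z pads header names with spaces, so strip them before lookup
    reader.fieldnames = [name.strip() for name in reader.fieldnames]
    events = []
    for n, row in enumerate(reader, start=1):
        try:
            clock = float(row["GPU Clock [MHz]"])
        except (KeyError, TypeError, ValueError):
            continue  # skip malformed lines
        if clock < stock - tolerance:
            events.append((n, clock))
    return events

print(throttle_events(io.StringIO(SAMPLE_LOG)))  # [(3, 943.0)]
```

If the list stays empty over a long gaming session, the card isn't throttling; if entries cluster after ~30 minutes, you've got the same behavior ljreyl is seeing.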


----------



## ljreyl

Just had a thought...
http://lab501.net/amd-radeon-r9-295x2-nuclear-launch-detected/6/

In this picture is the 81022 PWM controller (6th picture from the top). The EK water block instructions say to cover it with a thermal pad...
Could that be causing the throttle? I don't think I added a thermal pad to it -_-


----------



## xer0h0ur

What throttle? Are you literally seeing your GPU(s) underclock themselves? If you're talking about GPU usage that is a completely different ballgame and has nothing to do with throttling.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> What throttle? Are you literally seeing your GPU(s) underclock themselves? If you're talking about GPU usage that is a completely different ballgame and has nothing to do with throttling.


Core clocks throttling. Just the 1st gpu on the 295x2 from 1030 stock to 943 after about 30 minutes of heaven or crysis 3.
I just remounted the block and added missing thermal pads. Testing right now


----------



## ljreyl

Quote:


> Originally Posted by *ljreyl*
> 
> Core clocks throttling. Just the 1st gpu on the 295x2 from 1030 stock to 943 after about 30 minutes of heaven or crysis 3.
> I just remounted the block and added missing thermal pads. Testing right now


So it throttled again to 943. Looks like it's a thermal throttle issue. Weird though because I'm at 65c for the GPU cores...


----------



## axiumone

Quote:


> Originally Posted by *ljreyl*
> 
> So it throttled again to 943. I removed the front panel of my corsair 750D and after about 2 minutes, went back up. Looks like it's a thermal throttle issue. Weird though because I'm at 65c for the GPU cores...


Yeah, that's kind of what I'm figuring too. There are a bunch of other chips on this board which we have no thermal sensors for, and I'm guessing they all get very hot and can cause thermal throttling.

It actually makes me think that's the reason AMD set the thermal limit for these cards at 75c. It's not because the GPU chips can't take it; I mean, look at the 290x for reference. AMD was touting how those GPUs are perfectly fine operating in the 90c range. It's probably some other chip on the board with a much lower thermal limit.

My guess would be the PLX chip.


----------



## ljreyl

Quote:


> Originally Posted by *axiumone*
> 
> Yeah, that's kind of what I'm figuring too. There are a bunch of other chips on this board which we have no thermal sensors for and I'm guessing they all get very hot an can cause thermal throttling.
> 
> It actually makes me think that it's the reason AMD set the thermal limit for these cards at 75c. It's not because the GPU chips cant take it, I mean look at the 290x for reference. AMD was touting how these GPU's are perfectly fine operating in the 90c range. It's probably some other chip on the board having a much lower thermal limit.
> 
> My guess would be the PLX chip.


I guess. The PLX chip is covered by the water block, same with the GPUs, VRMs, everything that's supposed to be covered.
I'm not adding voltage, everything is at stock clocks, and cooling is 10c below the threshold. And yet, it still throttles from 1030 stock to 943. And it's only GPU 1 on the 295x2.
I can't figure it out. It's bugging me. I've already remounted 3 times.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> I guess. The PLX chip is utilizing the water block, same with the GPU, VRMs, everything that's supposed to be covered.
> I'm not adding voltage, everything is under stock clocks, and cooling is 10c below the threshold. And yet, it still throttles from 1030 stock, to 943. And it's only GPU 1 on the 295x2.
> I can't figure it out. It's bugging me. I already remounted 3 times.


I know it's a stupid question, but have you checked your max GPU temp settings in the Overdrive menu in CCC? Maybe, just maybe, it somehow lowered itself?
How was your 295x2 performing before you installed a custom waterblock on it?

And can anyone help me out, please? Does anyone know the diameter of the tubing on the cards? I would like to install spring wrap on them, but i don't know which to choose.


----------



## hiarc

Add me whenever you get the chance Nav.

I'm guessing validation is enough? Managed to get 1110/1600 on stock volts, will test to see if I can comfortably get 1200 core at a later time when I am not so busy with exams.









Validation, it is a regular Sapphire model.


----------



## yifeng3007

Quote:


> Originally Posted by *hiarc*
> 
> Add me whenever you get the chance Nav.
> 
> I'm guessing validation is enough? Managed to get 1110/1600 on stock volts, will test to see if I can comfortably get 1200 core at a later time when I am not so busy with exams.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Validation, it is a regular Sapphire model.


I think there were like 3 guys who wanted to join the club, including me, but Nav hasn't added us yet :C


----------



## axiumone

Might want to send him a PM. I don't think he checks this thread anymore.


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> Might want to send him a PM. I don't think he checks this thread anymore.


Good idea, thanks!


----------



## ljreyl

Quote:


> Originally Posted by *yifeng3007*
> 
> I know it's a stupid question, but have you checked your max GPU temp settings in the Overdrive menu in CCC? Maybe, just maybe, it somehow lowered itself?
> How was your 295x2 performing before you installed a custom waterblock on it?
> 
> And can anyone help me out, please? Does anyone know the diameter of the tubing on the cards? I would like to install spring wrap on them, but i don't know which to choose.


Setting is maxed out. I dunno... I don't think I'll ever figure it out lol...


----------



## ljreyl

Quote:


> Originally Posted by *ljreyl*
> 
> Setting is maxed out. I dunno... I don't think I'll ever figure it out lol...


I ran just the 295X2 and I get absolutely no throttle at all. When I add the 290x, the 295x2 GPU 1 throttles from 1030 stock, to 943. Even if I max out the power limit, still throttles after about 30 minutes of Heaven or Crysis 3 maxed out.

I don't think it's a power issue because I'm under 1050 watts and I have enough amps to push all 3 GPUs.
I have enough cooling since all 3 GPUs on stock everything hit 65c in my custom loop.
All components of the GPU that need a thermal pad or TIM have it (VRMs, PLX, GPU, Etc)

Anyone got any ideas?


----------



## DividebyZERO

Quote:


> Originally Posted by *ljreyl*
> 
> I ran just the 295X2 and I get absolutely no throttle at all. When I add the 290x, the 295x2 GPU 1 throttles from 1030 stock, to 943. Even if I max out the power limit, still throttles after about 30 minutes of Heaven or Crysis 3 maxed out.
> 
> I don't think it's a power issue because I'm under 1050 watts and I have enough amps to push all 3 GPUs.
> I have enough cooling since all 3 GPUs on stock everything hit 65c in my custom loop.
> All components of the GPU that need a thermal pad or TIM have it (VRMs, PLX, GPU, Etc)
> 
> Anyone got any ideas?


Because it takes 30 minutes, it still sounds like a temp issue. Can you monitor your VRM temps? Also, what driver version, and is this stock or overclocked at all?

Edit: missed the stock part. Can you try a manual overclock of like 5MHz on all GPUs just for kicks? 30 minutes is a long time of testing to try small things, I guess.


----------



## yifeng3007

Quote:


> Originally Posted by *ljreyl*
> 
> I ran just the 295X2 and I get absolutely no throttle at all. When I add the 290x, the 295x2 GPU 1 throttles from 1030 stock, to 943. Even if I max out the power limit, still throttles after about 30 minutes of Heaven or Crysis 3 maxed out.
> 
> I don't think it's a power issue because I'm under 1050 watts and I have enough amps to push all 3 GPUs.
> I have enough cooling since all 3 GPUs on stock everything hit 65c in my custom loop.
> All components of the GPU that need a thermal pad or TIM have it (VRMs, PLX, GPU, Etc)
> 
> Anyone got any ideas?


Well, maybe there is not enough space between the two video cards for them to be cooled properly? Maybe try running the R9 290x on top of the R9 295x2?
I was checking my temps constantly when I ran the Unigine Valley benchmark at the Extreme HD preset in 1440p yesterday, and I never saw my temps go above 60C on any GPU. I had no throttling at all, just this microstuttering stuff sometimes, even though it wasn't as bad as in BF4.


----------



## joeh4384

Quote:


> Originally Posted by *ljreyl*
> 
> I ran just the 295X2 and I get absolutely no throttle at all. When I add the 290x, the 295x2 GPU 1 throttles from 1030 stock, to 943. Even if I max out the power limit, still throttles after about 30 minutes of Heaven or Crysis 3 maxed out.
> 
> I don't think it's a power issue because I'm under 1050 watts and I have enough amps to push all 3 GPUs.
> I have enough cooling since all 3 GPUs on stock everything hit 65c in my custom loop.
> All components of the GPU that need a thermal pad or TIM have it (VRMs, PLX, GPU, Etc)
> 
> Anyone got any ideas?


Could it be a power issue with the PCIe slots on your board? Do you have the latest bios? This makes no sense on water cooling.


----------



## BrotherBeast

Some improvement in quadfire with the 14.9.1 beta driver. Not nearly as much micro stutter and lag in BF4 so far in my early testing.


----------



## ljreyl

Quote:


> Originally Posted by *DividebyZERO*, *yifeng3007* and *joeh4384*

I think it's radiant heat coming from the 290x on the bottom. Since I have 480mm worth of radiator space, the rads are basically maxed out with 4 components to cool, which means less heat gets pulled away from the cards and the leftover heat radiates through the case.
When running just the 295x2 and the 4790k, temps for the GPUs stay at 53c or less. Adding the 290x, temps hit 65c or less.

I don't want to put the 290x above the 295x2 because that's even more radiant heat moving up into one card, as opposed to one card heating up 2 GPUs. Makes sense?

I'm gonna open the side panel, stick a 1x1 foot fan there blowing into the case, and run my tests again. If it doesn't throttle, then it's a cooling issue and I'll buy another radiator to pull more heat away from the cards (maybe even get a new case with more airflow too; currently running a 750D, which has super crappy airflow).

If the card still throttles, then I think it's safe to say it's just the nature of the beast. Then maybe it would be a good idea to sell my 290x and overclock the crap out of the 4790k and 295x2 lol.


----------



## joeh4384

I think the 290x board might be better equipped to handle the heat compared to the 295x2 board if the problem is due to the heat from the bottom card.


----------



## ljreyl

Quote:


> Originally Posted by *joeh4384*
> 
> I think the 290x board might be better equipped to handle the heat compared to the 295x2 board if the problem is due to the heat from the bottom card.


True. I just don't like the aesthetics








But I'll try it anyway tonight.


----------



## yifeng3007

So guys, please, does anyone know the diameter of the tubing on the cards? I would like to install spring wrap on them, but i don't know which to choose. There has to be a specifications or a way to measure it...


----------



## Elmy

Quote:


> Originally Posted by *yifeng3007*
> 
> So guys, please, does anyone know the diameter of the tubing on the cards? I would like to install spring wrap on them, but i don't know which to choose. There has to be a specifications or a way to measure it...


Ever heard of this cool invention called the tape measure?


----------



## yifeng3007

Quote:


> Originally Posted by *Elmy*
> 
> Ever heard of this cool invention called the tape measure?


The problem is that in Russia we use the metric system, yet these:
http://overhard.ru/catalog/479/Koolance-Tubing-Spring-Wrap--Steel-Red-for-OD-16mm-%285-8in%29/
http://overhard.ru/catalog/479/Koolance-Tubing-Spring-Wrap--Red-%5BFor-OD%3A-10mm-%283-8-%29%5D/
http://overhard.ru/catalog/479/Koolance-Tubing-Spring-Wrap--Steel-Red-for-OD-13mm-%281-2in%29/
are measured in the imperial system, which I don't understand at all...


----------



## joeh4384

You can also convert and see what matches.

http://www.sciencemadesimple.net/length.php
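Or just multiply by 25.4 — the Koolance listings above already pair each imperial size with its rounded metric OD. A quick sketch of the conversion (the inch sizes come from those links; nothing here tells you the 295x2's actual tubing OD, so measure it and pick the closest listed size):

```python
# Convert the Koolance spring-wrap sizes to mm (1 inch = 25.4 mm exactly).
sizes_in = {"3/8in": 3 / 8, "1/2in": 1 / 2, "5/8in": 5 / 8}
for label, inches in sizes_in.items():
    print(f"{label} = {inches * 25.4:.1f} mm")
# 3/8in = 9.5 mm  (listed as 10 mm)
# 1/2in = 12.7 mm (listed as 13 mm)
# 5/8in = 15.9 mm (listed as 16 mm)
```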


----------



## DividebyZERO

Quote:


> Originally Posted by *Elmy*
> 
> Ever heard of this cool invention called the tape measure?


Somehow I read this with your avatar looking at me and imagined it was Number 5 saying it sarcastically


----------



## SantaClaw

Basic question: my rig is properly water cooled, with a HUGE 24-fan radiator and enough thermal dissipation for a small nuclear reactor, so adding a 120mm radiator seems... well, stupid. Anyway, can I just cut the stock water cooling lines on the 295x2, disconnect the wire, and connect it to my current water cooling loop?


----------



## Elmy

Quote:


> Originally Posted by *DividebyZERO*
> 
> Somehow i read this with your avatar looking at me and imagined it was number 5 saying it sarcastically


LMAO

Can't wait till they make a remake of Short Circuit


----------



## Syceo

Quote:


> Originally Posted by *SantaClaw*
> 
> Basic question, My rig is water cooled, properly with a HUGE 24 fan radiator and enough thermal dissipation for a small nuclear reactor, so adding a 120mm radiator seems.. well stupid, anyway, can I just cut the stock water cooling lines on the 295x2, disconnect the wire, and connect it to my current watercooling loop ?


I doubt very much that would work, since the 295x2 has its own integrated pumping system. I'm guessing if you wanted to attempt a ghetto mod, you could try to figure out how to disable the internal pump and then see if you can jerry-rig it into your loop, lol. But have you thought about resale value?


----------



## SantaClaw

Quote:


> Originally Posted by *Syceo*
> 
> I doubt very much that would work since the 295x2 has its own integrated pumping system. Im guessing if you wanted to attempt a ghetto mod, you could try figure out how to disable the internal pump and then see if you can jerry rig it into your loop, lol. But have you thought about resale value?


Well thnx, I figured it out myself after doing some thinking... I've got one of these on order now:

https://www.visiontek.com/graphics-cards/visiontek-cryovenom-r9-295x2-detail.html


----------



## Mega Man

Quote:


> Originally Posted by *SantaClaw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syceo*
> 
> I doubt very much that would work since the 295x2 has its own integrated pumping system. Im guessing if you wanted to attempt a ghetto mod, you could try figure out how to disable the internal pump and then see if you can jerry rig it into your loop, lol. But have you thought about resale value?
> 
> 
> 
> Well thnx, I figured it out myself after doing some thinking... I've got one of these on order now:
> 
> https://www.visiontek.com/graphics-cards/visiontek-cryovenom-r9-295x2-detail.html
Click to expand...

You can do that, or just buy a block; usually that's cheaper.
Quote:


> Originally Posted by *Elmy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> Somehow i read this with your avatar looking at me and imagined it was number 5 saying it sarcastically
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> LMAO
> 
> Can't wait till they make a remake of Short Circuit
Click to expand...

I don't think they will. Most people would never understand the basics of that movie (motives etc.) in this day and age.


----------



## ljreyl

Quote:


> Originally Posted by *ljreyl*
> 
> True. I just don't like the asthetic look
> 
> 
> 
> 
> 
> 
> 
> 
> But I'll try it anyway tonight.


So, I moved my 290x to the top slot and my 295x2 to the bottom slot. I also redid my entire loop by removing some 90- and 45-degree adapters (went from four 90s and five 45s to two of each, which is better for flow).

Temps went down a massive total of 5c!

BUT...
GPU 1 on the 295x2 still throttles after some time...

I'm starting to think it's what others have said: some electrical/over-current protection or something.

I wanna try a higher wattage power supply but that's way too much money AND I believe the ax1200i that I'm using really is enough.


----------



## ragulih

Have you guys got any screenshots from GPU-Z of your cards under load?
My max stats are:
VDDC: 1.238V
VDDC Current: 105.1 A
VDDC Power: 124.4W


----------



## joeh4384

Quote:


> Originally Posted by *ragulih*
> 
> Have u guys got any pics from GPU-Z from your cards under load?
> My max stats are:
> VDDC: 1.238V
> VDDC Current: 105.1 A
> VDDC Power: 124.4W


My clocks stay steady with setting the power limit on afterburner to +20.


----------



## xer0h0ur

My cards are overclocked and one is still on air and I don't get throttling. I don't have to increase the power limit or increase the voltage to stay stable or avoid throttling. My singular gripe is literally all about GPU usage right now as crossfire scaling seems to have gone to hell in a hand basket. Scaling works fine in benchmarks but games, NOPE NOPE NOPE.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> My cards are overclocked and one is still on air and I don't get throttling. I don't have to increase the power limit or increase the voltage to stay stable or avoid throttling. My singular gripe is literally all about GPU usage right now as crossfire scaling seems to have gone to hell in a hand basket. Scaling works fine in benchmarks but games, NOPE NOPE NOPE.


Man... when I run just the 295, I get no core downclocking. When I run it in trifire, even with the 290x as the top card, GPU 1 on the 295 still throttles to the same number.
I've tried raising the power limit, redid my loop to improve flow (removed 90- and 45-degree bends), etc.
I have no idea what's causing it. I'm gonna do thermal imaging tonight to see if it might be heat that the block isn't able to pull away.


----------



## xer0h0ur

You mentioned that you have re-mounted the block several times but were you actually checking to make sure all of your pads were in fact making contact? I had two or three pads that had to get doubled up for them to actually make contact.


----------



## Noufel

Hi
The 295x2 here in Algeria costs only 80,000 DA = $700. Is it a good deal??


----------



## electro2u

Quote:


> Originally Posted by *Noufel*
> 
> Hi
> the 295x2 here in algeria cost only 80000da = 700 $ is it a good deal ??


That's firesale territory, imo... which is sort of where it should be. That's less than half what I paid for mine in the US. Very good price.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> You mentioned that you have re-mounted the block several times but were you actually checking to make sure all of your pads were in fact making contact? I had two or three pads that had to get doubled up for them to actually make contact.


Yup, I replaced the pads each time too. I really can't figure it out. With just the 295 running, it never downclocks. Add the 290x and it downclocks after 20-30 minutes of max load benching/gaming.


----------



## xer0h0ur

That is the absolute strangest behavior I have ever heard of. I really come up empty with any more suggestions.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is the absolute strangest behavior I have ever heard of.


I think Kim Jong Un's behavior is pretty strange.

Let's see how the thermal imaging goes


----------



## axiumone

Quote:


> Originally Posted by *ljreyl*
> 
> Yup, I replaced the pads each time too. I really can't figure it out. With just the 295 running, it never downclocks. Add the 290x and it downclocks after 20-30 minutes of max load benching/gaming.


Do you have anything that you can measure wattage with?


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> I think Kim Jong Un's behavior is pretty strange.
> 
> Let's see how the thermal imaging goes


Quote:


> Originally Posted by *axiumone*
> 
> Do you have anything that you can measure wattage with?


I didn't get to do the thermal imaging yesterday. Too much stuff at work -_- And I used Corsair Link, which said I was at 1050W max (with overvolting and overclocking; with stock/below-stock volts and overclocked, I never went above 1000 watts).
I'm gonna give it one last shot tomorrow before I give up.
I bought a 240mm radiator to replace my 120mm, which comes in tomorrow. While I drain my loop and install it, I'll take apart the 295 one last time and add an extra layer of 0.5mm thermal pads to the VRMs to ensure contact. Could be that my block was milled badly... Who knows...

BTW, The Evil Within is scary as all hell. I recommend the game!


----------



## estu87

Hi, does anyone know how to fix this dxdiag memory read error? It shows only 181MB of memory.









Display Devices

Card name: AMD Radeon R9 200 Series
Manufacturer: Advanced Micro Devices, Inc.
Chip type: AMD Radeon Graphics Processor (0x67B9)
DAC type: Internal DAC(400MHz)
Device Key: Enum\PCI\VEN_1002&DEV_67B9&SUBSYS_0B2A1002&REV_00
Display Memory: 181 MB
Dedicated Memory: 4031 MB
Shared Memory: 246 MB

And I suspect this read error is what makes FF XIII fail to boot on my PC, because it always says there is not sufficient VRAM.


----------



## ljreyl

Quote:


> Originally Posted by *estu87*
> 
> hii, does anyone know how to fix dxdiag memory read error, because it shows only 181mb memory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Display Devices
> 
> Card name: AMD Radeon R9 200 Series
> Manufacturer: Advanced Micro Devices, Inc.
> Chip type: AMD Radeon Graphics Processor (0x67B9)
> DAC type: Internal DAC(400MHz)
> Device Key: Enum\PCI\VEN_1002&DEV_67B9&SUBSYS_0B2A1002&REV_00
> Display Memory: 181 MB
> Dedicated Memory: 4031 MB
> Shared Memory: 246 MB
> 
> and i suspect this read error is what makes ff xiii failed to boot on my pc, because it always says not sufficient vram.


http://steamcommunity.com/app/292120/discussions/0/613938693093404302/
http://forums.eu.square-enix.com/showthread.php?t=17327

Other users are having problems with this game. It's not your card, it's the game.

As for the 181MB of memory, I suspect that's what your card is using just to display normal items (browsing the internet, etc.).
It shows you have 4GB dedicated, which is correct.


----------



## estu87

Quote:


> Originally Posted by *ljreyl*
> 
> http://steamcommunity.com/app/292120/discussions/0/613938693093404302/
> http://forums.eu.square-enix.com/showthread.php?t=17327
> 
> Other users are having problems with this game. It's not your card, it's the game.
> 
> As for the 181 memory, I suspect that's what your card is using just to display normal items (browsing the internet, etc.)
> It shows you have 4GB dedicated which is correct.


I know, I already read all those Steam forum threads about the VRAM problem, and yes, the game is a horrible port, even worse than Dark Souls.

I'll try to reinstall my 6990 when I get back home to see if dxdiag reports the same error. My work PC here uses a 4870 1GB and reports 2808MB of display memory; I didn't test the game, though.


----------



## atnadeb

Hi guys, I am trying to replace the stock fan on my 295x2 with a Noctua NF-F12 (which is a 4 pin fan). I bought an extension which lets me connect it to the 3 pin header of the fan on the 295x2. However the fan does not work when connected to my 295x2 fan header. The fan itself looks fine because it works when I connect it to the motherboard (with or without the extension)

Has anyone successfully used this fan? Will there be any problems if I run the fan off the motherboard?


----------



## electro2u

Quote:


> Originally Posted by *atnadeb*
> 
> Hi guys, I am trying to replace the stock fan on my 295x2 with a Noctua NF-F12 (which is a 4 pin fan). I bought an extension which lets me connect it to the 3 pin header of the fan on the 295x2. However the fan does not work when connected to my 295x2 fan header. The fan itself looks fine because it works when I connect it to the motherboard (with or without the extension)
> 
> Has anyone successfully used this fan? Will there be any problems if I run the fan off the motherboard?


Hiya. I guess the PWM fan you have doesn't like the 12V feed from the 295x2. It will be fine to run it off the motherboard.


----------



## atnadeb

Quote:


> Originally Posted by *electro2u*
> 
> Hiya. I guess the pwm fan you have doesn't like the 12v feed from the 295x2. It will be fine to run it off the motherboard.


Thanks for the reply. I have the fan running at 100% now. Will look into something like SpeedFan to map the fan speed to the GPU temperature.
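For what it's worth, the mapping a tool like SpeedFan applies is just a fan curve. A linear sketch of the idea below; the (temp, duty) breakpoints are made-up placeholders, not recommended values, and this snippet only computes the mapping — it doesn't actually talk to any fan header:

```python
# Sketch of a linear fan curve: map a GPU temperature to a fan duty cycle.
# The (temp_c, duty_%) breakpoints here are illustrative placeholders only.
def fan_duty(temp_c, idle=(40, 30), full=(65, 100)):
    """Linearly interpolate fan duty (%) between an idle breakpoint and a
    full-speed breakpoint; clamp outside that range."""
    t0, d0 = idle
    t1, d1 = full
    if temp_c <= t0:
        return d0
    if temp_c >= t1:
        return d1
    return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(30))    # 30 -> quiet at idle
print(fan_duty(52.5))  # 65.0 -> halfway up the curve
print(fan_duty(70))    # 100 -> full blast well before the card throttles
```

Running the fan flat out while gaming, as discussed above, is just the degenerate case where the full-speed breakpoint sits at a low temperature.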


----------



## Kraius

I'd argue you'd actually get better results by managing the radiator fan separately from the 295x2 (using the motherboard or something else automatic), since you can't control its speed, and those stock fans (and the card itself) seem to try to straddle the balance between efficient and quiet. You really want the fan going all out when it's a question of thermal throttling vs. noise during gaming (which you probably wouldn't hear over explosions).

I moved both of mine over to the motherboard and let the ASUS Fan Xpert 3 software manage them based off the temp of the radiator (using a thermistor also plugged into the motherboard and taped to the radiator). Works well. When I had one 295x2 running its stock fan speed while the other was running off the motherboard, the motherboard-managed one had cooler temps (other variables could certainly play into this).


----------



## atnadeb

Quote:


> Originally Posted by *Kraius*
> 
> I'd argue you'd actually get better results by managing the radiator fan separate from the 295x2 (using the motherboard or something else automatic). Since you can't control it's speed and those stock fans (and the card itself) seem to try and straddle the balance between efficient and quiet. You really want the fan going all out when it's a question of thermal throttling vs noise during gaming (which you probably wouldn't hear over explosions).
> 
> I moved mine both over to the motherboard and let the Asus xpert 3 software manage it based off of the temp of the radiator (using a thermistor also plugged into the motherboard and taped to the radiator). Works well. When i had one 295x2 running stock speed while the other was running off of the motherboard, the motherboard-managed one had cooler temps (other variables could certainly play into this).


I agree. I want the fans going full blast when gaming. Is a thermistor the way to go, or is there a software alternative I can use to control fan speed based on GPU temperature? Could you link the thermistor you have?


----------



## Kraius

I haven't looked to see if there is a software alternative that can look at GPU utilization/temp and affect fan speeds. That'd be pretty cool. For me it was low-hanging fruit to just use the existing stuff that came with my motherboard. The thermistors I'm using are from Asus:
http://us.estore.asus.com/index.php?l=product_detail&p=6789

They hook into a fan extension board that came with my motherboard, but if you have any T_Sensor ports on your motherboard you could plug them directly into those. It really depends on what you have.


----------



## estu87

Quote:


> Originally Posted by *ljreyl*
> 
> http://steamcommunity.com/app/292120/discussions/0/613938693093404302/
> http://forums.eu.square-enix.com/showthread.php?t=17327
> 
> Other users are having problems with this game. It's not your card, it's the game.
> 
> As for the 181 memory, I suspect that's what your card is using just to display normal items (browsing the internet, etc.)
> It shows you have 4GB dedicated which is correct.


Quote:


> Originally Posted by *estu87*
> 
> i know, already read all those steam forum about vram problem, and yes the game is a horrible port even worse than dark souls.
> 
> i'll try to reinstal my 6990 when i get back home to see the if dxdiag reporting same error, my work pc here using 4870 1gb and reported 2808mb display memory, didn't test the game though.


OK, so I installed my 6990 back.

The dxdiag report shows:

Display Devices

Card name: AMD Radeon HD 6900 Series
Manufacturer: Advanced Micro Devices, Inc.
Chip type: AMD Radeon Graphics Processor (0x671D)
DAC type: Internal DAC(400MHz)
Device Key: Enum\PCI\VEN_1002&DEV_671D&SUBSYS_31601682&REV_00
Display Memory: 2280 MB
Dedicated Memory: 2034 MB
Shared Memory: 246 MB

Now I can boot FF13, and Tom Clancy's Blacklist doesn't prompt the dialog box saying it doesn't support the R9 200 series.









As I suspected, the culprit is dxdiag.


----------



## atnadeb

Quote:


> Originally Posted by *Kraius*
> 
> I haven't looked to see if there is a software alternative that can look at GPU utilization/temp and affect fan speeds. That'd be pretty cool. For me it was low hanging fruit to just use the existing stuff that came with my motherboard. The thermistors i'm using are from Asus:
> http://us.estore.asus.com/index.php?l=product_detail&p=6789
> 
> They hook into a fan extension board that came with my motherboard but if you have any t_sensor ports on your motherboard you could plug them directly into that. it really depends on what you have.


I have the ASUS Z97-AR. It's not in the list of compatible motherboards in the link you gave me, but I suspect it should be compatible.


----------



## yifeng3007

So I recently bought two more Dell U2414Hs (so I have 3 of them now), but for some reason my system cannot recognize those monitors :C It just doesn't see them, like they are not even connected. I have all of them hooked up to the PC with one individual miniDP-to-DP cable (that came with the monitors) from one 295x2 to each monitor. I don't know if I explained it properly, but it looks something like this:
1st video card:
miniDP -> miniDP-DP -> DP monitor
miniDP -> miniDP-DP -> DP monitor
R9 295x2 miniDP -> miniDP-DP -> DP monitor
miniDP -> *empty*
DVI-D (Dual Link) -> *empty*
2nd video card:
miniDP -> *empty*
miniDP -> *empty*
R9 295x2 miniDP -> *empty*
miniDP -> *empty*
DVI-D (Dual Link) -> *empty*


----------



## joeh4384

Anyone cross-firing this with a 290x on air with stock cooling? My cards do not like being neighbors. When I run the Twin Frozr 290x in the bottom slot, the GPU temps are ok but the 295x2 VRMs throttle, causing it to lose its clocks for moments at a time. When I run the 290x on top, the 290x hits 95C.
Quote:


> Originally Posted by *yifeng3007*
> 
> So i recently bought two more Dell U2414H (so i have 3 of them now) but for some reason my system cannot recognize those monitors :C It just doesn't see them, like they are not even connected. I got all of them hooked up to the PC by 1 individual miniDP-DP (that came with the monitors) from one 295x2 to each monitor. I don't know, if i explained it properly, but it looks something like this:
> 1st video card:
> miniDP -> miniDP-DP -> DP monitor
> miniDP -> miniDP-DP -> DP monitor
> R9 295x2 miniDP -> miniDP-DP -> DP monitor
> miniDP -> *empty*
> DVI-D (Dual Link) -> *empty*
> 2nd video card:
> miniDP -> *empty*
> miniDP -> *empty*
> R9 295x2 miniDP -> *empty*
> miniDP -> *empty*
> DVI-D (Dual Link) -> *empty*


They are all connected to the same card?


----------



## yifeng3007

Quote:


> Originally Posted by *joeh4384*
> 
> Anyone cross-firing this with a 290x on air with stock cooling? My cards do not like being neighbors. When I run the Twin Frozr 290x on the bottom slot, the GPU temps are ok but the 295x2 VRMs throttle causing it to lose its clocks for moments at a time. When I run the 290x on top, the 290x hits 95


There are some people that have a Tri-Fire setup, but i completely forgot who, sorry. But they are frequent forum users, so i am sure you'll get an answer from them.
Quote:


> Originally Posted by *joeh4384*
> 
> They are all connected to the same card?


Yep, they are all connected to the same card. I somehow fixed the issue by simply unplugging the monitors and plugging them back in. But now i really want to use an Eyefinity setup, and i just hate that when i move any open window to the top, expecting it to snap to the size of just one monitor, it goes all the way across all three monitors >:C

A little update:
When i tried to configure Eyefinity, every time i connect the third monitor, it turns off. If i disconnect any other monitor, the last connected monitor turns back on again. ***?
Another update: every time i try to "identify displays" it only shows 2 monitors. When i press the "find" button, all the monitors restart; the two that worked normally continue to do so, but the third turns on and Windows says it is not a PnP monitor and i can't change anything about it. This AMD, man... I never knew i'd get so much trouble with the damn things >:'C
Update 3: well, somehow i got them all working, but every game i launch, even BF4, just goes triple, lol


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> Anyone cross-firing this with a 290x on air with stock cooling? My cards do not like being neighbors. When I run the Twin Frozr 290x on the bottom slot, the GPU temps are ok but the 295x2 VRMs throttle causing it to lose its clocks for moments at a time. When I run the 290x on top, the 290x hits 95
> They are all connected to the same card?


Indeed. I am running my 290X on the top of my 295X2. Only the 290X is air-cooled though and I am using a custom fan curve in Afterburner for the 290X to keep it from going past the 80's. I mean, logically speaking I would always give preference to the 295X2's cooling versus a single gpu 290X.


----------



## Mega Man

Quote:


> Originally Posted by *atnadeb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kraius*
> 
> I'd argue you'd actually get better results by managing the radiator fan separate from the 295x2 (using the motherboard or something else automatic). Since you can't control it's speed and those stock fans (and the card itself) seem to try and straddle the balance between efficient and quiet. You really want the fan going all out when it's a question of thermal throttling vs noise during gaming (which you probably wouldn't hear over explosions).
> 
> I moved mine both over to the motherboard and let the Asus xpert 3 software manage it based off of the temp of the radiator (using a thermistor also plugged into the motherboard and taped to the radiator). Works well. When i had one 295x2 running stock speed while the other was running off of the motherboard, the motherboard-managed one had cooler temps (other variables could certainly play into this).
> 
> 
> 
> I agree. I want the fans going full blast when gaming. Is a thermistor the way to go or is there a software alternative I can use to control fan speed based on gpu temperature? Could you link the thermistor you have?
Click to expand...

you are going to say "i don't want to spend that kind of money", but i will be the first to say neither did i. now that i own one (seven now) i will state i will NEVER build another pc without one.
AQ6 (Aquaero 6)! best way ever!
Quote:


> Originally Posted by *yifeng3007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Anyone cross-firing this with a 290x on air with stock cooling? My cards do not like being neighbors. When I run the Twin Frozr 290x on the bottom slot, the GPU temps are ok but the 295x2 VRMs throttle causing it to lose its clocks for moments at a time. When I run the 290x on top, the 290x hits 95
> 
> 
> 
> There are some people, that have Tr-Fire setup, but i completely forgot who, sorry. But they are frequent forum users, so i am sure you'll get the answer from them.
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> They are all connected to the same card?
> 
> Click to expand...
> 
> Yep, they are connected to the same card. I somehow fixed the issue by simply unplugging the monitors and plugging them back again. But now i really want to use Eyefinity setup, but i just hate when i move any open window to the top, expecting it to get to the size of just one monitor, it goes all the way to three monitors >:C
> 
> A little update:
> When i tried to configure the Eyefinity, every time i connect the third monitor, it turns off. If i disconnect any other monitor, the last connected monitor will turn back on again. ***?
> Another update: every time i try to "identify displays" it only shows 2 monitors, when i press the "find" button, all the monitors restart and the two, that worked normally continue to do so, but the third turns on and Windows says, that it is not a PnP monitor and i can't change anything about it. This AMD, man... I never knew i'll get so much trouble with the damn things >:'C
> Update 3: well, somehow i got them all working, but every game i launch, even BF4 just go triple, lol
Click to expand...

DP can be a pain, you need to verify the cables! there can also be monitor-specific issues, i have found


----------



## electro2u

Yah, maybe different monitor firmware. Idk. Multimonitor is a quagmire, red or green.


----------



## Mega Man

personally i don't trust many OEM cables (the ones that come with monitors)


----------



## yifeng3007

Quote:


> Originally Posted by *electro2u*
> 
> Yah maybe different monitor firmware. Idk. Multimonitor is a quagmire red or green.


Quote:


> Originally Posted by *Mega Man*
> 
> personally i dont trust many OEM cables ( ones that come with monitors )


Well, now that i've played around with the monitor and CCC settings i got everything working pretty well. But still, when i launch BF4 it doesn't look like "left monitor - left side view of the soldier, middle monitor - center view with HUD elements, right monitor - right side view". It is literally "left monitor - center view with HUD, middle monitor - center view with HUD, right monitor - center view with HUD". So it kind of looks like i have 3 copies of BF4 running at the same time, all showing the exact same picture, lol


----------



## Mega Man

Adjust the res, it happens to me when running in 1080p


----------



## yifeng3007

Quote:


> Originally Posted by *Mega Man*
> 
> Adj the res it happens to me when running in 1080p


Thanks a lot for the tip!
For some reason, when i enter the Test Range everything runs at like 1 fps per 3-4 secs and it is impossible to do anything. I managed to get into the campaign and adjust the resolution. Every time i make the game run in 1080p, it does exactly what you said: the game triples itself across all monitors. But when i set the resolution to 5760x1080, only the middle screen shows the game, as if it were running in 1080p, while the monitors on the sides show absolutely nothing but a black screen.


----------



## ljreyl

So, got some news on my build.
I took out the 120mm radiator and added a 240mm radiator. (Now have 600mm worth of radiator space, 360+240)
I took apart the 295x2 and mounted new thermal pads on the VRMs (the RAM was fine, I just put some TIM on top before remounting).
This time, I added an extra .5mm layer of pads on the VRMs and the controller (so 1.5mm instead of 1mm).
Also, for the backplate: there is a VRM controller on the rear of the PCB, so I added 1.5mm there as well.

Reinstalled, ran my tests, NO DOWNCLOCK. Overclocked to 1061/1400, 0% power limit, NO DOWNCLOCK.

BOOM.

Temps went down an additional 3-4c with the extra 120mm of radiator space.

I'm finally a happy camper.

I believe the VRMs weren't being fully cooled due to lack of thermal pad thickness (Xer0h0ur pointed this out).
I still haven't checked my power consumption via kill-a-watt, but Corsair Link shows up to 1005 watts used. That leaves 195 watts of headroom on my AX1200i.
As for amperage, I have 100A on a single 12v rail. More than enough for the system.

OH YEA!!
About microstutter on this system... I found microstutter and found how to turn it off...
DO NOT MONITOR THE VRM TEMPS.
I turned on my 290x VRM temp monitor in HWInfo64. As I was running benches and games, I noticed stutter every 2 seconds.
I checked to make sure frame pacing was on, put CCC to default, etc.
I then went to HWInfo and unticked "Show" for GPU/VRAM VRM temps. Still stuttered. Went back and unticked "Monitor" and BOOM, stutter gone.

Hope that info helps any of you that are getting stutter when monitoring VRM temps.


----------



## joeh4384

Nice. Do you have pictures of your updated rig?
Quote:


> Originally Posted by *xer0h0ur*
> 
> Indeed. I am running my 290X on the top of my 295X2. Only the 290X is air-cooled though and I am using a custom fan curve in Afterburner for the 290X to keep it from going past the 80's. I mean, logically speaking I would always give preference to the 295X2's cooling versus a single gpu 290X.


I figured as much that I need to upgrade the cooling on the 290x to get the heat away from the 295x2. I like the looks of that corsair bracket and probably will go with that when it releases rather than give away my 290x for crap.


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> OH YEA!!
> About microstutter on this system... I found microstutter and found how to turn it off...
> DO NOT MONITOR THE VRM TEMPS.
> I turned on my 290x VRM temp monitor in HWInfo64. As I was running benches and games, I noticed stutter every 2 seconds.
> I checked to make sure frame pacing was on, put CCC to default, etc.
> I then went to HWInfo and unticked "Show" for GPU/VRAM VRM temps. Still stuttered. Went back and unticked "Monitor" and BOOM, stutter gone.
> 
> Hope that info helps any of you that are getting stutter when monitoring VRM temps.


Some good info there. Glad you got your cooling sorted out.

I noticed something about Afterburner today:
My MSI 295x2 had voltage adjustment completely greyed out, and the way to get access to it was to enable CSM (Compatibility Support Module) in the motherboard BIOS.


----------



## ljreyl

Quote:


> Originally Posted by *joeh4384*
> 
> Nice. Do you have pictures of your updated rig?




Here's a cell picture I took yesterday.

Res > D5 > 240MM > 4790k > 360mm > 295X2 > 290X > Res

Both radiators are exhausting out, with the rear and bottom fans taking in air. (Front has 1 fan in push, other in pull. Waiting for another NF F12 to make the top portion push/pull and the bottom just pull.)
It's a negative pressure flow. (218CFM Out, 187CFM in)

Not sure if I should make the front as intake and the rear as exhaust though. I don't like the idea of taking in hot air. It seems like Syceo/Electro2U's rig is good though... Would it make a cooling difference?
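For what it's worth, the pressure balance above is just rated intake minus rated exhaust (real delivered airflow will be lower once filters and radiators are in the way):

```python
# Net case pressure from rated fan CFM. Spec-sheet ratings overstate
# delivered airflow, so treat the sign as meaningful, not the exact number.

def net_cfm(intake_cfm: float, exhaust_cfm: float) -> float:
    """Positive result = positive case pressure; negative = negative."""
    return intake_cfm - exhaust_cfm

print(net_cfm(187, 218))  # → -31, the mildly negative setup described above
```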


----------



## electro2u

Well hmm it supposedly does for the coolant temp but it will definitely affect your case temp. I have one rad in push pull intake and the 360 up top is just push exhaust. I've been told I should switch the 360 to intake but I'm happy.

Btw. I just switched my cpu block from XSPC Raystorm to a full monoblock EK cpu/vrm/pch block... And my flow went up a good amount  the Raystorm isn't clogged either. Just restrictive. It affected my temps a lot.


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> Nice. Do you have pictures of your updated rig?
> I figured as much that I need to upgrade the cooling on the 290x to get the heat away from the 295x2. I like the looks of that corsair bracket and probably will go with that when it releases rather than give away my 290x for crap.


It's not prettied up or anything since I didn't care about cable management. I just wanted to eliminate a metal plate and remove two HDD cages, but this is how it's currently sitting until I extend the loop and slap a matching EK block and backplate on the 290X.


----------



## Mega Man

Quote:


> Originally Posted by *yifeng3007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Adj the res it happens to me when running in 1080p
> 
> 
> 
> Thanks a lot for the tip!
> For some reason, when i enter the Test range everything works like 1 fps per 3-4 secs and it is impossible to do anything. I managed to get into the campaign and adjust the resolution, so every time i make the game to run in 1080p, it does exactly what you said: the game triples itself across all monitors. But when i set the resolution to 5760x1080, there is only one middle screen, that shows the game as if it would run in 1080p, while the monitors on the sides show absolutely nothing, but a black screen.
Click to expand...

i have this issue with random games (looking at you, batman, idr which one off the top of my head)


----------



## yifeng3007

Quote:


> Originally Posted by *Mega Man*
> 
> i have this issue with random games ( looking at you batman idr which off the top of my head )


But BF4 HAS to work fine, since it is an AMD-sponsored game or whatever! But i just can't make it run on all my monitors! It either launches itself in 1080p on all three monitors at the same time, if i choose 1920x1080 in the settings, or it runs in 1080p on just the middle monitor, if i choose 5760x1080 in the settings tab







However, Crysis 3 and Metro: LL work absolutely flawlessly in eyefinity, all maxed out with AA on medium!


----------



## Sploosh

Hey folks.

I currently have an R9 295x2 in my system that replaced my old Gigabyte R9 290 OC. I'm interested in continuing to use my R9 290 in tri-fire, and when I do so I'm planning on removing the Gigabyte windforce cooler and putting on a Kraken G10 bracket and a 280mm rad.

Perhaps in the future I'd look at a complete custom loop, but it really depends on money and things.

But regardless! The question I wanted to ask was - Would a 1350W Silver PSU work for a Tri-fire setup?

Currently I'm running a 4960k w/ an H110 cooler and 4x 4gb sticks of G.Skill Ares RAM (1600 right now, rated for 1866), plus a Blu-ray optical drive, a 1TB SSD and a 2TB internal HDD.


----------



## xer0h0ur

Unless you're running insane overclocks system-wide you should not have any trouble running those two cards together.

Edit: More important than the wattage is the amperage and whether the PSU is multi-rail or single 12V rail. I would suggest going with a single rail design over multi-rail but that is just one man's opinion.


----------



## Sploosh

I'm not much of an overclocker - yet. Right now I just want to make the most of my existing hardware so I can count on it for the next year or two hopefully. And having things run cool and relatively quiet is always nice.

The amp specs are: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]

My current supply is 62 Amps on one 12v rail, which is nice. Only 1000W capacity though. I'd prefer the single rail, but it's a nice deal on the power supply.


----------



## xer0h0ur

Yeah you should be fine then. You're just going to need to figure out which 8-pins go to which 12V rail so that you have the 295X2 connected by itself on one 12V rail while the 290X is connected on the other 12V rail. You also want to connect any other 12V components to that 290X rail since the 295X2 alone demands 50A.
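A rough way to sanity-check the rail math. The wattage figures here are ballpark assumptions (roughly 500W for the 295X2, 300W for a 290X, 150W for the CPU), not measured numbers; a kill-a-watt reading is the real answer:

```python
# Back-of-the-envelope 12V budget. Component wattages are assumptions,
# not measurements, so use this only as a sanity check against rail specs.

def rail_amps(watts: float, volts: float = 12.0) -> float:
    """Convert a 12V load in watts to the amps it pulls from the rail."""
    return watts / volts

load_w = 500 + 300 + 150   # 295X2 + 290X + CPU, all assumed figures
print(round(rail_amps(load_w), 1), "A")  # → 79.2 A
```

That total comfortably fits a single 100A rail; on a multi-rail unit it's why the 295X2's roughly 50A needs a rail to itself.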


----------



## axiumone

VRM fan control mod video is up - 




It's my first try at this, so go easy









Hopefully someone will still find it useful!


----------



## DividebyZERO

Quote:


> Originally Posted by *axiumone*
> 
> VRM fan control mod video is up -
> 
> 
> 
> 
> It's my first try at this, so go easy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully someone will still find it useful!


repped!! Thank you for your effort to help others.


----------



## axiumone

Thank you!

By the way, are you still having any issues with BF4 in crossfire? I finally found a fix for it! It totally eliminated any stuttering for me, the game is buttery smooth now in crossfire + eyefinity.


----------



## joeh4384

Thanks for the video +1 rep.


----------



## electro2u

Do tell!


----------



## axiumone

Alrighty, so let me preface this with a warning. After reading everything about this on the BF4 forums, you do risk the chance of possibly getting banned by EA. They consider this modifying the game, akin to the sweetfx mod we have. So use this at your own risk.

Apparently there's a memory leak in the game and we can fix it by doing the following. Go to your battlefield 4/data/win32 folder. In there is a file called "weaponchunks.sb", if you rename this file to anything else, ALL of the stuttering was fixed for me. I just couldn't believe how smooth the game was running in quad crossfire.

So anyway, there is a small drawback, besides the possibility of getting banned. You will lose some of the sound effects. I've noticed that my defib sound, MBT engine and quad engine sounds were gone. I'm sure there are others, but I wasn't paying too much attention. In my opinion, a small price to pay to actually have the game in a state that I could enjoy.

BTW - I didn't "create" this fix. I don't really know who did, so I can't give proper credit. I just found it among lots of other info on BF4 forums.
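If you'd rather not rename the file by hand, the same change can be scripted; this sketch keeps a `.bak` copy so it's trivial to undo before a repair or patch. The install path is a placeholder you'd point at your own Battlefield 4 folder, and the same ban-risk caveat applies:

```python
# Rename weaponchunks.sb as described above, keeping a .bak copy so the
# change is reversible. Use at your own risk (EA may treat this as a mod).
from pathlib import Path

def rename_weaponchunks(bf4_dir: str) -> Path:
    """Rename <bf4_dir>/data/win32/weaponchunks.sb to weaponchunks.sb.bak."""
    src = Path(bf4_dir) / "data" / "win32" / "weaponchunks.sb"
    dst = src.with_name(src.name + ".bak")   # weaponchunks.sb.bak
    src.rename(dst)                          # rename, don't delete
    return dst
```

To undo it, just rename the `.bak` file back, or let Origin's "repair" re-download the original.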


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, are you still having any issues with BF4 in crossfire? I finally found a fix for it! It totally eliminated any stuttering for me, the game is buttery smooth now in crossfire + eyefinity.


No, sir, thank you!
So, just by maxing up the fan speed you eliminated BF4 stuttering? Wow, srsly, AMD, what the hell?


----------



## yifeng3007

Quote:


> Originally Posted by *axiumone*
> 
> Alrighty, so let me preface this with a warning. After reading everything about this on the BF4 forums, you do risk the chance of possibly getting banned by EA. They consider this modifying the game, akin to the sweetfx mod we have. So use this at your own risk.
> 
> Apparently there's a memory leak in the game and we can fix it by doing the following. Go to your battlefield 4/data/win32 folder. In there is a file called "weaponchunks.sb", if you rename this file to anything else, ALL of the stuttering was fixed for me. I just couldn't believe how smooth the game was running in quad crossfire.
> 
> So anyway, there is a small drawback, besides the possibility of getting banned. You will lose some of the sound effects. I've noticed that my defib sound, MBT engine and quad engine sounds were gone. I'm sure there are others, but I wasn't paying too much attention. In my opinion, a small price to pay to actually have the game in a state that I could enjoy.
> 
> BTW - I didn't "create" this fix. I don't really know who did, so I can't give proper credit. I just found it among lots of other info on BF4 forums.


Oh, ok, i just noticed this message.
Now i have another question: srsly, DICE, what the hell? Is it already time to shout "DICE, FIX THE GAME"?


----------



## Nichismo

Not sure if anyone has already said or noticed this, or if it's even worth mentioning, but the ASUS ARES III is now on Newegg and Amazon. At 1500$ too; i'm quite surprised, I figured it would have been considerably higher. I feel this is a pretty darn good price, considering the 295x2 ref is about 1000$ everywhere, and a waterblock and backplate would run about 250-300$ and would also void the warranty.

I'm pretty tempted to purchase one of these, however I only run at 2560x1080 res, and I already recently purchased the first of the 2 780 Tis I had planned (and potentially a third down the line), along with a waterblock for it. Also, given the card's size, i'd have to completely dismantle my current GPU plumbing and redesign it... And even though I really like the idea of a single-slot design, i'm worried about the weight of the card and the potential stress it might place on other components.

But man, I can't stop looking at the darn thing! so appealing....


----------



## Sgt Bilko

Sign me up!



XFX R9 295x2 at stock (1018/1250)


----------



## joeh4384

Nice, I have the XFX one too.


----------



## boss9967

The 295x2 stock water cooling system works very well, but I wanted to use my KOOLANCE ERM-3K4U with my 295x2. I set 2 old Koolance GPU-180-H06 blocks on the card, and now it runs very nice playing Alien: Isolation at 2560x1600, all ultra, temp max 53c. here some pics


----------



## Mega Man

both of my 295x2s have ek blocks waiting @!


----------



## xer0h0ur

Isn't it bad to use dyed coolants?


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Isn't it bad to use dyed coolants?


Yes, it can be. It actually slightly reduces cooling effectiveness vs distilled water alone, though not really enough to notice. I have used dyed coolant and distilled, and I prefer distilled for a few reasons. The dye will gunk up your blocks (depending on the dye brand) and force you to tear down your loop and blocks to clean them well. Another bad thing is that if there is ever a leak it can stain your parts, carpet, desk, etc. The last thing is tubing: on plastic tubing it can cause hazing and give the tube a cloudy look. I found that even Primochill rigid acrylic tubing will get the cloudy look, though E22 tubing doesn't seem to. For those reasons I just run distilled water in my loops. The only time I may use dye is if I am going to take pictures of the rig or go to my buddy's LAN center; then I will use it for the few hours, but other than that I drain my loops right back out, flush real well, and put distilled water back in.


----------



## Someone09

Just ordered one of these bad boys for 750€. Can't wait to see how it performs compared to my current 780 Ti SLI.


----------



## StillClock1

Alright, I've concluded that it wasn't the driver; I fixed the system clock by resetting the BIOS and doing a clean Windows 7 install on a brand new 840 EVO.

Rome 2 Total War
1 GPU is being utilized, temps in the low 60sC

Arma III
Both GPUs at load, 74C on both, 100% utilization on both

I am running 6,000x1,920 across 5x24 inch monitors with game settings near ultra (vsync is enabled).

Are my temps a product of having settings that are too high?

Are these temps worrisome? I just checked all the fans and airflow in my case is good.

I might add another fan, but I would need to move my radiator to fit another fan at the top.


----------



## xer0h0ur

The best you're going to manage with the stock cooler is to stick a 2nd fan on the radiator for a push/pull setup and that will drop your temps a few degrees at best. Other than that you can do axiumone's VRM fan mod to control that fan yourself.

When you say GPU utilization is at 100%, did you scroll down in Afterburner to check if your clock speeds are throttling? Since it's at 74C I wouldn't be surprised if it is throttling the clocks.


----------



## StillClock1

I will look, in the event it is throttling the clocks and I see lots of ups and downs in the chart - what should I do?


----------



## xer0h0ur

Like I said, add a 2nd fan on the 295X2's radiator for a push/pull setup. If that doesn't back off the temps enough to stop throttling then you really have no other option aside from putting a waterblock on the card in a custom loop.


----------



## StillClock1

Understood, is throttling of the clock bad for the long-term health of the card?


----------



## xer0h0ur

Oh no, not even in the slightest. In fact, the reason it's underclocking is to reduce the temperature. In reality though, AMD set the temp limit too low on the 295X2 in my opinion, since these exact same Hawaii cores have a 95C temp limit in the air-cooled 290X's. It's quite dumb they decided to limit the 295X2's Hawaii cores to 75C just because they slapped an Asetek AIO cooler on the cores.

Basically it's only enthusiasts like ourselves here, who want to push our hardware to its limits without any throttling, that slap full coverage waterblocks on these things. In other words, you're just giving up a few frames per second here or there by allowing the GPU clocks to throttle. It's absolutely and without a doubt NOT causing any damage whatsoever though.
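A toy model of that behavior: over the limit the card trades clocks for temperature instead of taking damage. Only the 75C limit (vs 95C on air-cooled Hawaii) and the stock 1018MHz clock come from the thread; the backoff slope and floor clock are made-up illustrations:

```python
# Toy throttle model. The 13 MHz/degree slope and the 727 MHz floor are
# illustrative assumptions, not measured PowerTune behavior.

def effective_clock(temp_c: float, target_mhz: int = 1018) -> int:
    """Return the clock the card would hold at a given core temperature."""
    LIMIT_C = 75                 # 295X2 throttle point (290X on air: 95C)
    if temp_c <= LIMIT_C:
        return target_mhz        # under the limit: full boost clock
    # assumed backoff of ~13 MHz per degree over the limit, with a floor
    return max(target_mhz - int((temp_c - LIMIT_C) * 13), 727)

print(effective_clock(70), effective_clock(80))  # → 1018 953
```

The point being: past 75C you lose a few percent of clocks, not hardware lifespan.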


----------



## joeh4384

I think AMD should have set the throttle point at 80 - 85.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh no not even in the slightest. In fact the reason its underclocking is for the sake of reducing the temperature. In reality though AMD set the temp limit too low on the 295X2 in my opinion since these exact same Hawaii cores have a 95C temp limit in the air cooled 290X's so its quite dumb they decided to limit the 295X2's Hawaii cores to 75C just because they slapped some Asetek AIO cooler on the cores.
> 
> Basically its only enthusiasts like ourselves here that want to push our hardware to its limits without any throttling that slap full coverage waterblocks on these things. In other words you're just giving up a few frames per second here or there by allowing the GPU clocks to throttle. Its absolutely and without a doubt NOT causing any damage whatsoever though.


I wondered about the thermal limit for a while as well. Why would this card be limited to 75c when all of the other Hawaii chips are set at 95c? I don't remember where I read it, but the reason for that is the pumps. From what I remember, the pumps can't handle temperatures that high.


----------



## xer0h0ur

I fully agree. By using a 20C lower max temp it really seems to me as if AMD didn't bother to actually test the 295X2 at high resolution or eyefinity for that matter which further stresses the GPU temps.


----------



## xer0h0ur

Quote:


> Originally Posted by *axiumone*
> 
> I wondered about the thermal limit for a while as well. Why would this card be limited to 75c when all of the other hawaii chips are set at 95c? I don't remember where I read it, but the reason for that are the pumps. From what I remember the tumps can't handle temperatures that high.


Holy hell. If there is any truth behind that, it means AMD would rather sacrifice the performance of their product than to use a proper cooling solution. Not sure who I would be more pissed at, AMD or Asetek for offering up a garbage solution that can't even handle what it was "designed" for.


----------



## StillClock1

What kind of load temps are you guys getting on the stock cooler?


----------



## GreenGoblinGHz

Yo.
My R9 295X2 build. I keep changing it quite a lot at the mo... been doing that since launch. Also soon blocking the card and adding a decent rad to it. Really, a 120 for the beast?







...

I'm using a Trifire configuration on this rig (I call it the Intel rig). Chip is an oc'd i5-4690k. PSU is 1250w. Third gpu is XFX's R9 290A (flashed, blocked and hooked into the loop)


----------



## xer0h0ur

Quote:


> Originally Posted by *Hoff248*
> 
> What kind of load temps are you guys getting on the stock cooler?


Problem is that you're running a 5 monitor eyefinity setup so you would need to compare with users running a similar setup otherwise its not a fair comparison. You should be running higher temps than someone using a single monitor unless its a 4K monitor.


----------



## StillClock1

Damn that's nice looking.


----------



## StillClock1

Okay, that's good to know - since I've been finding my temps higher than most reviews and other people's posts but that's because I'm trying to push my GPUs harder I suppose.


----------



## magicdave26

Quote:


> Originally Posted by *xer0h0ur*
> 
> I fully agree. By using a 20C lower max temp it really seems to me as if AMD didn't bother to actually test the 295X2 at high resolution or eyefinity for that matter which further stresses the GPU temps.


Just got one of these myself. I have a 3-way Eyefinity setup, and playing C3 for quite a while with a modest OC of 1100/1350, +10mV, +30PL, I never saw the temps pass 69c.


----------



## joeh4384

I get upper 60s when I run a benchmark on the stock cooler at 1440p. I also replaced the fans with Corsair SP120 Performance Editions.


----------



## xer0h0ur

I have noted that certain games load the GPUs more than others. Note that his problem has been while his GPUs are fully loaded. If you're using/playing something that has inconsistent GPU usage then your temps will absolutely and without a doubt not be anywhere near as high.

My gaming temps even though I am waterblocked (295X2 only) are inconsistent from game to game all depending on the GPU usage. I run a modest overclock without any increase to power limit or voltage. Well, modest on the cores. vRAM is at 1500MHz and can push to 1700 without problems.


----------



## joeh4384

I have noticed that too. So far on my system, Far Cry 3 and Heaven benchmark are pretty consistent at loading them up close to 100%.


----------



## Someone09

My 295x2 arrived earlier today.
Currently putting it through its paces.
However, it does perform slightly worse than my 780 Ti SLI... but it runs much cooler.

Also, what tool(s) do you guys use to monitor temps/usage/etc.?
For some reason my usual monitoring tools don't work properly with this card.


----------



## xer0h0ur

I don't know why you made a lateral move. You could get better performance out of two 780 TI's or two 290X's than a 295X2. The entire purpose of this card really is to get two GPUs through a single PCIe slot. In my case it allowed me to Tri-Fire with only having two PCIe slots.


----------



## Someone09

As weird as it sounds, it was the easiest (= cheapest) way to get my overall temperatures down.

Also, more VRAM.

EDIT: Plus, I could still return it if it doesn't suit my needs.


----------



## Elmy

Quote:


> Originally Posted by *Someone09*
> 
> As weird as it sounds it was the easiest (= cheapest way) to get my overall temperatures down.
> 
> Also, more VRAM.
> 
> EDIT: Plus, I could still return it if it doesn´t suit my needs.


Welcome to the club! These 295X2's are beasts. I have 2 of them and they are running flawlessly right now, pushing 5400x1920 across 5 Asus VG248QE monitors.


----------



## StillClock1

Elmy, I have a similar setup (albeit at half your GPU horsepower). I have 6000x1920 on 5 Dell 2412Us.

How are your temps? Are you water cooling or do you use the two stock coolers? I am kind of considering adding another 290X (it's not like I haven't already spent too much on my computer) if the added GPU would reduce the overall workload and keep the temps down. I currently have 1 R9 295x2 with stock cooling.


----------



## Someone09

So....been playing around with this thing a bit.

Compared to my 780 Ti SLI it's surely lacking a bit of horsepower, BUT for some reason my FPS seem to be more stable.
Which means with the Tis my max FPS (and probably the avg FPS too) were higher, but the minimum FPS on the 295x2 is much higher.

The stock fan is bugging me a bit, though; it's way too loud at idle for my liking. So I hooked up two SP120s in push/pull, but my temps went up about 5°C. Which isn't bad, but they made this annoying clicking noise when connected to the card.
So, what fans are you guys using?


----------



## joeh4384

Are the SPs Performance or Quiet Edition? I use two SP120 Performance Editions and they lowered my temps 3-5 degrees.


----------



## Someone09

Quiet edition.

But I could live with the temps...that damn clicking noise though.

Talking about fan noise, why won't the stock fan slow down to its minimum RPM? I mean, you really don't need more than that at idle.


----------



## Elmy

Quote:


> Originally Posted by *Hoff248*
> 
> Elmy, I have a similar setup (albeit at half your GPU horsepower). I have 6000x1920 on 5 Dell 2412Us.
> 
> How are your temps? Are you water cooling or do you use the two stock coolers? I am kind of considering adding another 290x (its not like I haven't already spent too much on my computer) if the added GPU would reduce overall workload and keep the temps down. I currently have 1 R9 295x2 with stock cooling.


Temps are in the 60s because of a bad radiator (I had the copper fins nickel-plated, which reduced the rad's thermal efficiency). Re-doing everything here soon: both Club3D 295X2's will be on their own triple Coolgate rad with 6 Enermax fans in push/pull. I am using EK Nickel/Plexi waterblocks connected to an XSPC 480 rad. Also going to 2 Corsair AX1500i's, because when I push my system in certain games the Kill A Watt starts reading 1600 watts and the PSU's OCP kicks in... I don't know if an EVGA 1600 G2 would do any better. Going to run 2 AX1500i's in my new build as well, in anticipation of AMD 390X's, so I can push them to the brink of destruction... LoL


----------



## xer0h0ur

Quote:


> Originally Posted by *Elmy*
> 
> Temps are in the 60's because of a bad radiator ( I had the copper fins nickel plated which reduced the thermal efficiency of the rad ) . Re-doing everything here soon and both Club3D 295X2's will be on their own triple Coolgate rad with 6 Enermax fans in push/pull. I am using EK Nickel/Plexi waterblocks connected to a XSPC 480 rad. Also going to 2 Corsair AX1500i's because when I push my system in certain games the kill-a-watt starts reading 1600 watts and the PSU OCP kicks in.... I don't know if EVGA 1600 G2 would do any better.... Going to run 2 AX1500i's in my new build also in anticipation for AMD 390X's So I can push them to the brink of destruction.... LoL


You must be sponsored or a baller lol. Either way I envy you.


----------



## ColeriaX

Thought I'd drop in and see how the club was doing. Just finally got to test out Firestrike Ultra and it definitely taxes the card.


----------



## joeh4384

Quote:


> Originally Posted by *Hoff248*
> 
> Elmy, I have a similar setup (albeit at half your GPU horsepower). I have 6000x1920 on 5 Dell 2412Us.
> 
> How are your temps? Are you water cooling or do you use the two stock coolers? I am kind of considering adding another 290x (its not like I haven't already spent too much on my computer) if the added GPU would reduce overall workload and keep the temps down. I currently have 1 R9 295x2 with stock cooling.


I tried crossfiring a 290X with a 295X2, and you need adequate airflow or the 290X's heat will cause the 295X2's VRMs to overheat and make the card drop its clock down to 300MHz. This happened to me when I ran the 290X below the 295X2. There is also serious heat coming off the backplate of the 295X2, which in my case caused the 290X to overheat when it ran above the 295X2. I didn't want to deal with a loud fan profile on the 290X to prevent this, so I just abandoned the idea. My computer seems to run plenty smooth with 2 GPUs and it isn't too loud either. If you want to add a 290X, I would look into water cooling it, or doing the Kraken G10 or Corsair HG10 (comes out soon for AMD 290 reference cards) mod on the 290X.


----------



## Elmy

Quote:


> Originally Posted by *xer0h0ur*
> 
> You must be sponsored or a baller lol. Either way I envy you.


Both. I am sponsored, and I own my own electrical company out here in Seattle.

My build now was sponsored by Club3D, Enermax, Crucial, Adata, EKWB and Azzatek.





GO HAWKS!!!!


----------



## Someone09

Looks like I will have to switch back to my 780 Tis until I get a better-shielded DP cable.

So, see you in a few days.


----------



## Orivaa

Hello peeps.
I've been considering buying a 295x2 for a while now, but my wallet isn't very deep. My options for manufacturers are Sapphire, XFX, and Club3D. (I live in Europe.)
I was considering buying Sapphire, which is slightly more expensive than Club3D and slightly less expensive than XFX. However, I've read around 150 pages of the thread, and it seems a lot of people have been complaining about Sapphire. Are either XFX or Club3D any better? I won't have the money to buy a new cooler or anything like that, btw. At most I'll get a cable to unlock the VRM fan controls.

So yeah, any advice on which to buy?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Orivaa*
> 
> Hello peeps.
> I've been considering buying a 295x2 for a while now, but my wallet isn't very deep. My options for manufactures are Sapphire, XFX, and 3D Club. (I live in Europe.)
> I was considering buying Sapphire, which is slightly more expensive than 3D Club, and slightly less expensive than XFX. However, I've read around 150 pages of the thread, and seems a lot of people have been complaining about Sapphire. Are either XFX or 3D Club any better? I won't have the money to buy a new cooler or anything like that, btw. At most I'll get a cable to unlock the VRM fan controls.
> 
> So yeah, any advice on which to buy?


Whichever has the best warranty.......they are all the same card just different stickers


----------



## Orivaa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Whichever has the best warranty.......they are all the same card just different stickers


Really? Then why did people complain about the Sapphire version as if it were a lot worse than the others?


----------



## axiumone

Quote:


> Originally Posted by *Orivaa*
> 
> Really? Then why did people complain about the Sapphire version as if it were a lot worse than the others?


Not sure why. It's kind of ironic, actually, since Sapphire is the only brand that has its own factory to manufacture these video cards. They produce the 295x2 for every other brand as well.


----------



## Orivaa

Quote:


> Originally Posted by *axiumone*
> 
> Not sure why. It's kind of ironic actually. Since sapphire is actually the only brand that has their own factory to manufacture these videocards. They are the ones that produce the 295x2 for every other brand as well.


Do they make Asus' 295x2 cards as well? Then why the heck is it 50% more expensive than Sapphire's?


----------



## Orivaa

New question: I am not at all familiar with water-cooling. I know there's something called a "water block" you can put on your card to get better temps and some such. I've seen a bit about it, but I'm confused. Usually, you'd need a water-cooling setup for this, but since the 295x2 already uses water-cooling, can you just "slap it on"?

Also, finally, would it be better to place the radiator at the top exhaust or rear exhaust? Thanks ^_^


----------



## Sgt Bilko

Quote:


> Originally Posted by *Orivaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *axiumone*
> 
> Not sure why. It's kind of ironic actually. Since sapphire is actually the only brand that has their own factory to manufacture these videocards. They are the ones that produce the 295x2 for every other brand as well.
> 
> 
> 
> Do they make Asus' 295x2 cards as well? Then why the heck is it 50% more expensive than Sapphire's?

"Asus Tax"
Quote:


> Originally Posted by *Orivaa*
> 
> New question: I am not at all familiar with water-cooling. I know there's something called a "water block" you can put on your card to get better temps and some such. I've seen a bit about it, but I'm confused. Usually, you'd need a water-cooling setup for this, but since the 295x2 already uses water-cooling, can you just "slap it on"?
> 
> Also, finally, would it be better to place the radiator at the top exhaust or rear exhaust? Thanks ^_^


The stock water cooling on the 295x2 is very different from a custom water cooling loop

The 295x2 uses an "AIO" (all in one cooler) which has the coolant, pump, radiator and tubing included and sealed up.

A custom loop has all of these separate and requires you to purchase water blocks separately. Custom loops always have better cooling, but the AIO is perfectly fine for stock speeds and even a small overclock.

Someone else might be able to explain it better.....mobile sucks


----------



## magicdave26

^^ basically the difference between buying a car, and buying a kit car


----------



## StillClock1

Wanted to give credit to someone on this forum who suggested pulling air in through the stock radiator. I just installed a push/pull setup bringing in air from the back of the tower and my temps are much better (probably 5°C). Now I just need to make sure the tower is pushing the hot air out elsewhere.


----------



## pompss

White Ares 3


----------



## xer0h0ur

So you went from complaining about the card to selling it to buying another one?


----------



## Orivaa

Quote:


> Originally Posted by *pompss*
> 
> White Ares 3


You painted it? You should have left the letters and icon in black. Would've looked so badass.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> So you went from complaining about the card to selling it to buying another one?


Believe it or not, this one is working much better than the 295x2, which gave me so much trouble.

This one runs flawlessly!! Asus did a really good job. The only thing is that there are no red LED lights, so I had to build and add two red LED strips to light up the Ares logo.


----------



## pompss

Quote:


> Originally Posted by *Orivaa*
> 
> You painted it? You should have left the letters and icon in black. Would've looked so badass.


I couldn't leave the letters black since my build will have red and white colors. Also, there is no way to paint the letters a different color and the plate white.
The letters are too small.


----------



## Syceo

Well look who it is lol, you gotta be crapping me... it's pompss lol. I swear the last time I saw you in here you were bashing the hell out of the 295x2, or was that just me.......


----------



## DividebyZERO

I think the more interesting question is: didn't he buy two GTX 980s? Or am I mistaken?

Nice card btw, I definitely like the looks of the Ares; I wonder how well it performs.


----------



## Syceo

Quote:


> Originally Posted by *DividebyZERO*
> 
> I think the more interesting questions is, didn't he buy 2 980gtx's? Or am i mistaken?
> 
> Nice card btw, i def like the looks of the Ares, i wonder how well it performs.


The card is lovely, and I would love one in an ITX or small-form case. However... £1200 doesn't sit well with me for a cleaned-up 295x2. Would be nice to see some benches though. I still have my 295x2 but I'm not 100% sure what to do with it:

Sell it and pay the difference on an Ares III, or keep it and do a small build.


----------



## DividebyZERO

Quote:


> Originally Posted by *Syceo*
> 
> The card is lovley, would love one in an ITX or small form case, however.. £1200 doesn't sit well with me for a cleaned up 295x2. Would be nice to see some benches though . I still have my 295x2 but not 100% sure what to do with it
> 
> Sell it and pay the difference on an Ares III or keep it and do a small build


I thought the Ares 3 has more input power, 3x 8-pin?


----------



## Syceo

Quote:


> Originally Posted by *DividebyZERO*
> 
> I thought Ares 3 has more input power, 3x8pin?


Yup, it has 3x 8-pin.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> The card is lovley, would love one in an ITX or small form case, however.. £1200 doesn't sit well with me for a cleaned up 295x2. Would be nice to see some benches though . I still have my 295x2 but not 100% sure what to do with it
> 
> Sell it and pay the difference on an Ares III or keep it and do a small build


Keep it and do a small build. The Ares is not going to give you anything that a regular 295X2 with a waterblock can't already do for you.
IMO, it's not worth it to spend even more for a product you already have, just because Asus added more "bells and whistles" and other marketing mumbo jumbo. The last time they put out an ares product, the price tag did not justify the performance "gained" from using an Ares vs a regular card with a water block.
If you really want more performance, just flash the 295x2 with the Sapphire OC bios. The master and slave files are already in this forum (somewhere, just find it). That'll bump you up to 1030 for the cores and 1300 for the memory (Ares 3 is 1030/1250).

Water cooled, the 295x2 never goes above 60°C (that's my temp WITH an overclock of 1061/1400), so the Ares 3 having an 85°C throttle limit doesn't really do much.

Then again, it's up to you. I just think it's a waste of money since you already have a 295x2. Feed those kids! Lol


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Keep it and do a small build. The Ares is not going to give you anything that a regular 295X2 with a waterblock can't already do for you.
> IMO, it's not worth it to spend even more for a product you already have, just because Asus added more "bells and whistles" and other marketing mumbo jumbo. The last time they put out an ares product, the price tag did not justify the performance "gained" from using an Ares vs a regular card with a water block.
> If you really want more performance, just flash the 295x2 with the Sapphire OC bios. The master and slave files are already in this forum (somewhere, just find it). That'll bump you up to 1030 for the cores and 1300 for the memory (Ares 3 is 1030/1250).
> 
> Water cooled, the 295x2 never goes above 65c, so the Ares 3 having an 85c throttle limit doesn't really do much.
> 
> Then again, it's up to you. I just think it's a waste of money since you already have a 295x2. Feed those kids! Lol


Haha, what's up LJ. Oh yeah, those kids (nearly forgot about them). I guess that makes sense; gonna look for a case and a decent micro mobo, then I'll see what's what...


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Haha whats up Lj, oh yeah those kids ( nearly forgot about them ) , I guess that makes sense , gonna look for a case and a decent micro mobo, then i'll see whats what...


I would wait 'til Christmas. There's a case I'm eyeing, the Phanteks Enthoo Mini XL. It's a Micro ATX case which can also hold a Mini ITX board (2 systems in 1).
If the price is right, I'll sell my Maximus VII Hero, get a Maximus VII Gene, and build in that case.
The water cooling potential in that case is EPIC. We'll see though.

For me, I'm most likely gonna stick with what I have and upgrade my pump from a D5 Vario to the Swiftech MCP50X. It should provide more flow, which will result in better cooling for my system.


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> I would wait til Christmas. there's a case i'm eyeing which is the Phanteks Enthoo Mini XL. It's a Micro ATX which can also hold a Mini ITX (2 systems in 1).
> If the price is right, I'll sell my Maximus VII Hero and get a Maximus VII Gene and build in that case.
> The water cooling potential is EPIC in that case. We'll see though.
> 
> For me, I'm most likely gonna stick with what I have and upgrade my pump from a D5 vario to the Swiftech MCP50x. Should provide more flow which will result in better cooling for my system.


Good choice of case. Geez, you could easily do tri-fire in that; from what I can see it's got enough room for 2x 240 rads and 1x 360, all in that smaller form factor. That's some serious cooling options, and it would pretty much allow for silent operation. Might have to hold off till Xmas... any idea when it might fully come to market?


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Good choice of case, geeze could easily do a trifire in that , from what i can see its got enough room for 2x 240 rads and 1x 360 all in that smaller form, thats some serious cooling options. That would pretty much allow for a silent operation. Might have to hold off till xmas... any idea when it might fully come to market?


I heard rumors of Christmas. I don't like the idea of having a motherboard and a case lying around, though. I think it'll be better for me to just upgrade the pump. Maybe if I feel rich, I'll spring for the Enthoo Primo and go 360+240+120 radiators (since I have a spare 120 lying around).


----------



## electro2u

New Trifire bridge is... tubing. Such a pain connecting GPU terminals that don't match up.
Will sell the 295x2 soon.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> 
> 
> New Trifire bridge is... tubing. Such a pain connecting GPU terminals that don't match up.
> Will sell the 295x2 soon.


Don't despair lol, use one of these:



Bitspower G 1/4" Matte Black Triple Rotary 90 Degree G 1/4" Adapter

It will take some jiggling... but it should fit.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> Don't despair lol, use one of these:
> 
> Bitspower G 1/4" Matte Black Triple Rotary 90 Degree G 1/4" Adapter
> 
> It will take some jiggling... but it should fit.


You mean like this?


I'm the guy that came up with that









This X79 setup has more space between cards.


----------



## pompss

Quote:


> Originally Posted by *ljreyl*
> 
> Keep it and do a small build. The Ares is not going to give you anything that a regular 295X2 with a waterblock can't already do for you.
> IMO, it's not worth it to spend even more for a product you already have, just because Asus added more "bells and whistles" and other marketing mumbo jumbo. The last time they put out an ares product, the price tag did not justify the performance "gained" from using an Ares vs a regular card with a water block.
> If you really want more performance, just flash the 295x2 with the Sapphire OC bios. The master and slave files are already in this forum (somewhere, just find it). That'll bump you up to 1030 for the cores and 1300 for the memory (Ares 3 is 1030/1250).
> 
> Water cooled, the 295x2 never goes above 60c (that's my temp WITH an overclock of 1061/1400), so the Ares 3 having an 85c throttle limit doesn't really do much.
> 
> Then again, it's up to you. I just think it's a waste of money since you already have a 295x2. Feed those kids! Lol


The Ares 3 is a different card in terms of performance, stability and overclocking.
The regular 295x2 has two 8-pins and the Ares 3 has three, so I'll leave the conclusion to you.
Better components than the R9 295x2 and better overclocking.
But yes, you don't upgrade from a 295x2 to an Ares if you already have a waterblock. I needed it for my build design, that's why.
I love the Ares 3 design, so I gave the 295x2 another chance. And it's the ASUS ROG brand, which I LOVE.


----------



## electro2u

Quote:


> Originally Posted by *pompss*
> 
> The ares 3 its a different card in terms of performance, stability and overclock.
> Since the regular 295x2 have two 8 pin and the ares 3 have 3x 8pin. I leave to you the conclusion .
> Better components then the r9 295x2 and better overclock.


Does it get different drivers?


----------



## pompss

I installed the 14.9 drivers.


----------



## Orivaa

What is the difference between Sapphire 295x2 and Sapphire 295x2 OC? Any improved components, or just different BIOS? If the latter, can the OC BIOS be installed on the non-OC version and still function fine, even with stock cooling?


----------



## xer0h0ur

Unless you are buying something like a Devil 13 or an Ares 3, any card you buy is the exact same hardware; the only differences are the BIOS and the warranty offered on the card. So in other words, buy the card that gives you the best warranty, or the one that doesn't void it when you put a waterblock on it.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Unless you are buying something like a Devil 13 or an Ares 3 then any card you buy is the exact same hardware with only difference being the BIOS and the warranty being offered on the card. So in other words buy the card that gives you the best warranty or doesn't void your warranty when you put a waterblock on it.


Eh, it seems my only options are Sapphire and Club3D. I was originally gonna go with Sapphire, but Club3D costs slightly less, and while I'm unsure about whether or not to do a full water-cooling setup in the future (looks way too complicated for me, bruh), I wanna unlock the VRM fan speed, so I'm gonna have to pry it open, which would void Sapphire's warranty.

Regardless, it seems like I'll have to wait 'til Christmas before I get the card. My family would _kill_ me if I spent all my money on a graphics card and not on presents.
Ugh, the wait is absolutely terrible.


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> Eh, it seems my only options are Sapphire and 3D Club. Was originally gonna go with Sapphire, but 3D Club costs slightly less, and while I'm unsure about whether or not to do a full water-cooling setup in the future (Looks way too complicated for me, bruh), I wanna unlock the VRM fan speed, so I'm gonna have to pry it open, which would void Sapphire's warranty.
> 
> Regardless, though, it seems like I'll have to wait 'till Christmas before I get the card. My family would _kill_ me if I spent all my money on a graphics card and not on presents.
> Ugh, the wait is absolutely terrible.


Since all you're doing to manually control the VRM fan is opening the metal shell covering the Asetek pumps, I can assure you not a single company would figure out you did it. There are no seals or stickers to show you did so, so unless you flat out told them, your warranty will still be honored in the event of a hardware failure. It only gets more complicated when you waterblock the card; that's when you have to save all of the original thermal pads for the PLX chip, VRMs, MOSFETs, vRAM and such.


----------



## joeh4384

Quote:


> Originally Posted by *Orivaa*
> 
> What is the difference between Sapphire 295x2 and Sapphire 295x2 OC? Any improved components, or just different BIOS? If the latter, can the OC BIOS be installed on the non-OC version and still function fine, even with stock cooling?


I flashed the OC BIOS to my XFX card. You just need both the slave and master BIOSes.
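For anyone attempting the same: the 295X2 enumerates as two adapters, and each one needs its matching ROM. A minimal dry-run sketch of the two-step flash with the usual atiflash command-line tool; the adapter indices and ROM filenames here are placeholders, not the actual Sapphire file names:

```shell
#!/bin/sh
# Dual-GPU cards like the 295X2 show up as two adapters; the master
# and slave BIOSes must each go to the matching adapter index.
MASTER_ROM="sapphire_oc_master.rom"   # placeholder filename
SLAVE_ROM="sapphire_oc_slave.rom"     # placeholder filename

# Build the commands and print them instead of executing them --
# flashing the wrong ROM to the wrong adapter can brick the card.
FLASH_MASTER="atiflash -p 0 $MASTER_ROM"   # adapter 0 = master GPU (assumption)
FLASH_SLAVE="atiflash -p 1 $SLAVE_ROM"     # adapter 1 = slave GPU (assumption)

echo "$FLASH_MASTER"
echo "$FLASH_SLAVE"
```

In practice you'd run the two atiflash commands directly and reboot afterwards; back up both existing BIOSes first (atiflash has a save switch for this) and double-check which adapter index is the master before flashing for real.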


----------



## crystaldark

I am having serious issues. Sometimes when I change resolutions or resume from standby my computer restarts. No BSOD (so no minidump), no warning or diagnostic information whatsoever. It is driving me crazy.

Here's my setup:

ASUS Rampage IV Black Edition
Intel 4960X
2x 295x2
2x 1200w Superflower Leadex Platinum
16gb Corsair Dominator Platinum
2x Dell Ultrasharp 4K (UP2414Q)

I think it probably has to do with the 4K monitors as the way 4K is implemented is a total hack. They give me a lot of issues beyond possibly causing my computer to restart.

I just reinstalled Windows and installed every conceivable driver and I'm still having the same problems.


----------



## electro2u

Quote:


> Originally Posted by *crystaldark*
> 
> I am having serious issues. Sometimes when I change resolutions or resume from standby my computer restarts. No BSOD (so no minidump), no warning or diagnostic information whatsoever. It is driving me crazy.
> 
> Here's my setup:
> 
> ASUS Rampage IV Black Edition
> Intel 4960X
> 2x 295x2
> 2x 1200w Superflower Leadex Platinum
> 16gb Corsair Dominator Platinum
> 2x Dell Ultrasharp 4K (UP2414Q)
> 
> I think it probably has to do with the 4K monitors as the way 4K is implemented is a total hack. They give me a lot of issues beyond possibly causing my computer to restart.
> 
> I just reinstalled Windows and installed every conceivable driver and I'm still having the same problems.


Is it possible it has to do with motherboard settings? Have you had the setup working without this issue using different monitors? I have been messing with my R4BE a lot lately and have been getting similar restarts. I think I've narrowed it down to the VCCSA mode (I was using offset), which I've now changed to Auto, and the restarts seem to have stopped.


----------



## crystaldark

Quote:


> Originally Posted by *electro2u*
> 
> Is it possible it has to do with mb settings? Have you had the setup working without this issue using different monitors? I have been messing with my r4be a lot lately and have been getting similar restarts. I think I have it narrowed down to vccsa mode (was using offset) which I've changed now to just auto and seems to have stopped occurring.


Thanks for the reply. I am going to turn off DP 1.2 on the monitors and run them at 1080P so they act like normal monitors. If I still have the issue I will start looking at the motherboard. I haven't changed any options in the BIOS besides setting the SATA controller to RAID.


----------



## axiumone

It's the motherboard. I can stake my pinky on it. There are a ton of people that have issues with the R4BE. I had the exact same set up as you, except for the monitor. The system would randomly reboot for no reason, even at stock settings.

Take a look at the official R4BE thread, there are lots and lots of suggestions to get the board stable, as it's incredibly finicky.


----------



## electro2u

Quote:


> Originally Posted by *axiumone*
> 
> It's the motherboard. I can stake my pinky on it. There are a ton of people that have issues with the R4BE. I had the exact same set up as you, except for the monitor. The system would randomly reboot for no reason, even at stock settings.
> 
> Take a look at the official R4BE thread, there are lots and lots of suggestions to get the board stable, as it's incredibly finicky.


Quote:


> Originally Posted by *crystaldark*
> 
> Thanks for the reply. I am going to turn off DP 1.2 on the monitors and run them at 1080P so they act like normal monitors. If I still have the issue I will start looking at the motherboard. I haven't changed any options in the BIOS besides setting the SATA controller to RAID.


One more thing I wanted to mention:
My 295x2 has what MSI reps called a "hybrid" BIOS. This means it will work without the Compatibility Support Module in UEFI mode on Win7 or 8.1, but also that it works *better* with CSM turned on (again, according to the MSI reps).
It seems they were right about the CSM option being better. This goes for the Sapphire OC BIOS too, which is what I'm currently using.
It *works* with CSM turned off, but weird things happen... like restarts (I couldn't really tell where they were coming from, but they seem to be gone after this and the VCCSA auto change I made). Edit: it was CPU PLL voltage; I forgot to set it after BIOS flashing.
AND I am now positive it's causing 1080p videos to look strange in Chrome (for me). With CSM on, my 1080p video streams on Chrome from YouTube look pixelated and strange, while the same videos look fine at 720p.
I turn CSM off, and the videos look great at 1080p.
Weird.
The 295x2 is weird.
Take this all with a grain of salt, because I'm using a weird monitor setup too: I run 120Hz on a Korean OC monitor @1440p using a pixel clock patcher, which breaks the HDCP handshake, plus some other strange issues that I live with.


----------



## Zerothaught

I am planning on building a case here soon. Do you guys think it would be more beneficial to have an intake fan blowing on the VRM fan, or an exhaust fan expelling the air from the case? I planned on having two fans blowing on the card, then two exhaust fans off to the side pulling hot air from the case. They would be about 1.5 feet away, though.


----------



## xer0h0ur

Neither. The best option is opening the metal shell, disconnecting the VRM fan from the PCB, connecting it to an open PWM fan header on your motherboard or fan controller and manually controlling it yourself.

Other than that, turning the radiator into a push/pull configuration if you already haven't is the other thing to do.


----------



## Zerothaught

Quote:


> Originally Posted by *xer0h0ur*
> 
> Neither. The best option is opening the metal shell, disconnecting the VRM fan from the PCB, connecting it to an open PWM fan header on your motherboard or fan controller and manually controlling it yourself.
> 
> Other than that, turning the radiator into a push/pull configuration if you already haven't is the other thing to do.


I actually have my radiator set up in push/pull, but I wasn't sure if having a fan blowing directly into the fan on the card would be better or worse than just setting up an exhaust directly under the fan.


----------



## xer0h0ur

Quote:


> Originally Posted by *Zerothaught*
> 
> I actually have my radiator set up in push/pull, but I wasn't sure if having a fan blowing directly into the fan on the card would be better or worse than just setting up an exhaust directly under the fan.


Not sure if I am misunderstanding you, but why would you put a fan directly below, robbing airflow from a fan that is trying to suck air in towards the PCB?


----------



## electro2u

Quote:


> Originally Posted by *Zerothaught*
> 
> I actually have my radiator set up in push/pull, but I wasn't sure if having a fan blowing directly into the fan on the card would be better or worse than just setting up an exhaust directly under the fan.


Intakes toward the card will be much more beneficial than exhaust. I've tried it both ways, because it wasn't obvious to me either. Side-panel fans help the VRMs a ton. People would use them a lot more, except they want to show off their components through windowed panels.

Particularly with a fan at the bottom of the case, that intake will bring in the coldest (and dirtiest) air.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Unless you are buying something like a Devil 13 or an Ares 3 then any card you buy is the exact same hardware with only difference being the BIOS and the warranty being offered on the card. So in other words buy the card that gives you the best warranty or doesn't void your warranty when you put a waterblock on it.


Wait, so the only difference between the Sapphire 295x2 and the Sapphire 295x2 OC is the BIOS? Then why is it so much more expensive??

Also, is it possible to replace either the radiator fan or VRM fan? If so, would it be a good idea to replace them, and what fans would be best to use instead?


----------



## electro2u

Quote:


> Originally Posted by *Orivaa*
> 
> Wait, so the only difference between the Sapphire 295X2 and the Sapphire 295X2 OC is the BIOS? Then why is it so much more expensive??




Quote:


> Also, is it possible to replace either the radiator fan or VRM fan? If so, would it be a good idea to replace them, and what fans would be best to use instead?


Yes, but I'd probably leave the VRM fan alone.
Any high static pressure fan would be good on the radiator. Push/pull helps a little.


----------



## Orivaa

Quote:


> Originally Posted by *electro2u*
> 
> 
> Yes, but I'd probably leave the VRM fan alone.
> Any high static pressure fan would be good on the radiator. Push/pull helps a little.


But the radiator and such are fine when it comes to temps. It's the VRMs that need tending, so I just wanted to know what one could do besides unlocking the fan speed. (Maybe Corsair had some kind of super VRM fan or something.)

Also, dat suitcase.


----------



## pompss

The competition is open. If you guys like mine or other users' builds, go vote. We need more votes.

Thanks

http://www.overclock.net/t/1517172/ocn-mod-of-the-month-october-2014-professional-class-nominations/50#post_23074548


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> Wait, so the only difference between the Sapphire 295X2 and the Sapphire 295X2 OC is the BIOS? Then why is it so much more expensive??
> 
> Also, is it possible to replace either the radiator fan or VRM fan? If so, would it be a good idea to replace them, and what fans would be best to use instead?


A shiny aluminum case


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> A shiny aluminum case


To be fair, it _is_ shiny.


----------



## shadow85

So do people still opt for the 295X2 over, say, two GTX 980s these days?


----------



## rdr09

Quote:


> Originally Posted by *shadow85*
> 
> So do people still opt for the 295X2 over, say, two GTX 980s these days?


what do you currently have?


----------



## xer0h0ur

Quote:


> Originally Posted by *shadow85*
> 
> So do people still opt for the 295X2 over, say, two GTX 980s these days?


Well, until Nvidia stops milking the 900 series and releases a dual-GPU version, this card is still the best dual-GPU card money can buy.


----------



## shadow85

Quote:


> Originally Posted by *rdr09*
> 
> what do you currently have?


Well, I have ordered two 980s, but I always did love the design of the 295X2. The only thing that put me off was the power consumption; otherwise I would have definitely grabbed one by now.

But I was just wondering whether people buy the 295X2 over two GTX 980s.

Other than that I don't think there is anything wrong with either.


----------



## kaltek2599

Has anyone managed to get a 295X2 + 290X trifire setup working with the latest Catalyst drivers? I had BSODs and system slowness with Catalyst 14.9 upon release and rolled back to 14.4, which runs well. Has anyone been able to get 14.9 or its newer beta variants to work without issue?


----------



## electro2u

Yeah, I'm using 14.9 with a 295X2 + 290X


----------



## Sgt Bilko

14.9 here with a 295x2 + 2 x r9 290's


----------



## xer0h0ur

14.9 and 14.9.1 were giving me all sorts of headaches. BSODs like crazy. I finally gave 14.9.2 a shot a few days ago, and so far I have not experienced any BSODs, or crashes for that matter. I did notice some light texture flickering in Skyrim, but nothing big enough to make me revert to 14.7 RC3 again. However, I avoid crossfire at all costs in DX9 games because I get stuttering.


----------



## ImperialOne

I had issues with trifire 295x2 + 290x in Star Trek Online, weird yes, in 14.9.1 and 14.9.2 so went back to 14.7


----------



## pompss

Secondary card deactivated.
Can't find the CrossFire option anywhere. Any ideas??


----------



## rdr09

Quote:


> Originally Posted by *shadow85*
> 
> Well, I have ordered two 980s, but I always did love the design of the 295X2. The only thing that put me off was the power consumption; otherwise I would have definitely grabbed one by now.
> 
> But I was just wondering whether people buy the 295X2 over two GTX 980s.
> 
> Other than that I don't think there is anything wrong with either.


i am not in a position to compare the two, but i think you will be pleased with those beasts. and you are right, if you are concerned about power consumption, then yeah, you made the right move.

Quote:


> Originally Posted by *xer0h0ur*
> 
> 14.9 and 14.9.1 were giving me all sorts of headaches. BSODs like crazy. I finally gave 14.9.2 a shot a few days ago, and so far I have not experienced any BSODs, or crashes for that matter. I did notice some light texture flickering in Skyrim, but nothing big enough to make me revert to 14.7 RC3 again. However, I avoid crossfire at all costs in DX9 games because I get stuttering.


currently using 14.9.2 and played about 3 hours of skyrim yesterday to investigate the sound crackling issue one member is having, and found none. that, and the gameplay was quite smooth. it is as if i was using one card, which i should be, since i am on 1080p. i just want to know if i'll suffer the same sound issue. using 2 290s, btw.


----------



## xer0h0ur

I don't know if the sound issue would be evident while using a USB headset as is how I play but at least in my 8 hour Skyrim binge yesterday I didn't have any sound issues.


----------



## rdr09

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know if the sound issue would be evident while using a USB headset as is how I play but at least in my 8 hour Skyrim binge yesterday I didn't have any sound issues.


i'm using on board audio. tried headphones and 5.1.


----------



## kaltek2599

Interesting, no issues at all? I couldn't get anywhere until I used DDU and rolled back to 14.4. I have a Sapphire 295X2 and a PowerColor 290X. Can't imagine the brands would cause an issue, though.


----------



## kaltek2599

That is really a quad crossfire setup, isn't it? However, given that 4-way is more finicky, it leads me to believe that it perhaps has to do with some other interactions and not the drivers in and of themselves. FWIW, I performed a clean install of Windows 8.1 prior to starting with 14.9, so I would have assumed minimal overlap of other junk... ah well... very YMMV, it appears. Oddly enough, this is the first time in 13 years that I've had AMD driver issues; guess my luck had to run out.


----------



## kaltek2599

This sounds closer to my experience; hoping 14.9.2 (third time) is the charm.

I'll try it and post back based on what I find. Thanks all for responding!


----------



## Aarent

http://www.overclock.net/t/1521115/cooler-master-130-pandora-2-0-295x2-build-log


----------



## pompss

Does anyone have good bench results for a single R9 295X2 in Valley and 3DMark?
I need them to compare with my Asus Ares 3 and GTX 780 Ti HOF V20.


----------



## joeh4384

Quote:


> Originally Posted by *shadow85*
> 
> So do people still opt for the 295X2 over, say, two GTX 980s these days?


Quote:


> Originally Posted by *pompss*
> 
> Does anyone have good bench results for a single R9 295X2 in Valley and 3DMark?
> I need them to compare with my Asus Ares 3 and GTX 780 Ti HOF V20.


Stock clocks with the OC bios.

http://www.3dmark.com/fs/3097001


----------



## pompss

Thanks, but I need a single 295X2 bench. Yours is dual R9 295X2.


----------



## joeh4384

I only have 1 295x2. Are you trying to get a bench using one of the GPUs?


----------



## pompss

OK, my mistake, I thought it was two R9 295X2s.
I'm trying to compare the Ares 3 with the GTX 780 Ti and R9 295X2.
Thanks for the screenshot.


----------



## StillClock1

Hey guys, apologies for the totally noob question here: I would like to add the R9 295x2 Owners Club to my signature but I can't figure out how - how did you guys do it?


----------



## xer0h0ur

I guess you need to be added to the club by the OP. Try PMing him, as I don't think he's keeping up with this thread anymore.


----------



## yifeng3007

Guys, is it just me, or can the new COD: Advanced Warfare not be played on two R9 295X2s? The game gets so laggy that the character can't even move, lol. However, if I turn off CrossFireX, which means the game now utilizes 2 GPUs instead of 4 (even though it only shows one R9 200 series GPU in the settings menu), the game gets so smooth!
I've also noticed that when I turn on all my monitors and set the resolution to 5760x1080, the image looks stretched, but if I turn off two monitors and leave only one working (1920x1080), the image looks normal. Does this mean the game just stretches the FHD resolution to fit 5760x1080, or is there something wrong on my end?
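A quick way to see why the image looks stretched: the three-monitor group is three times wider than a single 16:9 panel, so a game that renders a plain 16:9 frame and scales it across the whole group distorts everything horizontally. A minimal Python sketch (the `aspect` helper is just for illustration, not from any tool mentioned here):

```python
from math import gcd

def aspect(w, h):
    """Reduce a resolution to its simplest aspect-ratio string."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

print(aspect(1920, 1080))  # 16:9 - a single 1080p panel
print(aspect(5760, 1080))  # 16:3 - three panels side by side
```

A 16:9 frame stretched onto a 16:3 canvas is widened by a factor of three, which matches the distortion described above; games with proper Eyefinity support instead render with the wider field of view natively.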


----------



## boredmug

I don't own the game or two 295X2s, so I can't comment on the freezing, but for the stretched-out look in Eyefinity, see if the program Widescreen Fixer has a profile for Advanced Warfare yet. I use it for a lot of games that stretch weirdly like that in Eyefinity 5760x1080.
Quote:


> Originally Posted by *yifeng3007*
> 
> Guys, is it just me, or can the new COD: Advanced Warfare not be played on two R9 295X2s? The game gets so laggy that the character can't even move, lol. However, if I turn off CrossFireX, which means the game now utilizes 2 GPUs instead of 4 (even though it only shows one R9 200 series GPU in the settings menu), the game gets so smooth!
> I've also noticed that when I turn on all my monitors and set the resolution to 5760x1080, the image looks stretched, but if I turn off two monitors and leave only one working (1920x1080), the image looks normal. Does this mean the game just stretches the FHD resolution to fit 5760x1080, or is there something wrong on my end?


----------



## axiumone

No surprise there. CoD has used the same engine for every game so far, and not ONE of them has ever supported Eyefinity. There was a tool made by someone on wsgf.com for CoD: Ghosts that would fix that issue. It's called Widescreen Fixer; I don't think the tool supports the new CoD yet, though.

Also, there's no crossfire profile for the game, so it won't perform smoothly until that's created.


----------



## xer0h0ur

I'm also fairly certain that disabling crossfire will only use one GPU, not two. At least that's how tri-fire has worked with crossfire disabled.


----------



## jelly4ish

Can I ask what driver you are using?


----------



## magicdave26

Selling my 295X2 after 2 weeks of owning it. Driver/crossfire support is non-existent; no point in having a GPU like this when you can only use half of it in 70% of games.


----------



## xer0h0ur

Seems like you made an ill-informed impulse buy if you were expecting the vast majority of games to have support for crossfire or sli for that matter. I jumped into this headfirst expecting issues so I am not surprised.

Have you even tried manually creating crossfire profiles within the CCC for the 70% of games you're experiencing issues with?


----------



## yifeng3007

Quote:


> Originally Posted by *boredmug*
> 
> I don't own the game or two 295X2s, so I can't comment on the freezing, but for the stretched-out look in Eyefinity, see if the program Widescreen Fixer has a profile for Advanced Warfare yet. I use it for a lot of games that stretch weirdly like that in Eyefinity 5760x1080.


Quote:


> Originally Posted by *axiumone*
> 
> No surprise there. CoD has used the same engine for every game so far, and not ONE of them has ever supported Eyefinity. There was a tool made by someone on wsgf.com for CoD: Ghosts that would fix that issue. It's called Widescreen Fixer; I don't think the tool supports the new CoD yet, though.
> 
> Also, there's no crossfire profile for the game, so it won't perform smoothly until that's created.


I really can't understand this. Is it so hard to add Eyefinity support after all these years, especially considering the same engine is used for those games?
I'll try that Widescreen Fixer somewhere in the near future; I just really don't like the idea of needing a third-party program to play the game properly >:l
Quote:


> Originally Posted by *xer0h0ur*
> 
> I'm also fairly certain that disabling crossfire will only use one GPU not two. At least this has been how tri-fire works with crossfire disabled.


This is probably what is going on with my build now. Funny that this crazy amount of power is pretty much useless in this game.
Quote:


> Originally Posted by *jelly4ish*
> 
> Can I ask what driver you are using?


I'm using the 14.9 driver, i guess, since this is what CCC shows me.


----------



## Santho

I'm getting like 40fps in Call of Duty: Advanced Warfare with an R9 295X2 ;_; I know it does not support CFX, but shouldn't I get a little bit more :c ?


----------



## DividebyZERO

Quote:


> Originally Posted by *Santho*
> 
> I'm getting like 40fps in Call of Duty: Advanced Warfare with an R9 295X2 ;_; I know it does not support CFX, but shouldn't I get a little bit more :c ?


what resolution?


----------



## jelly4ish

How do you set the card to single-GPU use for games that have issues with dual GPUs?


----------



## Santho

Quote:


> Originally Posted by *DividebyZERO*
> 
> what resolution?


1080p...


----------



## boredmug

Quote:


> Originally Posted by *yifeng3007*
> 
> I really can't understand this. Is it so hard to add Eyefinity support after all these years, especially considering the same engine is used for those games?
> I'll try that Widescreen Fixer somewhere in the near future; I just really don't like the idea of needing a third-party program to play the game properly >:l
> This is probably what is going on with my build now. Funny that this crazy amount of power is pretty much useless in this game.
> I'm using the 14.9 driver, i guess, since this is what CCC shows me.


This is exactly why I vowed to never buy another game in the Call of Duty franchise after Ghosts. Crossfire STILL does not work properly in that game and NEVER will. They care very little about PC players; it's all about the consoles, because that is where they make the money. Widescreen Fixer is a nice app. If you are new to Eyefinity, get used to using it. LOTS of games do not correctly scale to the resolutions we are playing at. The latest game I've been using it for is Middle-earth: Shadow of Mordor. Without it you only get the middle screen plus about a fifth of the side monitors.

I also find it pretty asinine that developers make games with such huge requirements that often tax even high-end cards, yet fail to implement proper support for multi-GPU systems.


----------



## DividebyZERO

Quote:


> Originally Posted by *boredmug*
> 
> This is exactly why I vowed to never buy another game in the Call of Duty franchise after Ghosts. Crossfire STILL does not work properly in that game and NEVER will. They care very little about PC players; it's all about the consoles, because that is where they make the money. Widescreen Fixer is a nice app. If you are new to Eyefinity, get used to using it. LOTS of games do not correctly scale to the resolutions we are playing at. The latest game I've been using it for is Middle-earth: Shadow of Mordor. Without it you only get the middle screen plus about a fifth of the side monitors.
> 
> I also find it pretty asinine that developers make games with such huge requirements that often tax even high-end cards, yet fail to implement proper support for multi-GPU systems.


What's more depressing is that I thought things would potentially get better with the new game consoles.
Now it's even clearer that it's probably going to get worse. New consoles can barely manage [email protected], and that's becoming the standard set in games. Not even talking about VRAM issues also becoming a pain.


----------



## jelly4ish

Quote:


> Originally Posted by *boredmug*
> 
> This is exactly why I vowed to never buy another game in the Call of Duty franchise after Ghosts. Crossfire STILL does not work properly in that game and NEVER will. They care very little about PC players; it's all about the consoles, because that is where they make the money. Widescreen Fixer is a nice app. If you are new to Eyefinity, get used to using it. LOTS of games do not correctly scale to the resolutions we are playing at. The latest game I've been using it for is Middle-earth: Shadow of Mordor. Without it you only get the middle screen plus about a fifth of the side monitors.
> 
> I also find it pretty asinine that developers make games with such huge requirements that often tax even high-end cards, yet fail to implement proper support for multi-GPU systems.


I'm having serious issues playing Shadow of Mordor on this card, even at 1080p. How did you resolve the menu bugs and flickering objects in the background?


----------



## boredmug

Quote:


> Originally Posted by *jelly4ish*
> 
> I'm having serious issues playing Shadow of Mordor on this card, even at 1080p. How did you resolve the menu bugs and flickering objects in the background?


Well, I don't have a 295X2; I'm currently running two 7950s in crossfire. The latest beta drivers cleared up the menu bugs, but I still have some flickering objects in the background. It has to do with crossfire in that game.

But AT LEAST the GPUs scale correctly in Shadow of Mordor. That's good for me, because my old 7950s struggle to hit 60fps at 5760x1080. I would love a 295X2, but I'll probably go with 2x 290Xs. Can't justify the single card unless I was going to go tri-fire with another 290X.


----------



## xer0h0ur

Quote:


> Originally Posted by *jelly4ish*
> 
> How do you set the card to single-GPU use for games that have issues with dual GPUs?


By manually creating a crossfire profile for the game within CCC, then selecting the profile and scrolling to the bottom to pick Disabled for the crossfire setting. This is if you want it specifically for that given game; otherwise, just disable crossfire altogether in CCC when you go to play that game and re-enable it afterwards.


----------



## magicdave26

Quote:


> Originally Posted by *xer0h0ur*
> 
> Seems like you made an ill-informed impulse buy if you were expecting the vast majority of games to have support for crossfire or sli for that matter. I jumped into this headfirst expecting issues so I am not surprised.
> 
> Have you even tried manually creating crossfire profiles within the CCC for the 70% of games you're experiencing issues with?


No, I knew exactly how bad CrossFire was when I got it - I just hoped that since the last time I tried CF, drivers/profiles had somewhat matured, but nope, exactly the same as they were around 6 years back with my last CF setup

Yep, I've tried all sorts to get CF working, and when it works it's amazing, loads of power, but that's basically in benchmarks and, if you're lucky, the odd game - for the rest I have to create profiles and disable it - not a lot of point considering I upgraded from a 290 only to be stuck with a 290X most of the time

I'd prefer a bit of cash in my pocket and a decent single GPU again, so I won't be going CF again - my brother has 780 Ti SLI and reports the opposite: just about every game that doesn't work with my CF works fine in SLI

Shame, because the sheer amount of GPU power the 295X2 has could have been good - it's like putting Ford Fiesta wheels and tyres onto a supercar: all the power is there, but it can't put it on the road


----------



## DividebyZERO

Quote:


> Originally Posted by *magicdave26*
> 
> No, I knew exactly how bad CrossFire was when I got it - I just hoped that since the last time I tried CF, drivers/profiles had somewhat matured, but nope, exactly the same as they were around 6 years back with my last CF setup
> 
> Yep, I've tried all sorts to get CF working, and when it works it's amazing, loads of power, but that's basically in benchmarks and, if you're lucky, the odd game - for the rest I have to create profiles and disable it - not a lot of point considering I upgraded from a 290 only to be stuck with a 290X most of the time
> 
> I'd prefer a bit of cash in my pocket and a decent single GPU again, so I won't be going CF again - my brother has 780 Ti SLI and reports the opposite: just about every game that doesn't work with my CF works fine in SLI
> 
> Shame, because the sheer amount of GPU power the 295X2 has could have been good - it's like putting Ford Fiesta wheels and tyres onto a supercar: all the power is there, but it can't put it on the road


Which games doesn't crossfire work in for you? Not trying to debate crossfire/SLI compatibility margins. I do have to say most of the games I've played do work in crossfire; Eyefinity is the harder one to get working most of the time.


----------



## xer0h0ur

Quote:


> Originally Posted by *magicdave26*
> 
> No, I knew exactly how bad CrossFire was when I got it - I just hoped that since the last time I tried CF, drivers/profiles had somewhat matured, but nope, exactly the same as they were around 6 years back with my last CF setup
> 
> Yep, I've tried all sorts to get CF working, and when it works it's amazing, loads of power, but that's basically in benchmarks and, if you're lucky, the odd game - for the rest I have to create profiles and disable it - not a lot of point considering I upgraded from a 290 only to be stuck with a 290X most of the time
> 
> I'd prefer a bit of cash in my pocket and a decent single GPU again, so I won't be going CF again - my brother has 780 Ti SLI and reports the opposite: just about every game that doesn't work with my CF works fine in SLI
> 
> Shame, because the sheer amount of GPU power the 295X2 has could have been good - it's like putting Ford Fiesta wheels and tyres onto a supercar: all the power is there, but it can't put it on the road


Can't fault you there. Out of curiosity, which games were giving you problems?


----------



## magicdave26

Quote:


> Originally Posted by *DividebyZERO*
> 
> Which games doesn't crossfire work in for you? Not trying to debate crossfire/SLI compatibility margins. I do have to say most of the games I've played do work in crossfire; Eyefinity is the harder one to get working most of the time.


Quote:


> Originally Posted by *xer0h0ur*
> 
> Can't fault you there. Out of curiosity, which games were giving you problems?


Off hand: Watch Dogs, The Evil Within, COD: AW (although they promise a CF driver/profile is coming), Ryse: Son of Rome, Shadow of Mordor (apparently the latest betas fix that, but I haven't tried again since), Crysis 1 refuses to load at all (works fine for others on 8.1 with Nvidia), GTA IV refuses to load but works fine on 8.1 with 780 Ti SLI, plus a good few others that I can't remember off hand, as I wiped the machine in a last-ditch attempt to solve problems. I'll update this thread as I come across them, but my card is on a 7-day auction atm with 6 days left, so hopefully I won't be having problems for much longer.


----------



## Orivaa

You using Windows 8?


----------



## magicdave26

Quote:


> Originally Posted by *Orivaa*
> 
> You using Windows 8?


8.1 Update 1 x64, yeah.

Tried compatibility mode etc. for the games that just crash. Pretty sure it's down to AMD drivers; with Crysis 1 I get DX errors, and even trying the -dx9 flag gives me a DX9 error instead of 10:

Fault Module Name: CryRenderD3D10.dll

Fault Module Name: CryRenderD3D9.dll


----------



## Orivaa

Betting 10 bucks that's the problem.


----------



## magicdave26

Quote:


> Originally Posted by *Orivaa*
> 
> Betting 10 bucks that's the problem.


That what's the problem? AMD drivers?

8.1 is not the problem, because everyone else I know with Nvidia/Intel setups can run the same games fine under the same OS.


----------



## Orivaa

You could still try down-grading to Windows 7 and see if it makes any difference.


----------



## magicdave26

Nah, I'm not downgrading my entire OS for the sake of Crysis 1 and GTA IV lol. I know they work on 7 anyway and I know they work on 8.1; they just refuse to work for me with 8.1 / AMD drivers. Not even sure how far back I'd have to go with drivers to get them working again; pretty sure I had at least GTA IV working with my 290 and 14.4, while Crysis 1 has never worked with any AMD drivers and 8.1 for me. Anyway, I'm not that bothered about those games; I was just listing them as games that are also not working.

The way I see it, when you spend this much money on a PC, you shouldn't be held back and crippled by drivers and profiles. To me it's as ridiculous and annoying as that scene in Bad Boys where the bad guys' money is being eaten by rats. QUOTE: "It's a stupid f***n problem, but it is still a problem"


----------



## Orivaa

Say, since 8GB versions of the 290x are coming out (Or are they already out? Not sure.), is it possible we're getting a 16GB 295x2?


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Orivaa*
> 
> Betting 10 bucks that's the problem.
> 
> 
> 
> That what's the problem? AMD Drivers?
> 
> 8.1 is not the problem, because everyone else I know with nvidia/intel setups can run the same games fine under the same OS
Click to expand...

I just installed Crysis to test it out (installed Win 8.1 last night): DX10, fullscreen at 1440p, and it's working fine. GPU usage on all 3 cards is a bit up and down, but fps is fine, so I'm not worried


Quote:


> Originally Posted by *Orivaa*
> 
> Betting 10 bucks that's the problem.


Would Paypal work for you?


----------



## magicdave26

Well, I've tried reinstalling Crysis multiple times over multiple 8.1 installs, with different AMD GPUs and different AMD drivers, and it always crashes, every single time, before it even loads anything - with the DX error above. Updated DX, updated all the C++ redistributables and other things normally needed installed and updated; can't figure it out if you have a very similar system to mine and yours works.

Just had a thought, though: I am running Eyefinity. I do switch to single-screen mode when games crash and sometimes that sorts it, but not in Crysis' case. If I can be bothered in the next few days I'll disconnect 2 of the LCDs and try it again.

EDIT - Actually, I haven't been running Eyefinity for as long as Crysis has been crashing, so it's not that


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Nah, I'm not downgrading my entire OS for the sake of Crysis 1 and GTA IV lol. I know they work on 7 anyway and I know they work on 8.1; they just refuse to work for me with 8.1 / AMD drivers. Not even sure how far back I'd have to go with drivers to get them working again; pretty sure I had at least GTA IV working with my 290 and 14.4, while Crysis 1 has never worked with any AMD drivers and 8.1 for me. Anyway, I'm not that bothered about those games; I was just listing them as games that are also not working.
> 
> The way I see it, when you spend this much money on a PC, you shouldn't be held back and crippled by drivers and profiles. To me it's as ridiculous and annoying as that scene in Bad Boys where the bad guys' money is being eaten by rats. QUOTE: "It's a stupid f***n problem, but it is still a problem"


Well, GTA IV was an unoptimised piece of crap; same goes for Watch_Dogs imo.

Crysis works well, but it requires raw CPU speed more than anything. I have Ryse: Son of Rome and IIRC crossfire was working on Win7 (haven't got it going atm to test), and the rest of the games you mentioned I either don't have or don't have installed.

And to be fair, with a 23" 1080p monitor you should have gone with a single 290 or 290X; the 295X2 is overkill even for me at 1440p.
Quote:


> Originally Posted by *Orivaa*
> 
> Say, since 8GB versions of the 290x are coming out (Or are they already out? Not sure.), is it possible we're getting a 16GB 295x2?


Very doubtful; that's a lot of VRAM chips they'd need to squeeze onto one PCB


----------



## magicdave26

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, GTA IV was an unoptimised piece of crap; same goes for Watch_Dogs imo.
> 
> Crysis works well, but it requires raw CPU speed more than anything. I have Ryse: Son of Rome and IIRC crossfire was working on Win7 (haven't got it going atm to test), and the rest of the games you mentioned I either don't have or don't have installed.
> 
> And to be fair, with a 23" 1080p monitor you should have gone with a single 290 or 290X; the 295X2 is overkill even for me at 1440p.


3 x 23" LCDs in Eyefinity @ 6048x1080

Ryse gives me negative scaling (lower fps) and stuttering with CF. I don't have any issues with Warhead, just the original Crysis. The CPU is not the issue @ 4.8GHz
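As an aside on the 6048x1080 figure: three 1920-wide panels are only 5760 raw pixels, so the extra width is consistent with bezel compensation, which inserts virtual pixels behind each bezel join so in-game geometry lines up across monitors. A minimal sketch of that arithmetic (the `eyefinity_width` helper and the 144-pixels-per-join value are illustrative assumptions, not figures from this thread):

```python
# Sketch: Eyefinity group width with bezel compensation.
# Virtual pixels are "hidden" behind each bezel join and never drawn.

def eyefinity_width(panels, panel_width, bezel_px_per_join):
    # Total width = physical pixels + virtual pixels per join.
    return panels * panel_width + (panels - 1) * bezel_px_per_join

print(eyefinity_width(3, 1920, 0))    # no compensation: 5760
print(eyefinity_width(3, 1920, 144))  # ~144 px per join: 6048
```

With two joins at roughly 144 virtual pixels each, 5760 + 288 gives the 6048 horizontal resolution reported above.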


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Well, I've tried reinstalling Crysis multiple times over multiple 8.1 installs, with different AMD GPUs and different AMD drivers, and it always crashes, every single time, before it even loads anything - with the DX error above. Updated DX, updated all the C++ redistributables and other things normally needed installed and updated; can't figure it out if you have a very similar system to mine and yours works.
> 
> Just had a thought, though: I am running Eyefinity. I do switch to single-screen mode when games crash and sometimes that sorts it, but not in Crysis' case. If I can be bothered in the next few days I'll disconnect 2 of the LCDs and try it again.
> 
> EDIT - Actually, I haven't been running Eyefinity for as long as Crysis has been crashing, so it's not that


Going to take a stab in the dark and say that you are running a Steam version?

Mine's a retail copy, if that makes any difference?


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well, GTA IV was an unoptimised piece of crap; same goes for Watch_Dogs imo.
> 
> Crysis works well, but it requires raw CPU speed more than anything. I have Ryse: Son of Rome and IIRC crossfire was working on Win7 (haven't got it going atm to test), and the rest of the games you mentioned I either don't have or don't have installed.
> 
> And to be fair, with a 23" 1080p monitor you should have gone with a single 290 or 290X; the 295X2 is overkill even for me at 1440p.
> 
> 
> 
> 3 x 23" LCDs in Eyefinity @ 6048x1080
> 
> Ryse gives me negative scaling (lower fps) and stuttering with CF. I don't have any issues with Warhead, just the original Crysis. The CPU is not the issue @ 4.8GHz
Click to expand...

Ahhh, you should update Rigbuilder to say that









Ryse is on another SSD atm and I'm not pulling my GPUs out to plug them back in tonight; haven't tried Warhead in a very long time


----------



## Orivaa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And to be fair, with a 23" 1080p monitor you should have gone with a single 290 or 290X; the 295X2 is overkill even for me at 1440p.


I use a 1080p monitor and don't plan on doing 4K gaming any time soon, but I also intend to do Let's Plays at 60fps on 1080p, so is it still overkill for me?


----------



## magicdave26

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Going to take a stab in the dark and say that you are running a Steam version?
> 
> Mine's a Retail copy if that makes any differance?


It was a retail key I was given; I contacted Origin/EA and they added it to my Origin account. I tried downloading the ISO and a no-CD before that and got the same crashing

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ahhh, You should update Rigbuilder to say that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ryse is on another SSD atm and i'm not pulling my GPU's out to plug them back in tonight, haven't tried Warhead in a very long time


Yeah, my bad, I'll update it. I wasn't a big fan of Ryse anyway tbh. It's just the sheer number of games you have to mess around with in profiles, that don't work with this and don't work with that, where you have to change this and mod that just to get them running, badly. I've had enough. Like I say, when you spend this much on a PC, you shouldn't have to mess about like this; that's the job of the guys who code the profiles and drivers, not ours


----------



## electro2u

Quote:


> Originally Posted by *magicdave26*
> 
> It was a retail key I was given; I contacted Origin/EA and they added it to my Origin account. I tried downloading the ISO and a no-CD fix before that and got the same crashing.
> Yea, my bad, I'll update it. I wasn't a big fan of Ryse anyway tbh. It's the sheer number of games you have to mess around with in profiles - they don't work with this, don't work with that, and you have to change this and mod that just to get them running, badly. I've had enough. Like I say, when you spend this much on a PC, you shouldn't have to mess about like this; that's the job of the guys who code the profiles and drivers, not ours


SLI/XF is pretty niche. I see the same complaints about SLI on other threads. That said, I find crossfire a pain in the neck with the 295x2 and prefer SLI.


----------



## magicdave26

Quote:


> Originally Posted by *electro2u*
> 
> SLI/XF is pretty niche. I see the same complaints about SLI on other threads. That said, I find crossfire a pain in the neck with the 295x2 and prefer SLI.


Yea, I think I'm gonna stay with a single GPU for a while. I had next to no issues with my 290 other than some Eyefinity mishaps - single screen, single GPU, no problems at all. Start adding more of either and things seem to break very quickly


----------



## magicdave26

Another one to add to the list - Metro: LL REDUX - major stuttering with CF enabled


----------



## DividebyZERO

You're running a very high resolution for just a 295x2. Not saying it isn't capable, but if you're also jacking up settings I don't see you getting very good FPS averages in these titles. Most of the games you mention have issues regardless of GPU brand. I think you're on the right track going for a single GPU, though (if you don't like tweaking) - Eyefinity, Surround and multi-GPU setups require some tinkering to set up and run. My last Nvidia multi-GPU setup was GTX 680s and I wasn't happy with them either.

If you're in portrait Eyefinity: I'm having issues with triple-4K portrait Eyefinity in DX9 titles. I cannot get fullscreen to work, whereas in landscape they work fullscreen. I'm not sure if it's a DX9 or AMD driver issue.


----------



## pompss

Does anyone have an ASRock X99 board with a 295x2?
It seems I'm not able to enable CrossFire because the second GPU of my ARES III is disabled.
Tried everything, including updating the BIOS, but nada. Checked the BIOS settings but I seem to be stuck.


----------



## magicdave26

The 295X2 is more than capable of running LL Redux maxed out in Eyefinity (minus SSAA, obviously). I get over 30fps with CF disabled - I don't get much more than that with it enabled, and I get horrible stuttering.

With the amount of power the 295 has, if CF worked it should be able to max out just about anything at that rez. My 290 could max out the majority of games (if you don't count AA), so having more than twice the power should not have any issues at all.

My Eyefinity is landscape. I tried portrait, but AMD drivers struck again and my machine went nuts trying to run the Valley benchmark - 0-3fps, freezing, near-BSOD BRRRRRR noises constantly. Stuck it back in landscape and it runs fine.

Triple 4K Eyefinity is quite a massive rez - does the GPU even support it?


----------



## DividebyZERO

Quote:


> Originally Posted by *magicdave26*
> 
> The 295X2 is more than capable of running LL Redux maxed out in Eyefinity (minus SSAA, obviously). I get over 30fps with CF disabled - I don't get much more than that with it enabled, and I get horrible stuttering.
> 
> With the amount of power the 295 has, if CF worked it should be able to max out just about anything at that rez. My 290 could max out the majority of games (if you don't count AA), so having more than twice the power should not have any issues at all.
> 
> My Eyefinity is landscape. I tried portrait, but AMD drivers struck again and my machine went nuts trying to run the Valley benchmark - 0-3fps, freezing, near-BSOD BRRRRRR noises constantly. Stuck it back in landscape and it runs fine.
> 
> Triple 4K Eyefinity is quite a massive rez - does the GPU even support it?





Those are Metro 2033 Redux shots. All the Metro titles have issues running high/max settings for me; I usually run Medium with no AA. The Metro titles ran really badly for me once I passed 5760x1080 in landscape. I wish I had a way to check VRAM, because I think it's a VRAM limit.


----------



## magicdave26

The MSI Afterburner OSD that you have showing in the top left shows VRAM usage if you enable it in settings.

If I drop settings to High with everything else maxed (bar SSAA) I get pretty close to 60fps at 6048x1080.

I can imagine the VRAM running out with 4K Eyefinity though. There are a few titles that max out the 295's 2x4GB @ 1080p - that Evolve beta maxed out all 8GB straight away @ 1080p, but I have a feeling that could have been a memory leak, it being a beta
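The VRAM guesswork above can be sanity-checked with simple arithmetic, since the swap chain alone scales linearly with pixel count. A back-of-the-envelope sketch (my assumptions: RGBA8 at 4 bytes per pixel and triple buffering; textures and render targets dominate real usage, so treat these as lower bounds only):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of the swap chain alone (RGBA8, triple-buffered).
    Ignores textures, G-buffers and AA, which dominate real VRAM use."""
    return width * height * bytes_per_pixel * buffers / 1024**2

# Landscape Eyefinity vs triple-4K portrait Eyefinity
for name, w, h in [("5760x1080", 5760, 1080),
                   ("6048x1080", 6048, 1080),
                   ("6480x3840 (triple 4K portrait)", 6480, 3840)]:
    print(f"{name}: {framebuffer_mb(w, h):.0f} MB")
```

Triple-4K portrait pushes exactly 4x the pixels of 5760x1080, so everything resolution-dependent (AA buffers especially) roughly quadruples - which fits the "I think it's a VRAM limit" hunch.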


----------



## christefan

Did you update the SRES bios? There's a newer bios for the card on kingpin cooling


----------



## electro2u

Quote:


> Originally Posted by *christefan*
> 
> Did you update the SRES bios? There's a newer bios for the card on kingpin cooling


What's SRES?
Can you post the bios link?


----------



## axiumone

I think he meant to say ARES, as in the asus 290x2.

Edit - Yep, it's definitely new bios for ARES. Here's the thread - http://kingpincooling.com/forum/showthread.php?t=2901&highlight=ares


----------



## electro2u

Ooo :blush:


----------



## Blc2uk

This will have been covered in this forum already no doubt, but I have to ask... are both GPUs enabled by default on this beast of a card? I've set up CrossFire profiles for games that have none. The reason I'm asking is that I'm having real problems running Call of Duty: Advanced Warfare at 1080p. I'm sure it's a patch waiting to happen, but any feedback will be appreciated.


----------



## axiumone

Quote:


> Originally Posted by *Blc2uk*
> 
> This will have been covered in this forum already no doubt, but I have to ask... are both GPUs enabled by default on this beast of a card? I've set up CrossFire profiles for games that have none. The reason I'm asking is that I'm having real problems running Call of Duty: Advanced Warfare at 1080p. I'm sure it's a patch waiting to happen, but any feedback will be appreciated.


Both GPUs are enabled by default; you don't need to do anything special. However, if there's no associated CrossFire profile in the drivers, it will default to running on only one GPU.


----------



## magicdave26

Quote:


> Originally Posted by *Blc2uk*
> 
> This will have been covered in this forum already no doubt, but I have to ask... are both GPUs enabled by default on this beast of a card? I've set up CrossFire profiles for games that have none. The reason I'm asking is that I'm having real problems running Call of Duty: Advanced Warfare at 1080p. I'm sure it's a patch waiting to happen, but any feedback will be appreciated.


Don't bother trying to run CoD: AW in CF yet - it doesn't work and gives negative scaling. AMD say they are releasing a driver for it "soon".

But yea, both GPUs are running in CF permanently unless you create a profile for a game and disable it, or run a game that has no CF profile


----------



## DividebyZERO

Selecting another CF profile under CCC for one of the previous COD games doesn't help?
Maybe a little experimenting with each one?


----------



## pompss

Thanks guys, I will give it a shot.
Seems he had the same issues I had


----------



## jelly4ish

I'm getting issues in StarCraft 2 at Ultra settings. It seems to be a lighting/shader issue causing large flickering boxes of light on areas of the map that should be greyed out by the fog of war. I've tried setting a profile for StarCraft in CCC and disabling CF, but I still get the issue; it goes away entirely if I set the graphics to Low, however


----------



## Orivaa

Quote:


> Originally Posted by *jelly4ish*
> 
> I'm getting issues in StarCraft 2 at Ultra settings. It seems to be a lighting/shader issue causing large flickering boxes of light on areas of the map that should be greyed out by the fog of war. I've tried setting a profile for StarCraft in CCC and disabling CF, but I still get the issue; it goes away entirely if I set the graphics to Low, however


I'm not familiar with Starcraft 2, but try messing around with the individual quality settings to find the problem. Should be easier to find a solution from there.


----------



## jelly4ish

I will. It's strange, as I didn't have the issue with a single 290X


----------



## magicdave26

Quote:


> Originally Posted by *jelly4ish*
> 
> I will. It's strange, as I didn't have the issue with a single 290X


Are you sure you're disabling CF properly? If you are, there should be no difference - you're only using one 290X with it disabled


----------



## xer0h0ur

Virtually certain he isn't manually creating a crossfire profile for the game's executable.


----------



## magicdave26

Quote:


> Originally Posted by *xer0h0ur*
> 
> Virtually certain he isn't manually creating a crossfire profile for the game's executable.


Maybe he's making the same mistake I made when I first got mine, thinking the "Enable CF for games that have no profile" option was the global enable/disable CF setting


----------



## xer0h0ur

Yeah, I never bother with the global CCC CrossFire setting. I just manually create a CrossFire profile for games that aren't playing nice in CrossFire and disable it there - assuming AFR Friendly or the other settings don't make it work.


----------



## cennis

Quote:


> Originally Posted by *pompss*
> 
> Does anyone have some good bench results for a single R9 295x2 in Valley and 3DMark?
> I need them to compare with my ASUS ARES III and GTX 780 Ti HOF V20


----------



## jelly4ish

I'm going to CCC > Gaming > 3D Application Settings, selecting the SC2 executable, scrolling to AMD CrossFire, and turning AMD CrossFire mode and frame pacing off. Is this the correct way?


----------



## magicdave26

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I never bother with the global CCC crossfire setting. I just manually create a crossfire profile with games that aren't playing nice in crossfire and disable it. Assuming that AFR Friendly or the other setting don't make it work.


There is no global CF disable for the 295, which is a bit weird because disabling it per game is possible - personally I'd prefer to be able to do it system-wide too

Quote:


> Originally Posted by *jelly4ish*
> 
> I am Going to ccc , gaming, 3d application settings selecting the sc2 executable scrolling to amd crossfire and turning amd crossfire mode and frame pacing off. Is this the correct way?


Yea, that's the right way. If you install MSI Afterburner 4.0.0 - and RTSS, which comes bundled with it - you'll be able to enable an on-screen display that shows the usage and clocks of each GPU. If one of them stays at 300MHz, CF is disabled in that game; if they both ramp up to full clock speeds, it's not disabled
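For what it's worth, the clock-based check described above boils down to a one-line rule. A hypothetical sketch (the helper name and the 300MHz idle figure are taken from this post, not from any AMD API - in practice you'd read the clocks off the Afterburner/RTSS OSD):

```python
IDLE_MHZ = 300  # Hawaii cores sit around 300MHz when a GPU is parked

def crossfire_active(gpu1_clocks, gpu2_clocks, idle=IDLE_MHZ):
    """Given core-clock samples (MHz) for each GPU taken in-game,
    CF is only effectively working if BOTH GPUs ramp above idle."""
    return max(gpu1_clocks) > idle and max(gpu2_clocks) > idle

print(crossfire_active([1018, 1018, 990], [300, 300, 300]))  # False - GPU2 never left idle
print(crossfire_active([1018, 1000, 990], [1018, 300, 980]))  # True - both ramped up
```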


----------



## jelly4ish

Also, I'm running my R9 295x2 with an i5 4670 at stock clocks - will this bottleneck the card? I'm getting frame drops below 30fps in Crysis 3 at max settings.

(Below 30 is a split-second drop, but it's very noticeable as it's such a sharp spike.)

I realise Crysis 3 is a very demanding game, but I would think I'd be able to get a constant 60fps at 1080p with this card? Can some other users chime in with what they average?


----------



## magicdave26

I get a solid 60 with C3 maxed out @ 1080p


----------



## jelly4ish

Quote:


> Originally Posted by *magicdave26*
> 
> I get a solid 60 with C3 maxed out @ 1080p


what's your setup?


----------



## magicdave26

Quote:


> Originally Posted by *jelly4ish*
> 
> what's your setup?


Says at the bottom of each of my posts









Click the small blue arrow


----------



## axiumone

Anyone looking to get rid of their 295x2 waterblocks?


----------



## jelly4ish

Quote:


> Originally Posted by *axiumone*
> 
> Anyone looking to get rid of their 295x2 waterblocks?


I've got one spare - how much?


----------



## jelly4ish

Quote:


> Originally Posted by *magicdave26*
> 
> Says at the bottom of each of my posts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Click the small blue arrow


That's strange - you get better performance with an AMD CPU. Maybe I need to overclock mine from 3.4GHz


----------



## magicdave26

Quote:


> Originally Posted by *jelly4ish*
> 
> That's strange - you get better performance with an AMD CPU. Maybe I need to overclock mine from 3.4GHz


I wouldn't have thought the 4670 would be a bottleneck, but my brother has the same CPU and has his at 4.6 stable, so you've got a lot of OCing headroom for better performance


----------



## jelly4ish

Quote:


> Originally Posted by *magicdave26*
> 
> Wouldn't have thought the 4670 would be a bottleneck, but my brother has the same CPU and has his at 4.6 stable, so you've got a lot of OCing headroom for better performance


I wouldn't have either, but it's the only thing I can think of that would explain the poor framerate in Crysis. It's strange, as I get 100+ frames in the Firestrike GPU test but low 20s in the CPU test


----------



## magicdave26

Quote:


> Originally Posted by *jelly4ish*
> 
> I wouldn't have either, but it's the only thing I can think of that would explain the poor framerate in Crysis. It's strange, as I get 100+ frames in the Firestrike GPU test but low 20s in the CPU test


Yea, you're always gonna see pretty low frames in the Physics/CPU tests, but I'd give OCing it a shot - it's a free upgrade, nothing to lose. Those CPUs are beasts, especially OCd


----------



## electro2u

A 4670K should be getting about 28-38fps in the Firestrike CPU benchmark at 4.6GHz.


----------



## xer0h0ur

Quote:


> Originally Posted by *magicdave26*
> 
> There is no global CF disable for the 295, which is a bit weird because disabling it per game is possible - personally I'd prefer to be able to do it system-wide too.
> Yea, that's the right way. If you install MSI Afterburner 4.0.0 - and RTSS, which comes bundled with it - you'll be able to enable an on-screen display that shows the usage and clocks of each GPU. If one of them stays at 300MHz, CF is disabled in that game; if they both ramp up to full clock speeds, it's not disabled.


Oh crap, you're right. I forgot I only have that global setting since mine's tri-fired with a 290X. It really makes no sense that this option is only available when mixing a 295X2 with another card.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh crap you're right. I forgot I only have that global setting since its tri-fired with a 290X. It really makes no sense that this option is only available when mixing a 295X2 with another card.


I'm not terribly familiar with CrossFire, but since the 295x2 is just two 290Xs, couldn't one just use the global settings for 290X CrossFire? Like download it or something?


----------



## xer0h0ur

The problem is that the CCC CrossFire enable/disable option is only there if you have a 295X2 along with another card. I never saw it as a listed option when I only had the 295X2, but once I tri-fired it with the 290X it prompted setting it up during driver installation. It's pretty dumb that the driver options aren't the same with a 295X2 as with dual 290Xs.


----------



## magicdave26

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh crap you're right. I forgot I only have that global setting since its tri-fired with a 290X. It really makes no sense that this option is only available when mixing a 295X2 with another card.


I know, especially when AMD have to know just how many games need CF disabled to run properly.

Another one to add to the list is Alien: Isolation - although it does run with CF, and frames are good, there is continuous micro-stutter the whole time. Really nasty on such a polished and optimised game that runs super smooth with one GPU


----------



## Orivaa

Quote:


> Originally Posted by *magicdave26*
> 
> I know, especially when AMD have to know just how many games need CF disabled to run properly.
> 
> Another one to add to the list is Alien: Isolation - although it does run with CF, and frames are good, there is continuous micro-stutter the whole time. Really nasty on such a polished and optimised game that runs super smooth with one GPU


You mean Crossfire as in 1 295x2, right?
Speaking of that, I have a question. If one wishes to record gameplay and the game doesn't play well with Crossfire, is it possible to set 1 GPU to play the game, and 1 to record the gameplay?


----------



## magicdave26

Quote:


> Originally Posted by *Orivaa*
> 
> You mean Crossfire as in 1 295x2, right?
> Speaking of that, I have a question. If one wishes to record gameplay and the game doesn't play well with Crossfire, is it possible to set 1 GPU to play the game, and 1 to record the gameplay?


Yea 1 x 295X2

The GPU doesn't record the gameplay - the PC does that via some third-party program like Afterburner - but yea, you could still record gameplay with one GPU disabled


----------



## Orivaa

Quote:


> Originally Posted by *magicdave26*
> 
> Yea 1 x 295X2
> 
> The GPU doesn't record the gameplay - the PC does that via some 3rd party program like Afterburner - but yea you could still record gameplay with 1 GPU disabled


I was under the impression that the 3rd party program used the GPU to do so.


----------



## magicdave26

Quote:


> Originally Posted by *Orivaa*
> 
> I was under the impression that the 3rd party program used the GPU to do so.


Well, the GPU renders the game, so Afterburner may well grab its images from there - unsure - but the GPU does not record anything; the software does that.

But if you have the second GPU disabled, it's not rendering the game, so it couldn't be used to grab the video/images - it's idle when disabled


----------



## Orivaa

Quote:


> Originally Posted by *magicdave26*
> 
> Well, the GPU renders the game, so Afterburner may well grab its images from there - unsure - but the GPU does not record anything; the software does that.
> 
> But if you have the second GPU disabled, it's not rendering the game, so it couldn't be used to grab the video/images - it's idle when disabled


Just because you have disabled CrossFire for that specific game, it doesn't mean that the second GPU becomes unusable, right? Couldn't it be doing some other task? Like if you were rendering a video in Vegas (albeit at very low priority so it doesn't ruin your gaming experience)?

To be fair, I've just read in a bunch of places that one needs a powerful GPU to record games. I need to upgrade my GPU regardless, so I'm still gonna buy the 295x2, but it'd be nice if somewhere with clear instructions could be found. But this is not the place for that, so I'll leave it be.


----------



## magicdave26

I've never looked into it, but AFAIK Afterburner uses the CPU to do its tasks, not the GPU - but I'm not the dev.

You don't need a powerful GPU to record games; I've been doing it since a few GPUs ago, with cards way less powerful than this 295.

Not sure why you'd want to render something in Vegas while playing a game though, but each to their own


----------



## Orivaa

Quote:


> Originally Posted by *magicdave26*
> 
> I've never looked into it, but AFAIK Afterburner uses the CPU to do its tasks, not the GPU - but I'm not the dev.
> 
> You don't need a powerful GPU to record games; I've been doing it since a few GPUs ago, with cards way less powerful than this 295.
> 
> Not sure why you'd want to render something in Vegas while playing a game though, but each to their own


Lol, I wouldn't. I'm just considering the possibility of having the other GPU being used for something else while gaming.

Also, it seems AMD GVR uses the GPU to record games, so maybe that one could use the idle GPU?


----------



## magicdave26

I don't use Raptr/GVR, not a fan of it at all. Afterburner does everything I need - screenshots, videos, overclocking, an OSD for just about everything your hardware is doing - and no annoying bloat.

No doubt you could probably get the other GPU doing something else with enough know-how/coding, so long as the game does not lock the GPU out from other apps, but I've never read about it or felt the need to try it.

I guess you could Bitcoin mine on one and game on the other, but I have a feeling something would go wrong somewhere along the line.

Personally I use my GPU for gaming, that's it - and when I'm not gaming, browsing


----------



## Orivaa

Finally ordered my 295x2 from Amazon!
It's a Sapphire, so I'll cross my fingers and hope it won't be defective.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Orivaa*
> 
> Finally ordered my 295x2 from Amazon!
> It's a Sapphire, so I'll cross my fingers and hope it won't be defective.


All the 295x2's are made by Sapphire


----------



## Orivaa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> All the 295x2's are made by Sapphire


I'm well aware that they produce all the reference cards, but still.
Who knows, maybe the other brands pay them to make their own cards slightly higher quality.
But as I said, I just hope it's not defective.


----------



## jelly4ish

I'm getting huge frame drops in the 3DMark Firestrike test. It's around the high 100s-130 for most of the graphics test but drops to the low 30s towards the middle of the test. Also, when Firestrike renders the full animation at the start of the test, it gets incredibly framey and jittery at the end of the sequence as the camera backs out (like 4fps). Anyone know what could be causing this? I'm running the 14.9 stable driver with an i5 4670 at stock clocks, 8GB of DDR3-1333 RAM and Windows 7 Ultimate


----------



## axiumone

Sounds like overheating somewhere. Either core or vrm. Are you monitoring your temps?


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> I'm getting huge frame drops in the 3DMark Firestrike test. It's around the high 100s-130 for most of the graphics test but drops to the low 30s towards the middle of the test. Also, when Firestrike renders the full animation at the start of the test, it gets incredibly framey and jittery at the end of the sequence as the camera backs out (like 4fps). Anyone know what could be causing this? I'm running the 14.9 stable driver with an i5 4670 at stock clocks, 8GB of DDR3-1333 RAM and Windows 7 Ultimate


Quote:


> Originally Posted by *axiumone*
> 
> Sounds like overheating somewhere. Either core or vrm. Are you monitoring your temps?


I have the same issue when I use an R9 290 with the 295x2

Do a full DDU wipe and reinstall the drivers.

Its not an overheating issue in my case, I know that much


----------



## jelly4ish

Quote:


> Originally Posted by *axiumone*
> 
> Sounds like overheating somewhere. Either core or vrm. Are you monitoring your temps?


Quote:


> Originally Posted by *Sgt Bilko*
> 
> I have the same issue when I use an R9 290 with the 295x2
> 
> Do a full DDU wipe and reinstall the drivers.
> 
> Its not an overheating issue in my case, I know that much


It's running at around 40 idle and goes up to 74 max under load - are these normal operational temps? I'm using MSI Afterburner to monitor temps; is there a way to monitor individual component temps such as VRAM? I bought this used with a full waterblock, but had the seller refit the original stock cooling solution before shipping. Is it possible he has incorrectly attached the stock cooler?


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> Quote:
> 
> 
> 
> Originally Posted by *axiumone*
> 
> Sounds like overheating somewhere. Either core or vrm. Are you monitoring your temps?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I have the same issue when I use an R9 290 with the 295x2
> 
> Do a full DDU wipe and reinstall the drivers.
> 
> Its not an overheating issue in my case, I know that much
> 
> 
> It's running at around 40 idle and goes up to 74 max under load - are these normal operational temps? I'm using MSI Afterburner to monitor temps; is there a way to monitor individual component temps such as VRAM? I bought this used with a full waterblock, but had the seller refit the original stock cooling solution before shipping. Is it possible he has incorrectly attached the stock cooler?

The 295x2 will throttle at 75°C on the core(s), so yes, you are throttling. I've got a single Noctua NF-F12 on my rad and it's working quite well for me.

How's your ambient, case airflow and rad setup?


----------



## jelly4ish

I'm not sure of the ambient temp, but it's winter in the UK so pretty bloody cold. I'm using the stock fan on the rad - I hadn't thought to change that; I assumed that since I'd never gone over 74 I wouldn't be throttling, as it wasn't hitting that 75-degree mark? As for case airflow, I had to mount the radiator on the front of my case, making me lose one of my intake fans, so I'm on a single intake at the front, two exhausts on the top and an exhaust at the rear. I don't really see how I can remedy this in the case I have, as my rear exhaust fan is a closed-loop radiator push/pull setup for my CPU, and I don't have clearance for the top exhaust because of the CPU rad and RAM height


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> I'm not sure of the ambient temp, but it's winter in the UK so pretty bloody cold. I'm using the stock fan on the rad - I hadn't thought to change that; I assumed that since I'd never gone over 74 I wouldn't be throttling, as it wasn't hitting that 75-degree mark? As for case airflow, I had to mount the radiator on the front of my case, making me lose one of my intake fans, so I'm on a single intake at the front, two exhausts on the top and an exhaust at the rear. I don't really see how I can remedy this in the case I have, as my rear exhaust fan is a closed-loop radiator push/pull setup for my CPU, and I don't have clearance for the top exhaust because of the CPU rad and RAM height


It throttles the same way the 290/X cards do. In my experience my 290s never hit 95°C, but they did hit 94°C and the core clocks dropped like a rock.

Same thing happens here: the 295x2 cores hit 74°C and they start to throttle.

Get MSI Afterburner if you don't have it already and note the GPU utilization on each core when the temps start to rise.

And with you being in the UK, your ambients should be fine









You can fill out the Rigbuilder and put it up in your sig, might be able to help a bit more if we can all see what you are running
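The "watch the clocks as the temps rise" advice above can be written down as a simple log filter - a sketch only (the 1018MHz rated clock and the sample format are my assumptions; the ~75°C limit is from this thread):

```python
THROTTLE_TEMP_C = 75  # the 295x2 starts pulling clocks around 75C per this thread
RATED_MHZ = 1018      # assumed reference boost clock

def throttle_events(samples, temp_limit=THROTTLE_TEMP_C, rated=RATED_MHZ):
    """samples: (temp_c, core_mhz) pairs logged under load.
    Flags samples where the core fell well below its rated clock
    while sitting at/near the temp limit - thermal throttling,
    as opposed to clocks dropping because the load got lighter."""
    return [(t, mhz) for t, mhz in samples
            if t >= temp_limit - 1 and mhz < rated * 0.95]

log = [(68, 1018), (72, 1018), (74, 940), (75, 860), (73, 1018)]
print(throttle_events(log))  # [(74, 940), (75, 860)]
```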


----------



## jelly4ish

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It throttles the same way as the 290/x cards do, in my experience with my 290's they never hit 95c bt they did hit 94.c and the core clocks dropped like a rock
> 
> Same thing happens here, the 295x2 cores hit 74c they start to throttle.
> 
> get MSI Afterburner if you don't have it already and note the GPU Utilization on each core when the temps start to rise.
> 
> And you being in the UK your ambients should be fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can fill out the Rigbuilder and put it up in your sig, might be able to help a bit more if we can all see what you are running


Thanks, that makes sense. I think what may be happening is that the hot air exhausted from the card is being pulled straight back in, as the front intake/exhaust fans are right next to each other on the front. What's strange is I ran the Firestrike test a few days ago and didn't have this issue at all; it wasn't until I started seeing frame drops in Crysis 3 (down to the low 30s from the high 80s) that I ran Firestrike and noticed the issue.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It throttles the same way as the 290/x cards do, in my experience with my 290's they never hit 95c bt they did hit 94.c and the core clocks dropped like a rock
> 
> Same thing happens here, the 295x2 cores hit 74c they start to throttle.
> 
> get MSI Afterburner if you don't have it already and note the GPU Utilization on each core when the temps start to rise.
> 
> And you being in the UK your ambients should be fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can fill out the Rigbuilder and put it up in your sig, might be able to help a bit more if we can all see what you are running
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks, that makes sense. I think what may be happening is that the hot air exhausted from the card is being pulled straight back in, as the front intake/exhaust fans are right next to each other on the front. What's strange is I ran the Firestrike test a few days ago and didn't have this issue at all; it wasn't until I started seeing frame drops in Crysis 3 (down to the low 30s from the high 80s) that I ran Firestrike and noticed the issue.

I have mine set up as two front intakes, an H100i with two fans on exhaust, and the 295x2's rad as rear exhaust.

Better to have the rad as exhaust than intake, because otherwise you'll only get the warm air from the rad blowing back towards the card, imo


----------



## jelly4ish

Yeah, that's what I'm thinking, but if I have the other front fan on exhaust as well then I have no intakes and five exhausts. The only other option I have is to switch the top fans over to intakes, but that seems counterproductive due to the whole heat-rises palaver


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> Yeah, that's what I'm thinking, but if I have the other front fan on exhaust as well then I have no intakes and five exhausts. The only other option I have is to switch the top fans over to intakes, but that seems counterproductive due to the whole heat-rises palaver


Heat will only rise if the fans don't push it away first. If you have 5 fans, why not try having the front as intake and the top/rear as exhaust, just to test it out?

A simple test would be to have the rad for the 295x2 outside of your case for an hour or so and see if that solves your temp issues.
If it does, then you have to re-think your fan placement


----------



## jelly4ish

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Heat will only rise if the fans don't push it away first, If you have 5 fans then why not try and have the front as intake and the top/rear as exhaust just to test it out.
> 
> A simple test would be to have the Rad for the 295x2 outside of your case for an hour or so and see if that solves your temps issues.
> If it does then you have to re-think your fan placement


I'll do exactly that tonight, thanks for the advice. Am I right in assuming the best driver to use is 14.9 stable? (AMD drivers > Desktop Graphics > R9 xxx Series > Windows 7)


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Heat will only rise if the fans don't push it away first, If you have 5 fans then why not try and have the front as intake and the top/rear as exhaust just to test it out.
> 
> A simple test would be to have the Rad for the 295x2 outside of your case for an hour or so and see if that solves your temps issues.
> If it does then you have to re-think your fan placement
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll do exactly that tonight, thanks for the advice. Am I right in assuming the best driver to use is 14.9 stable? (AMD drivers > Desktop Graphics > R9 xxx Series > Windows 7)

14.9 WHQL is a very good driver in my experience, but they released a new beta driver today with some CoD: AW improvements as well, if that would suit you better:


Spoiler: 14.11.1 Beta Release Notes



http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-11-1BetaWINReleaseNotes.aspx

Compatible Operating Systems
The latest version of the AMD Catalyst™ Software Suite, AMD Catalyst™ 14.11.1 Beta Driver is designed to support the following Microsoft Windows® platforms:
Windows® 8.1 64-bit
Windows® 7 64-bit version with SP1 or higher
Highlights of AMD Catalyst™ 14.11.1 Windows® Beta Driver

Performance Improvements
Call of Duty®: Advanced Warfare performance optimizations
Assassin's Creed® Unity performance optimizations
Known Issues
[408930]: Occasional stuttering in Assassin's Creed® Unity in CrossFire mode under specific game settings
[409235]: Small chance of intermittent screen tearing or corruption in Call of Duty®: Advanced Warfare on high settings 4K resolution in AMD CrossFire™ mode
[408706]: Quad CrossFire AMD Radeon™ R9 295X2 may sometimes black screen when loading a game in Call of Duty®: Advanced Warfare
[409199]: AMD Radeon™ R9 280X/280 may experience a crash when playing Call of Duty®: Advanced Warfare and a video at the same time. If you experience this issue a work around is turning off the video while playing the game.
[409177]: CrossFire users may experience intermittent flickering in Call of Duty®: Advanced Warfare menus. As a work around if this issue is seen restarting the game may cause the issue to disappear.
Important Notes

AMD is currently working with Activision to resolve the following issues in Call of Duty®: Advanced Warfare:
Users in AMD CrossFire™ mode sometimes experience texture corruption
Negative scaling sometimes being observed at specific resolutions with AMD CrossFire™



There are some notes about quad 295X2 CrossFire in there too, so they are testing that configuration as well


----------



## magicdave26

14.9 WHQL was giving black screens to some people during/after install/first boot; 14.9.1 was supposed to be the fix, but 14.9.2 was the actual fix, IIRC.

I'd go with the latest betas Sgt posted


----------



## jelly4ish

Quote:


> Originally Posted by *magicdave26*
> 
> 14.9 WHQL was giving black screens to some people during/after install/first boot; 14.9.1 was supposed to be the fix, but 14.9.2 was the actual fix, IIRC.
> 
> I'd go with the latest betas Sgt posted


Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.9 WHQL is a very good driver in my experience, they released a new Beta driver today with some CoD: AW improvements as well if that would suit you better?
> 
> 
> Spoiler: 14.11.1 Beta Release Notes
> 
> 
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-11-1BetaWINReleaseNotes.aspx
> 
> Compatible Operating Systems
> The latest version of the AMD Catalyst™ Software Suite, AMD Catalyst™ 14.11.1 Beta Driver is designed to support the following Microsoft Windows® platforms:
> Windows® 8.1 64-bit
> Windows® 7 64-bit version with SP1 or higher
> Highlights of AMD Catalyst™ 14.11.1 Windows® Beta Driver
> 
> Performance Improvements
> Call of Duty®: Advanced Warfare performance optimizations
> Assassin's Creed® Unity performance optimizations
> Known Issues
> [408930]: Occasional stuttering in Assassin's Creed® Unity in CrossFire mode under specific game settings
> [409235]: Small chance of intermittent screen tearing or corruption in Call of Duty®: Advanced Warfare on high settings 4K resolution in AMD CrossFire™ mode
> [408706]: Quad CrossFire AMD Radeon™ R9 295X2 may sometimes black screen when loading a game in Call of Duty®: Advanced Warfare
> [409199]: AMD Radeon™ R9 280X/280 may experience a crash when playing Call of Duty®: Advanced Warfare and a video at the same time. If you experience this issue a work around is turning off the video while playing the game.
> [409177]: CrossFire users may experience intermittent flickering in Call of Duty®: Advanced Warfare menus. As a work around if this issue is seen restarting the game may cause the issue to disappear.
> Important Notes
> 
> AMD is currently working with Activision to resolve the following issues in Call of Duty®: Advanced Warfare:
> Users in AMD CrossFire™ mode sometimes experience texture corruption
> Negative scaling sometimes being observed at specific resolutions with AMD CrossFire™
> 
> 
> 
> There are some notes about quad 295X2 CrossFire in there too, so they are testing that configuration as well


OK, I'll install the latest beta drivers and test with 3DMark with the rad outside the case.

Just to confirm, is the best way to reinstall drivers to uninstall CCC via Add/Remove Programs, delete the CCC folder on the C: drive, and uninstall the drivers via Device Manager, then do a fresh install?


----------



## magicdave26

Quote:


> Originally Posted by *jelly4ish*
> 
> OK, I'll install the latest beta drivers and test with 3DMark with the rad outside the case.
> 
> Just to confirm, is the best way to reinstall drivers to uninstall CCC via Add/Remove Programs, delete the CCC folder on the C: drive, and uninstall the drivers via Device Manager, then do a fresh install?


The recommended method these days is to use DDU in safemode (It will boot you into safemode itself)
http://www.wagnardmobile.com/DDU/

Use the "Recommended" button to remove the old drivers, it will reboot you back into normal mode, and there you can install your new drivers - DDU removes **** loads of leftover traces of the old drivers than the default installer leaves behind


----------



## rakesh27

Guys,

Sorry to take you a little off topic. I have a PowerColor 295X2, core clock 1018MHz. Recently I bought a Sapphire 290X, core clock 1000MHz, and I was wondering whether I'd need a voltage bump (and by how much) if I raise the Sapphire 290X's core clock to match the 295X2's 1018MHz.

You see, I would like to keep my GPUs all at the same core and memory clocks; luckily the memory clocks across all 3 GPUs are the same.

Much appreciated all


----------



## xer0h0ur

14.9 and 14.9.1 were certified hot garbage. Both gave me BSODs and freezing. 14.9.2 was perfectly fine for me, and I just installed the 14.11.1 beta last night and have only played Skyrim on it. For what it's worth, Skyrim ran flawlessly and better than it did on 14.9/14.9.1/14.9.2. I just disable CrossFire for it, though, as that invites issues with flickering textures in the menus and water. I'm used to disabling CrossFire for DX9 games anyway.


----------



## axiumone

With an increase of 18MHz, you may not need any additional voltage at all.


----------



## xer0h0ur

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> Sorry to take you a little off topic. I have a PowerColor 295X2, core clock 1018MHz. Recently I bought a Sapphire 290X, core clock 1000MHz, and I was wondering whether I'd need a voltage bump (and by how much) if I raise the Sapphire 290X's core clock to match the 295X2's 1018MHz.
> 
> You see, I would like to keep my GPUs all at the same core and memory clocks; luckily the memory clocks across all 3 GPUs are the same.
> 
> Much appreciated all


Unless you got the world's worst Hawaii core, you shouldn't need to add voltage until you reach somewhere around 1060MHz or so.


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> 14.9 and 14.9.1 were certified hot garbage. Both gave me BSODs and freezing. 14.9.2 was perfectly fine for me and I just installed the 14.11.1 Beta last night and have only played Skyrim on it. For what its worth Skyrim ran flawlessly and better than it did on the 14.9/14.9.1/14.9.2. I just disable crossfire for it though as that invites issues with flickering textures in the menus and water. I am used to disabling crossfire for DX9 games though.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.9 WHQL is a very good driver in my experience, they released a new Beta driver today with some CoD: AW improvements as well if that would suit you better?
> 
> 
> Spoiler: 14.11.1 Beta Release Notes
> 
> 
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-11-1BetaWINReleaseNotes.aspx
> 
> Compatible Operating Systems
> The latest version of the AMD Catalyst™ Software Suite, AMD Catalyst™ 14.11.1 Beta Driver is designed to support the following Microsoft Windows® platforms:
> Windows® 8.1 64-bit
> Windows® 7 64-bit version with SP1 or higher
> Highlights of AMD Catalyst™ 14.11.1 Windows® Beta Driver
> 
> Performance Improvements
> Call of Duty®: Advanced Warfare performance optimizations
> Assassin's Creed® Unity performance optimizations
> Known Issues
> [408930]: Occasional stuttering in Assassin's Creed® Unity in CrossFire mode under specific game settings
> [409235]: Small chance of intermittent screen tearing or corruption in Call of Duty®: Advanced Warfare on high settings 4K resolution in AMD CrossFire™ mode
> [408706]: Quad CrossFire AMD Radeon™ R9 295X2 may sometimes black screen when loading a game in Call of Duty®: Advanced Warfare
> [409199]: AMD Radeon™ R9 280X/280 may experience a crash when playing Call of Duty®: Advanced Warfare and a video at the same time. If you experience this issue a work around is turning off the video while playing the game.
> [409177]: CrossFire users may experience intermittent flickering in Call of Duty®: Advanced Warfare menus. As a work around if this issue is seen restarting the game may cause the issue to disappear.
> Important Notes
> 
> AMD is currently working with Activision to resolve the following issues in Call of Duty®: Advanced Warfare:
> Users in AMD CrossFire™ mode sometimes experience texture corruption
> Negative scaling sometimes being observed at specific resolutions with AMD CrossFire™
> 
> 
> 
> There are some notes about quad 295X2 CrossFire in there too, so they are testing that configuration as well


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Heat will only rise if the fans don't push it away first, If you have 5 fans then why not try and have the front as intake and the top/rear as exhaust just to test it out.
> 
> A simple test would be to have the Rad for the 295x2 outside of your case for an hour or so and see if that solves your temps issues.
> If it does then you have to re-think your fan placement


I just ran Fire Strike again with the rad out, and the max temp I got was 59, however I still got the frame drops to the low 30's. It's worth mentioning the frame drops and lag happen at the exact same moment every time: during the camera panning out at the end of the whole animation, and as the camera turns the last corner near the lava in the graphics test animation.

Does anyone have any idea what could cause this very specific issue?


----------



## magicdave26

^^ You're not expecting a solid 60 all the way through, are you?


----------



## jelly4ish

Quote:


> Originally Posted by *magicdave26*
> 
> ^^ You're not expecting a solid 60 all the way through are you ?


On the graphics test at 1080p? Yeah, of course; it runs around 130fps, but those drops to 30 and below are very significant. It's also incredibly jittery, like 6fps or less, as the opening animation pans out.

On the combined test I get around 28-30, which makes sense, but the fact it suddenly drops 100 frames (at one of the least graphically intense scenes) and the fact I can't maintain 60fps in Crysis 3 on medium makes it very obvious I've got a serious issue somewhere


----------



## joeh4384

Do your core clocks drop to 300 and rise back up? Are you running in crossfire with a 290?


----------



## xer0h0ur

Two things: list your system specs in your signature, and did you actually run DDU to wipe out the installation properly before re-installing?

Other than those two suggestions, as joeh suggested, are you monitoring the GPU clocks and GPU usage with Afterburner during your Firestrike runs?


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> Two things, list your system specs in your signature. The other is did you actually run DDU to wipe out the installation properly then re-install?
> 
> Other than those two suggestions, as joeh suggested, are you monitoring the gpu clocks and gpu usage with afterburner during your firestrike runs?


I have wiped with DDU and re-installed the recommended beta drivers and i still get the same issue.

here is a link to my stats during the test

http://imgur.com/xa4FxKr,FfBuTqA,NogjsKe#0 - 3 images

I appreciate the help


----------



## joeh4384

Looks like the VRMs are throttling. Do you have any card below the 295? I experienced the same clock drops when I tried crossfiring with an MSI Gaming 290X below my 295


----------



## jelly4ish

I don't have any card below the 295x2


----------



## jelly4ish

Quote:


> Originally Posted by *joeh4384*
> 
> Looks like the vrms are throttling. Do you have any card below the 295? I experienced the same clock drops when I tried crossfiring with a Msi gaming 290x below my 295


I've updated my sig with my components


----------



## xer0h0ur

Yeah, I was more interested in seeing your GPU1/GPU2 core clocks, and by the look of the picture you posted, GPU1 is all over the place. Joeh might be onto something there about your VRMs causing throttling.

Some more questions for you: are you running PCI-E 3.0? Is it running @ 16x? Does your motherboard have the most up-to-date BIOS?


----------



## xer0h0ur

I thought maybe it could be a power supply issue but I highly doubt that considering you have a single 12V rail rated at 83.3A which certainly covers the 50A needed by the 295X2.

Do you have ULPS disabled through Afterburner?


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I was more interested in seeing your GPU1/GPU2 core clocks and by the look of that picture you listed GPU1 is all over the place. Joeh might be onto something there, about your VRM's causing throttling.
> 
> Some more questions for you, are you running PCI-E 3.0? Is it running @ 16x? Does your motherboard have the most up to date BIOS?


You can count out PCIe 2.0 as the issue as it's running fine for me even with a 290 underneath it.


----------



## xer0h0ur

Well this is a screenshot of Afterburner after a Firestrike run on my rig


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I was more interested in seeing your GPU1/GPU2 core clocks and by the look of that picture you listed GPU1 is all over the place. Joeh might be onto something there, about your VRM's causing throttling.
> 
> Some more questions for you, are you running PCI-E 3.0? Is it running @ 16x? Does your motherboard have the most up to date BIOS?


I'm running PCI-E 3.0 at 16x. I assume I'm on the most up-to-date BIOS as I'm running a Haswell Refresh CPU, but it's something I'll check when I get back from work.

As I said in a previous post, I bought this card used with a custom water block and asked the seller to re-attach the stock cooler. I'm thinking he may have incorrectly attached it and the fan isn't cooling the VRAM properly (although it looks correct)


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well this is a screenshot of Afterburner after a Firestrike run on my rig


I'm assuming when running the test you don't see these steep drops to 30ish fps and the stuttering at the end of the opening sequence etc?


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> I thought maybe it could be a power supply issue but I highly doubt that considering you have a single 12V rail rated at 83.3A which certainly covers the 50A needed by the 295X2.
> 
> Do you have ULPS disabled through Afterburner?


I've got one of these spare, http://www.coolermaster.com/powersupply/silent-pro-hybrid/silent-pro-hybrid-1050w/ , so I can try testing it with that.

But as you said, the specs of the RM1000 should be more than enough


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> I thought maybe it could be a power supply issue but I highly doubt that considering you have a single 12V rail rated at 83.3A which certainly covers the 50A needed by the 295X2.
> 
> Do you have ULPS disabled through Afterburner?


Sorry, I don't know what ULPS is. Could you explain, so I get a better understanding of your problem solving?

I will have a look for the option to disable it


----------



## Sgt Bilko

Quote:


> Originally Posted by *jelly4ish*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I thought maybe it could be a power supply issue but I highly doubt that considering you have a single 12V rail rated at 83.3A which certainly covers the 50A needed by the 295X2.
> 
> Do you have ULPS disabled through Afterburner?
> 
> 
> 
> Sorry, I don't know what ULPS is. Could you explain, so I get a better understanding of your problem solving?
> 
> I will have a look for the option to disable it
Click to expand...

MSI Afterburner options:
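If you'd rather not rely on Afterburner, ULPS can also be switched off directly in the registry by zeroing the `EnableUlps` value under each display-adapter instance key. Treat this as a sketch, not gospel: the instance numbers (0000, 0001, ...) vary per machine, so check each one under the display class key before editing, back up the registry first, and reboot afterwards. Afterburner's checkbox does the same thing under the hood.

```
Windows Registry Editor Version 5.00

; Display-adapter class key; repeat for each GPU instance (0000, 0001, ...)
; that actually contains an EnableUlps value on your system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```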


----------



## xer0h0ur

Well, I can tell you for sure that if he didn't keep all of the original vRAM, VRM, PLX chip etc. pads, then using the pads from his waterblock wouldn't fit properly with the 295X2's Asetek cooler. I know because I had to mount my EKWB, then remove it since the card wouldn't POST, and I had thrown away the original pads. It turned out I had excess solder making direct contact with the nickel finish of the block, so 30 minutes of filing away at it and a third block re-mount later, I was all good. In that process I had to re-mount the original cooler and found that the original pads were not all the same thickness.

It's certainly worth checking, and you might as well slap on whatever TIM you want on your GPUs while you're at it, since you have no idea what he used either. Keeping the PLX chip cooled is also important.


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I can tell you for sure that if he didn't keep all of the original vRAM, VRM, PLX chip etc. pads then using the pads from his waterblock wouldn't fit properly while using the 295X2's Asetek cooler. I know because I had to mount my EKWB then remove it since the card wouldn't POST and I had thrown away original pads. Turned out I had excess solder making direct contact with the nickel finish of the block so 30 minutes of filing away at it and a 3rd block re-mount later I was all good. In that process I had to re-mount the original cooler and found that the original pads were not all same thicknesses.
> 
> Its certainly worth checking and you might as well slap on whatever TIM you want on your GPU's while you're at it since you have no idea what he used either. Keeping the PLX chip cooled is also important.


I've quoted your post to the guy I bought it from, and he basically disagrees, saying I should just use the water block he provided to sort out any heat issues (not the answer I wanted).

The problem with that solution is my case is the Corsair CC-9011050 SPEC-01; I can't see how I'm going to fit a res and pump into it (let alone a full rad). Also:

- I have no idea how to mount the card to the water block effectively (I'm not a fan of taking hardware apart, as I'm not very handy)
- I already have a closed-loop cooler for my CPU, and the costs of water cooling seem prohibitive just to cool one card

Not really sure how to remedy this, hopefully disabling ULPS will be the magic bullet


----------



## ocvn

New problem with quad-fire 295X2s on 14.9. When I run a single 4K screen, everything is fine; the cores always max at 1030/1300, temps 54-56 in Heaven 4.0. However, when I run triple 4K, the core clocks fluctuate between 500-1000 on all 4 cores and the frames are horrible. Anyone have any idea?


----------



## xer0h0ur

Granted I am after all talking about an EKWB. If he used a different brand of waterblock then I can't speak as to fitment with those pads. Did he say which waterblock he was using before on the card? For instance the Aquacomputer block apparently doesn't make use of vRAM pads so that would affect the thickness of the rest of the pads unless the milling was approximately 0.5mm taller for the contact points on the block for the vRAM. The vRAM pads on the original Asetek cooler and the EKWB are same thickness 0.5mm. From there on it changes. The VRM/mosfet pads are not the same thicknesses, neither is the pad for the PLX chip in case you choose to use a pad there. My block called for using TIM instead of a pad on the PLX chip. If you can at least find out which waterblock he was using I can compare.


----------



## xer0h0ur

Quote:


> Originally Posted by *ocvn*
> 
> New problem with quad-fire 295x2 14.9. When i run single 4k screen, everything is fine, the core always max 1030/1300, temp 54-56 heaven 4.0. However, when i run triple 4k, the core drop fluently between 500-1000 all 4 cores and the frame are horrible. Anyone has any idea????


Monitor your GPU temperatures, GPU usage and GPU clocks using Afterburner and/or GPU-Z logging to a file. You haven't tried the latest beta driver? The 14.11.1? ULPS is disabled? Are you running on the original Asetek cooler or waterblocked? Do you have proper spacing between both cards or is the top card's VRM fan being choked off from airflow?
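If you do log to a file, a few lines of Python will flag the throttling samples for you instead of squinting at graphs. This is just a sketch: the column headers below (`GPU Core Clock [MHz]`, `GPU Load [%]`) are assumptions and differ between tools and driver versions, so match them to whatever your logger actually writes.

```python
import csv
import io

# Flag samples where the core clock dips below a threshold while GPU load is
# high: clocks dropping under full load is the classic throttling signature,
# whereas low clocks at low load are just normal idle behaviour.
def find_throttle_events(log_file, clock_col="GPU Core Clock [MHz]",
                         load_col="GPU Load [%]", min_clock=900, min_load=90):
    events = []
    for row in csv.DictReader(log_file):
        try:
            clock = float(row[clock_col])
            load = float(row[load_col])
        except (KeyError, ValueError):
            continue  # skip malformed or incomplete lines
        if load >= min_load and clock < min_clock:
            events.append((load, clock))
    return events

# Tiny fabricated log just to show the idea:
sample = io.StringIO(
    "GPU Core Clock [MHz],GPU Load [%]\n"
    "1018,99\n"
    "501,98\n"   # throttled: full load but clocks halved
    "300,2\n"    # idle clocks at low load are normal, not throttling
)
print(find_throttle_events(sample))  # -> [(98.0, 501.0)]
```

Point it at a real Afterburner/GPU-Z CSV export and any hits it returns are the moments worth lining up against your temperature and VRM readings.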


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> Granted I am after all talking about an EKWB. If he used a different brand of waterblock then I can't speak as to fitment with those pads. Did he say which waterblock he was using before on the card? For instance the Aquacomputer block apparently doesn't make use of vRAM pads so that would affect the thickness of the rest of the pads unless the milling was approximately 0.5mm taller for the contact points on the block for the vRAM. The vRAM pads on the original Asetek cooler and the EKWB are same thickness 0.5mm. From there on it changes. The VRM/mosfet pads are not the same thicknesses, neither is the pad for the PLX chip in case you choose to use a pad there. My block called for using TIM instead of a pad on the PLX chip. If you can at least find out which waterblock he was using I can compare.


this one http://www.ebay.co.uk/itm/XSPC-Razor-R9-295X2-VGA-Full-Cover-Waterblock-Black-/321555659681

thanks for your help


----------



## jelly4ish

I've disabled ULPS and still get the same issues


----------



## xer0h0ur

Quote:


> Originally Posted by *jelly4ish*
> 
> this one http://www.ebay.co.uk/itm/XSPC-Razor-R9-295X2-VGA-Full-Cover-Waterblock-Black-/321555659681
> 
> thanks for your help


Okay, there are differences. That block uses only 1mm pads on both sides of the PCB:

http://static.squarespace.com/static/51998404e4b0ef02d1bd9c2c/t/537aac80e4b0f8eee612e826/1400548480713/r9-295x2.pdf

http://static.squarespace.com/static/51998404e4b0ef02d1bd9c2c/t/537aac91e4b0f8eee612e840/1400548497025/r9-295x2-backplate.pdf

That wouldn't be a problem for the backside/backplate, but with 1mm pads on the vRAM, the Asetek cooler likely wouldn't make good contact with the VRMs. The reason I say that is that the 1mm pads are 0.5mm thicker than the stock 0.5mm vRAM pads, raising the cooler by 0.5mm and in turn leaving a gap between the VRM cooling surface and the VRM pads.

I would suggest at least replacing the VRM pads with 1.5mm pads and also covering the center section (with 1.5mm pads as well) that the XSPC block leaves uncovered. This is the section I was referencing. The top diagram is the EK block, bottom is the XSPC block. Anywhere on the top diagram with a number 2 or 3 on it would require 1.5mm pads if you kept the 1mm pads from the XSPC block everywhere else.



NOTE: Everything I am talking about is assuming that he just flat out slapped on the Asetek cooler using the pads that were on the PCB with this waterblock. If he did re-install the Asetek cooler along with its original pads then all of this is a non-issue. Problem is will he be honest with you if you ask.


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> Okay, there are differences. That block uses only 1mm pads on both sides of the PCB:
> 
> http://static.squarespace.com/static/51998404e4b0ef02d1bd9c2c/t/537aac80e4b0f8eee612e826/1400548480713/r9-295x2.pdf
> 
> http://static.squarespace.com/static/51998404e4b0ef02d1bd9c2c/t/537aac91e4b0f8eee612e840/1400548497025/r9-295x2-backplate.pdf
> 
> That wouldn't be a problem for the backside/backplate but it likely wouldn't properly make good contact with the VRMs on the Asetek cooler using 1mm pads there on the vRAM. The reason I say that is because the 1mm pads on the vRAM are 0.5mm thicker therefore raising the cooler 0.5mm and in turn leaving a gap between the VRM's cooling and the VRM pads.
> 
> I would suggest at least replacing the VRM pads with 1.5mm pads and also covering the center section (with 1.5mm pads as well) that the XSPC block leaves uncovered. This is the section I was referencing. The top diagram is the EK block, bottom is the XSPC block. Anywhere on the top diagram with a number 2 or 3 on it would require 1.5mm pads if you kept the 1mm pads from the XSPC block everywhere else.
> 
> 
> 
> NOTE: Everything I am talking about is assuming that he just flat out slapped on the Asetek cooler using the pads that were on the PCB with this waterblock. If he did re-install the Asetek cooler along with its original pads then all of this is a non-issue. Problem is will he be honest with you if you ask.


I've sent your response to him and asked if that's the case. He seems honest.

How would I go about buying replacement pads of the correct size?


----------



## xer0h0ur

I bought my pads from frozencpu.com. I used the Fujipoly Ultra Extreme pads all around, but it's overkill, as really only the number 2's and 3's on that top diagram would benefit from it. As for removing the Asetek cooler, I suggest doing so completely cold. I made the mistake of running my card before removal, and my original pads shredded themselves, so to speak, as if the layers came apart. You get a better, cleaner separation when the card is totally cooled down. You're also going to have to clean and remove the TIM from the GPUs and re-apply TIM on your re-mount; however, I suggest you dry test it first. In other words, put the cooler back on after you have changed the pads, without applying TIM to the GPUs, so you can verify contact on the VRM pads, since they get marked up by solid contact. Once you've confirmed you're getting good contact on the pads, apply TIM to the GPUs and re-mount.

Edit: I almost forgot, you definitely want to make sure that your PLX chip's pad is making solid contact as well. Some people have experienced issues because of their PLX chip overheating.


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> I've sent your response to him and asked if that's the case. He seems honest.
> 
> How would I go about buying replacement pads of the correct size?


OK, well thanks for pointing out the fact that you consider me honest, because I am. I had no idea it required 1.5mm on the VRMs; I simply applied "NEW" pads, which I purchased from OCUK, across the board. The pad thickness was 1mm. Had I known it required 1.5mm there, I surely would have applied it; after all, who wants a bad seller rating and returned items... not me.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> I bought my pads from frozencpu.com. I used the Fujipoly Ultra Extreme pads all around but its overkill as really only the number 2's and 3's on that top diagram would benefit from it. As for removing the Asetek cooler, I suggest doing so completely cold. I made the mistake of running my card before removal and my original pads shredded themselves so to speak as if layers came apart. You get a better, cleaner separation when the card is totally cooled down. You're also going to have to clean and remove the TIM from the GPUs and re-apply TIM on your re-mount however I suggest you dry test it so to speak. In other words put the cooler back on after you have changed the pads without applying TIM to the GPU's so you can verify contact on the VRM pads since they get marked up with solid contact. Once you've confirmed you're getting good contact on the pads then apply TIM to the GPUs and re-mount.
> 
> Edit: I almost forgot, you definitely want to make sure that your PLX chip's pad is making solid contact as well. Some people had experienced issues because of their PLX chip overheating.


How are you doing Xer0h0ur ....

The PLX has the same 1mm pad on it; I applied MX4 to the GPUs


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> I've sent your response to him and asked if that's the case. He seems honest.
> 
> How would I go about buying replacement pads of the correct size?


Don't worry about buying the pads...

As a gesture of goodwill, I'll order some 1.5mm pads and have them delivered straight to you. You should have them Friday if I put in the order over at OCUK tomorrow


----------



## xer0h0ur

Not bad, dreading having to drain my loop since it's been about 6 months on the same coolant lol. I have been extremely lazy lately; I have a brand spanking new CPU waterblock sitting there collecting dust lol. I need to kill two birds with one stone and just get it done. Gotta order some more EK clear coolant. After that I still need to pay some stuff off so I can extend the loop with an external rad on a rear mounting bracket and finally drop the matching EK block on the 290X.

How you been Syceo? How are those 980's treating you?


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Not bad, dreading having to drain my loop since its been about 6 months on the same coolant lol. I have been extremely lazy lately, have a brand spanking new CPU waterblock sitting there collecting dust lol. I need to kill two birds with one stone and just get it done. Gotta order some more EK clear coolant. After that I still need to pay some stuff off so I can extend the loop with an external rad on a rear mounting bracket and finally drop the matching EK block on the 290X.


Geezus, are you still tinkering with your rig... what block are you adding now?

You might have to sell an organ or something to get it done quicker lol. Coming up to Xmas, I'm now doing a mini build with that leftover 290X from the trifire setup... so here we go again with this expensive hobby. pomms tried selling me his Ares III, guess he got bored of it.. surprise surprise.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> Geezus are you still tinkering with your rig... what block are you adding now?
> 
> you might have to sell an organ or something to get it done quicker lol, coming up to xmas im now doing a mini build with that left over 290x from the trifire setup... so here we go again with this expensive hobby, pomms tried selling me his Ares lll , guess he got bored of it .. surprise surprise.


So you don't have the 980s anymore? You went back to the 295?
I'm finalizing my build today, just waiting for my compression fittings.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Not bad, dreading having to drain my loop since its been about 6 months on the same coolant lol. I have been extremely lazy lately, have a brand spanking new CPU waterblock sitting there collecting dust lol. I need to kill two birds with one stone and just get it done. Gotta order some more EK clear coolant. After that I still need to pay some stuff off so I can extend the loop with an external rad on a rear mounting bracket and finally drop the matching EK block on the 290X.
> 
> How you been Syceo? Hows those 980's treating you?


I can't complain really. Not as much power as the trifire, but my gosh the drivers come quick and fast, the setup is virtually silent, and it works well with the ROG Swift, which works for me in this environment. I will, however, be looking at whatever offerings the next AMD cards bring to the table (mainly the 390X, if it ever surfaces)


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> So you don't have the 980s anymore? You went back to the 295?
> I'm finalizing my build today, just waiting for my compression fittings.


What's up Lj lol, are you tinkering as well? You guys must have deeper pockets than me lol. Nope, still got the 980s, but I came real close to buying an Ares. To be honest I miss the insane power of the trifire, but oh well. As mentioned before, I am hoping AMD produce something special on the next run... stacked memory would be nice for starters, 4GB just seems like "just enough" these days.


----------



## xer0h0ur

It's a Koolance CPU-380I. It was pretty much an impulse buy based on an Ivy Bridge waterblock roundup and realizing I am still using a ghetto-equivalent Thermaltake block lol. Still debating with myself whether I want to delid the 4930K or not.


----------



## xer0h0ur

Quote:


> Originally Posted by *Syceo*
> 
> Whats up Lj lol are you tinkering as well, you guys must have deeper pockets than me lol, nope still got the 980's, but i came real close to buying an Ares , to be honest I miss the insane power of the trifire, but oh well, as m,entioned before I am hoping AMD produce something special on the next run... stacked memory would be nice for starters, 4gb just seems like " just enough" these days


Man, don't even get me started with developers these days. It's complete asshattery that suddenly 4GB isn't enough to run Ultra settings with impunity. Now we have uncompressed textures in SoM requiring 6GB!?!? WAT! Not gonna lie, I am also a bit jelly of those new 8GB 290Xs. All I know is they better not make a 16GB 295X2, because I will be like a damned moth to the flame, and perpetually broke for that matter.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its a Koolance CPU-380I. It was pretty much an impulse buy based on an Ivy Bridge waterblock roundup and realizing I am still using a ghetto equivalent Thermaltake block lol. Still debating with myself if I want to delid the 4930K or not.


What's putting you off the delid? I would totally do it on any CPU that underperforms in terms of thermals. I know people say "well, what about the warranty" and so on, but to be honest, if you plan on using it for a few years and you just use the razor blade technique, you can't really go wrong. When you're done with it, slap it on the bay; people will defo grab it. Forget the hammer and vice, you get more control with the blade. I'm just about to purchase a Haswell refresh; my 4770K has been delidded and I shaved off 15°C straight away, but I still can't get past 4.4 comfortably.


----------



## Syceo

Hahahah, I hear you.... xer0h0ur, is that a straight 8GB on the 290X? If so, that's quite an attractive proposition; two of those for 4K would work wonders.


----------



## xer0h0ur

Quote:


> Originally Posted by *Syceo*
> 
> Whats putting you off the de-lid?? i would totally do it on any cpu that under performs in terms of thermals, I know people say well what about the warranty and so on, but to be honest if you plan on using it for a few years and u just use the razor blade technique then cant really go wrong. When your done with it , slap it on the bay, people will defo grab it. Forget the hammer and vice , you get more control with the blade, Im just about to purchase a haswell refresh, my 4770K has been lidded and I shaved off 15c straight , but I still cant get past 4.4 comfortably.


Inexperience and warranty. I have never attempted a delid, and my processor is still under a no-questions-asked swap-out warranty for another 6 months or so.
Quote:


> Originally Posted by *Syceo*
> 
> hahahahah I hear you.... Xer0h0ur, is that a straight 8GB on the 290x ?? if so thats quite an attractive proposition, 2 of those for 4K would work wonders


Yup, those bad boys are only available at the moment here: http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP

Vaguely remember something about other manufacturers also making 8GB models so hopefully those would be available across the pond in my neck of the woods.


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Inexperience and warranty. I have never attempted a delid and my processor is still under no questions asked swap out warranty another 6 months or so.


Well, you clearly have considered it. All I can say is, it's not as daunting as it seems; once you get the blade into the glue it's like cutting cheese... you eat cheese, right lol... do it do it do it...


----------



## Syceo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Inexperience and warranty. I have never attempted a delid and my processor is still under no questions asked swap out warranty another 6 months or so.
> Yup, those bad boys are only available at the moment here: http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP
> 
> Vaguely remember something about other manufacturers also making 8GB models so hopefully those would be available across the pond in my neck of the woods.


Hmm, wonder if there's a block coming out for that; it looks pretty much the same length as a 295X2.


----------



## Syceo

Quote:


> Originally Posted by *Syceo*
> 
> humm wonder if theres a block coming out on that, looks pretty much the same length as a 295x2


Ignore that... there clearly is an EK block for it


----------



## xer0h0ur

Yeah, I am a tinkerer so I will no doubt end up delidding the sucker once the warranty is up. It's going to be like my car: I have like $4,000 worth of aftermarket parts waiting for the 100K-mile extended warranty to expire before I drop a new suspension in my 2007 Charger R/T. I usually maintain my warranties as long as they last. Waterblocking the 295X2 was one of the few times I threw caution to the wind and killed the warranty near immediately.

Oh yeah almost forgot, cheese. I love cheese. As long as its not goat cheese bring it!


----------



## ocvn

Quote:


> Originally Posted by *xer0h0ur*
> 
> Monitor your GPU temperatures, GPU usage and GPU clocks using Afterburner and/or GPU-Z logging to a file. You haven't tried the latest beta driver? The 14.11.1? ULPS is disabled? Are you running on the original Asetek cooler or waterblocked? Do you have proper spacing between both cards or is the top card's VRM fan being choked off from airflow?


Temps are totally fine with two Koolance waterblocks and dual Monsta 480 radiators. When I run a single 4K screen, no problem at all; smooth gameplay. The problem appears when I activate the Eyefinity triple-screen setup. Haven't tried the beta yet but will do it today.


----------



## xer0h0ur

Quote:


> Originally Posted by *ocvn*
> 
> Temp totally fine with 2 Koolance water blocks and dual monsta 480 radiators. When I run with single 4k screens, no problem at all. Smooth game play. The problem appear when I active the eyefinity triple screen. Havent try the beta yet but will do it today.


ULPS is disabled? Do you have proper card spacing so your airflow to the VRM fan isn't choked off from the top card by the bottom card? When you waterblocked your cards did you ensure your pads had proper contact? Particularly the PLX chip and the VRMs.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syceo*
> 
> Whats putting you off the de-lid?? i would totally do it on any cpu that under performs in terms of thermals, I know people say well what about the warranty and so on, but to be honest if you plan on using it for a few years and u just use the razor blade technique then cant really go wrong. When your done with it , slap it on the bay, people will defo grab it. Forget the hammer and vice , you get more control with the blade, Im just about to purchase a haswell refresh, my 4770K has been lidded and I shaved off 15c straight , but I still cant get past 4.4 comfortably.
> 
> 
> 
> Inexperience and warranty. I have never attempted a delid and my processor is still under no questions asked swap out warranty another 6 months or so.
> Quote:
> 
> 
> 
> Originally Posted by *Syceo*
> 
> hahahahah I hear you.... Xer0h0ur, is that a straight 8GB on the 290x ?? if so thats quite an attractive proposition, 2 of those for 4K would work wonders
> 
> 
> Yup, those bad boys are only available at the moment here: http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP
> 
> Vaguely remember something about other manufacturers also making 8GB models so hopefully those would be available across the pond in my neck of the woods.

I thought all the i7 "E" chips were soldered and you can't delid them?

I could be wrong but I was always under that impression.

And yeah, AMD gave the all-clear for all AIBs to make 8GB 290Xs. So far it's Sapphire with the Vapor-X and a reference-design card, XFX with the DD, and I think MSI is doing one as well, but I'm not 100% sure.


----------



## jelly4ish

dbl post


----------



## ocvn

Quote:


> Originally Posted by *xer0h0ur*
> 
> ULPS is disabled? Do you have proper card spacing so your airflow to the VRM fan isn't choked off from the top card by the bottom card? When you waterblocked your cards did you ensure your pads had proper contact? Particularly the PLX chip and the VRMs.


ULPS is off. Full-cover waterblocks, so the VRMs are no problem. 4K single display is OK, bro. The problem is from CFX and Eyefinity.


----------



## xer0h0ur

Quote:


> Originally Posted by *ocvn*
> 
> Ulps off. Full cover wb so vrm no problem. 4k single display is ok bro. The problem is from cfx and eyefinity


I know what you're saying, but you have no way of monitoring your VRM temps. If you have improper spacing and even one of the cards is having its VRMs overheat from a lack of airflow while pushing some ridiculous Eyefinity resolution across multiple monitors, then you best believe your VRMs are getting hotter in Eyefinity than just pushing 4K. Overheating VRMs or PLX chips can cause you problems, which is why I kept asking over and over whether you're sure both cards had proper contact with their blocks' pads and were getting good airflow to their VRM fans. You may benefit from doing the VRM fan mod to manually control the PWM fans yourself if you have improper card spacing.

Edit: Derp, I keep forgetting you have waterblocks; never mind about the VRM fan mod. That is obviously for people using the original Asetek cooler.


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> Dont worry about bying the pads...
> 
> as a gesture of goodwill i'll order some 1.5mm and have it delivered straight to you, you should have it Friday , if i put in the order over at OCUK tomorrow


Cheers mate appreciate it


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> Cheers mate appreciate it


Wasn't sure if you got my message this morning. I didn't hear from you, so I didn't order it because you said you would. However, if you haven't placed an order yet I can do it; just let me know either way.


----------



## xer0h0ur

jelly4ish, I never did ask you which version of Firestrike you were talking about. Firestrike, Firestrike Extreme or Firestrike Ultra? I only get framerate drops and issues running Firestrike Ultra.


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> jelly4ish, I never did ask you which version of Firestrike you were talking about. Firestrike, Firestrike Extreme or Firestrike Ultra? I only get framerate drops and issues running Firestrike Ultra.


Vanilla Firestrike, I'm afraid.


----------



## jelly4ish

Quote:


> Originally Posted by *xer0h0ur*
> 
> jelly4ish, I never did ask you which version of Firestrike you were talking about. Firestrike, Firestrike Extreme or Firestrike Ultra? I only get framerate drops and issues running Firestrike Ultra.


Quote:


> Originally Posted by *Syceo*
> 
> was'nt sure if you got my message this morning , didnt here from you so I didnt order it because you said you would, however if you havent placed an order yet I can do it, just let me know either way


Quote:


> Originally Posted by *jelly4ish*
> 
> vanilla firestrike I'm afraid


OK, I've managed to do a Firestrike run without getting the frame drops.

... However, I had to run the test with the side of the case off and a large fan (desk fan) directly channeling air onto the card.

I think this pretty much proves it's an issue with the VRAM overheating, as the card's GPUs don't go above the mid-60s anymore (ambient temp is very cold now in the UK), but I still get the frame drops to the low 30s, which can't be GPU throttling.

Hopefully these new pads will fix the issue, as this card is a rather pricey headache as it stands.
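As a side note, if anyone wants to put numbers on drops like these, a logged frametime file makes it easy. A minimal sketch, assuming a plain list of per-frame render times in milliseconds (Afterburner and similar tools can log something along these lines; the format here is an assumption):

```python
def summarize_drops(frametimes_ms, drop_fps=40.0):
    """Count frames rendered slower than the target FPS.

    frametimes_ms: per-frame render times in milliseconds (assumed
    log format -- one number per frame).
    Returns (drop_count, worst_fps).
    """
    threshold_ms = 1000.0 / drop_fps      # e.g. 25 ms for 40 fps
    drops = sum(1 for t in frametimes_ms if t > threshold_ms)
    worst_fps = 1000.0 / max(frametimes_ms) if frametimes_ms else 0.0
    return drops, worst_fps
```

A run that "drops to the low 30s" would show up as a handful of frames over the threshold and a `worst_fps` in the low 30s, even when the average looks fine.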


----------



## xer0h0ur

It should be the VRMs' pads


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> Ok I'm managed to do a firestrike run without getting the frame drops.
> 
> ... However I had to run the test with the side of the case off with a large fan (desk fan) directly channeling air onto the card.
> 
> I think this pretty much proves it's an issue with the Vram overheating as the card gpu's don't go above the mid 60's anymore (ambient temp is very cold now in the UK) but i still get the frame drops to the low 30's which can't be gpu throttling.
> 
> Hopefully these new pads will fix the issue as this card is a rather pricey headache as it stands


Pads are on the way









Pricey is buying this card at £1000 and then purchasing a waterblock and backplate for an additional £150... you got it for less than £600, plus a waterblock, plus £15 worth of thermal pads... I believe this headache is easily remedied.







just saying.....


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> Pads are on the way
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pricey is buying this card at £1000 and then purchasing a waterblock and back plate for an addition £150....... you got it for less than £600 + a waterblock + £15 worth of thermal pads..... I believe this headache is easily remedied
> 
> 
> 
> 
> 
> 
> 
> just saying.....


yeah man appreciate the support.

If it wasn't for my case I'd just get a res/pump combo and go for full water cooling... but I've promised myself I'm not spending any more on this system.


----------



## ljreyl

Quote:


> Originally Posted by *jelly4ish*
> 
> yeah man appreciate the support.
> 
> If it wasn't for my case I'd just get a res pump combo and go for full water cooling.... but I've promised myself i'm not spending anymore on this system


I haven't been keeping up with this thread -_-

Couple questions for you
1 - Is the card on the stock cooler?
2 - Do you only experience frame drop in 3DMark?
3 - Did you try doing a clean uninstall of drivers then reinstall? (DDU)
4 - Are you monitoring the core clocks with an app such as HWInfo64?


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> yeah man appreciate the support.
> 
> If it wasn't for my case I'd just get a res pump combo and go for full water cooling.... but I've promised myself i'm not spending anymore on this system


Not a problem at all, you will give in eventually lol... resistance is futile... you have a waterblock, so I'm confident at some point you're gonna install it.


----------



## Orivaa

Finally got my 295x2 set up, and boy, was it an experience.


Spoiler: Warning: Spoiler!



It technically arrived yesterday, but since there was no one home to receive it, it went back to the post office. It's a 20-minute walk from my place to the post office, but I had no other means of transportation, so I went on an extended walk in the light rain.
I arrived at the post office and was told that, since no one was there to receive the package, I'd have to wait until tomorrow (today) to get it. So that sucked.
I went down there again today and got the box. It wasn't heavy, but it was rather large, so walking all the way back home was horrible.
At any rate, I got home, turned on my PC so that I could DDU it, and opened my package. While waiting for my PC to boot up, I got everything ready: screwdrivers, cables, bowls for the screws, and so forth.
I don't have the money or knowledge to do extensive modding, so I just decided to do Axiumone's VRM fan speed mod. I unscrewed all the screws and tried to take off the shroud, but couldn't. Perplexed, I found his tutorial on my phone and saw that I had accidentally unscrewed the wrong screws (the ones on top, not on the sides).
So I put all the screws back and started looking for a screwdriver that fit the smaller side screws. Turns out, though, that we had no such thing. I had to go all the way down to our local hardware store and buy some.
I finally got back home and continued to tinker with the card. Problem is, one of the screws absolutely refused to budge. All the other ones came free fine, but this one just refused. Luckily, the cable was at the other end of the card, so I could (awkwardly) get it out and connect it to the mod cable.
At that point, I had uninstalled my display drivers and my PC was sound asleep. I disconnected everything and lifted it onto the table. I then removed the CPU cooler (needed to add new paste anyway, as I had done a bad job last time), removed the exhaust fan and replaced it with the GPU cooler, and put the card into its slot. I put the CPU cooler (which barely fit with the GPU cooler, I might add) back on, closed the whole thing up, put some wheels on it because it's damn heavy, and connected everything back up.


After several reboots and installations, everything seems to work perfectly. (Ran Firestrike Vanilla, and even though the result file was corrupted for some reason, my fps never came below 85.)

The only problem I have is that, while I have connected the VRM fan to a mobo header and should be able to control it, I don't know _how_.
In Axiumone's video he used Corsair Link, but I don't know which fan is the one, nor how to turn it up or set it to turn up automatically.
I remember reading that AMD locked the fan speed at 35%, so I'd like to just be able to play around with the percentages themselves without much fuss, preferably in Asus's AI Suite 3, then set it to automatically go up as load increases.


----------



## jelly4ish

Quote:


> Originally Posted by *ljreyl*
> 
> I haven't been keeping up with this thread -_-
> 
> Couple questions for you
> 1 - Is the card on the stock cooler?
> 2 - Do you only experience frame drop in 3DMark?
> 3 - Did you try doing a clean uninstall of drivers then reinstall? (DDU)
> 4 - Are you monitoring the core clocks with an app such as HWInfo64?


1 - Stock cooler
2 - Nope, Crysis 3 as well (and other graphically intensive games)
3 - Uninstalled with DDU and did a clean install of Windows; the problem persists
4 - Still get the frame drops when both GPUs are in the mid-60s


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> Not a problem at all, you will give in eventually lol ... resistance is futile .... you have a waterblock... im confident at somepoint your gonna install it


I've got my eyes on the NZXT H440


----------



## axiumone

Quote:


> Originally Posted by *Orivaa*
> 
> Finally got my 295x2 set up, and boy, was it an experience.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> It technically arrived yesterday, but since there were no one home to receive it, it went back to the post office. It's a 20 min walk from my place to the post office, but I had no other means of transportation, so I went on an extended walk in the light rain.
> I arrived at the post office, and was told that, since no one were there to receive the package, I'd have to wait until tomorrow (today) to get it. So that sucked.
> I went down there again today and got the box. It wasn't heavy, but rather large, so walking all the way back home was horrible.
> At any rate, I get home, turn on my PC so that I could DDU it, and open my package. While waiting for my PC to boot up, I get everything ready - Screwdrivers, cables, bowls for the screws, and so forth.
> I don't have the money or knowledge to do extensive modding, so I just decided to do Axiumone's VRM fan speed mod. I unscrewed all the screws and tried to take off the shroud, but couldn't. Perplexed, I found his tutorial on my phone, and saw that I had accidentally unscrewed the wrong screws (The ones on top, not on the sides.)
> So I put all the screws back together and started looking for a screwdriver that fit the smaller side-screws. Turns out, though, that we had no such things. I had to go all the way down to our local hardware store and buy some.
> I finally get back home and continues to tinker with the card. Problem is, one of the screws absolutely refused to budge. All the other ones came free fine, but this one just refused. Luckily, the cable was at the other end of the card, so I could (awkwardly) get it out and connect it to the mod cable.
> At that point, I had uninstalled my display drivers and my PC was sound asleep. I disconnected everything, and lifted it on the table. I then removed the CPU cooler (Needed to add new paste anyway, as I had done a bad job last time), removed the exhaust fan and replaced it with the GPU cooler, and put the card into its sloth. Put the CPU cooler (Which barely fit with the GPU cooler, I might add) back on, closed the whole thing, but some wheels on it because it's damn heavy, and connected everything back up.
> 
> 
> After several reboots and installation, everything seem to work perfectly. (Ran Firestrike Vanilla, and even though the result file was corrupted for some reason, my fps never came below 85.)
> 
> The only problem I have is that, while I have connected the VRM fan to a mobo header and should be able to control it, I don't know _how_.
> In Axiumone's video, he used Corsair Link, but I don't know which fan is the one, nor which how to turn it up or set it to turn up automatically.
> I remember reading that AMD locked the fan speed at 35%, so I'd like to just be able to play around with the percentages themselves without much fuss. Preferably in Asus' AIsuit 3. Then set it to automatically go up as load increases.


With AI Suite you won't be able to make it automatically adjust by GPU temps, as AI Suite only monitors the CPU temp. However, you can set up some profile shortcuts in AI Suite to ramp the fans up manually.
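For anyone who does end up scripting their own control instead of AI Suite profiles, the fan-curve part is just linear interpolation between temperature/duty points. A rough sketch; the curve values here are made up for illustration, not AMD's stock curve (which reportedly pins the VRM fan at 35%):

```python
def fan_percent(temp_c, curve=((40, 35), (60, 50), (75, 80), (85, 100))):
    """Map a GPU temperature (deg C) to a fan duty percentage by
    linear interpolation over (temp, percent) points.

    The default curve is illustrative only -- tune the points to
    your own card and noise tolerance.
    """
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    # Walk consecutive pairs of points and interpolate within the segment.
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])  # above the last point: full duty for that point
```

Whatever tool actually drives the header then just needs to be fed `fan_percent(current_gpu_temp)` on a timer.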


----------



## Orivaa

Quote:


> Originally Posted by *axiumone*
> 
> With AI suite you wont be able to make it automatically adjust by GPU temps as AI suite only monitors the CPU temp. However, you can set up some profile shortcuts in AI suite to ramp up the fans manually.


Can I set it to adjust by percentage so I can actually understand whatever the hell it means?


----------



## axiumone

Quote:


> Originally Posted by *Orivaa*
> 
> Can I set it to adjust by percentage so I can actually understand whatever the hell it means?


I don't remember, but I don't think AI Suite allows you to control by percentage.


----------



## Orivaa

Quote:


> Originally Posted by *axiumone*
> 
> I don't remember, but I don't think that AI suite allows you to control by percentage.


What would be your semi-professional opinion, then?


----------



## axiumone

Quote:


> Originally Posted by *Orivaa*
> 
> What would be your semi-professional opinion, then?


Well, I'm actually using AI Suite to control all my fans right now as well. I pretty much just switch the fans to full tilt whenever I play anything; otherwise my cards start to overheat.


----------



## Orivaa

Quote:


> Originally Posted by *axiumone*
> 
> Well, I'm actually using AI suite to control all my fans right now as well. I pretty much just switch the fans to full tilt whenever I play anything. Otherwise my cards start to overheat.


Fair enough.

Now, time for something else.

I bought this card from Sapphire on Amazon.co.uk.
I read somewhere that if you buy certain AMD cards, you get some free games through their Never Settle promotion. It says on their site that you need a coupon, which I did not receive in my package. Any way I can get that? I live in Denmark, btw.


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> Fair enough.
> 
> Now, time for something else.
> 
> I brought this card from Sapphire on Amazon.co.uk.
> I read somewhere that if you buy some specific AMD cards, you'll get some free games in their Never Settle promotion. It says on their site that you'll need a coupon, which I did not receive in my package. Any way I can get that? I live in Denmark, btw.


Welcome to the club. When I bought my card through Newegg.com I got it e-mailed to me after the fact. If you don't get it within a week, start popping off some e-mails to them asking what's the deal.

On an entirely unrelated note, I just went through a stressful couple of hours. Yesterday evening I said screw it, it's been 6 months with this powerful PC and I am still on a rinky-dink Samsung 204B, so I pulled the trigger on Samsung's 4K monitor, the UD590. I ordered it through Amazon with next-day shipping, so here I sat unboxing and connecting. I expected everything to be uneventful: disconnect here, plug in there. Alas, NOPE!

I removed the DVI connector from the 290X and plugged the mini DisplayPort cable into the 295X2, fired up the machine, and beep beep beep beep beep beep. I was like, no freaking way my motherboard just took a dump on me right now. Since these shenanigans had already happened to me before, when I was trying to tri-fire the 290X with the 295X2, I assumed the same demons were back. I disconnected the 8-pin cables from the 295X2 so only the 290X was powering up and used the DisplayPort-to-DisplayPort cable that came with the monitor to connect it to the 290X. Fired up the PC again and still got the 6-beep error code. So I swapped the cable for an HDMI cable instead; turned on the PC and everything fired right up normally. I got to the login screen, shut down the PC, swapped the cables again to use the DisplayPort, fired her up again, and once again everything was fine, so I shut it down again. This time I plugged the 8-pin power cables back into the 295X2 and restarted... beep beep beep beep beep beep... again. At this point I was like, c'mon! So I disconnected the power cables to the 290X and plugged the mini DisplayPort into the 295X2, fired her up, and again everything started up fine, so I shut her down. Finally I reconnected the power cables to the 290X and restarted the computer the exact same way I had attempted in the first place, and sure enough the system booted fine, posted, and reached the login screen. What manner of sorcery is this? Shenanigans!


----------



## ljreyl

Quote:


> Originally Posted by *jelly4ish*
> 
> 1 stock cooler
> 2 nope crysis 3 aswell (and other graphically intensive games)
> 3 uninstalled with DDU and a clean install of windows problem persists
> 4 still get the frame drops when both gpu's are mid 60's


For #4, that's monitoring GPU temps. I'm curious to see if your core clocks fluctuate when your frames drop. Can you check?
Stock clocks too, right?
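If checking by eye is a pain, a GPU-Z sensor log can be scanned for clock dips in a few lines. A sketch, assuming a CSV-style log; the filename, column header, and thresholds are assumptions to adjust to whatever your log actually contains:

```python
import csv

# Assumed GPU-Z-style sensor log: one CSV row per sample.
LOG_PATH = "gpu-z_log.csv"          # assumed filename
CLOCK_COL = "GPU Core Clock [MHz]"  # assumed column header
BASE_CLOCK = 1018.0                 # 295X2 stock core clock in MHz
TOLERANCE = 25.0                    # dip (MHz) below base that counts as throttling

def find_clock_dips(path=LOG_PATH):
    """Return (sample_index, clock) pairs where the core clock dipped."""
    dips = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            try:
                clock = float(row[CLOCK_COL].strip())
            except (KeyError, ValueError, AttributeError):
                continue  # skip malformed or incomplete rows
            if clock < BASE_CLOCK - TOLERANCE:
                dips.append((i, clock))
    return dips
```

If the dips line up in time with the frame drops, it's throttling; if the clocks hold steady while frames still drop, the problem is elsewhere (drivers, VRMs, etc.).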


----------



## ljreyl

Just wanted to give an update on my rig, which has been FINALIZED!

Before:


After:



So what did I change?
Basically, I got rid of the "C" loop and the 90-degree elbow from the bottom card to the pump/res.
I waited until I got 3/8" x 5/8" Monsoon fittings (they were the shortest fittings that could make this bridge work vs. the C loop).
Also, I added another fan to the front rad in push and mounted the pump lower. (The front rad is in push/pull.)

By removing (2) 45s and (1) 90, temps went down a whopping 2°C, probably from the slightly increased flow and the extra fan on the front rad.
I also overclocked my 4790K to 4.6GHz from 4.5GHz now that I had more headroom for higher temps.

Final clocks and temps:
4790K - 4.6GHz with 75°C MAX
295x2 - 1061 Core / 1400 Memory with 61°C/62°C MAX (stock volts)
290x - 1061 Core / 1400 Memory with 60°C MAX (stock volts)

Loop is:
Res > Pump > 240mm > CPU > 360mm > 295x2 > 290x > Res

I SWEAR that I am done for a very long time lol...


----------



## Syceo

Quote:


> Originally Posted by *ljreyl*
> 
> Just wanted to give an update on my rig, which has been FINALIZED!
> 
> Before:
> 
> 
> After:
> 
> 
> 
> So what did I change?
> Basically, I got rid of the "C" loop and the 90 degree elbow from the bottom card to the pump/res.
> I waited until I got 3/8 5/8 monsoon fittings (they were the shortest fittings that could make this bridge work VS the C loop)
> Also, I added another fan to the front rad in push and mounted the pump lower. (Front rad is in push/pull)
> 
> By removing (2) 45's a (1) 90, temps went down a whopping 2c, probably from slightly increased flow and the extra fan for the front rad.
> I also overclocked my 4790K to 4.6GHz from 4.5GHz now that I had more headroom for higher temps.
> 
> Final clocks and temps:
> 4790K - 4.6GHz with 75c MAX
> 295x2 - 1061 Core / 1400 Memory with 61C/62c MAX (Stock volts)
> 290x - 1061 Core / 1400 memory with 60c MAX (Stock volts)
> 
> Loop is:
> Res > Pump > 240mm > CPU > 360MM > 295x2 > 290x > Res
> 
> I SWEAR that I am done for a very long time lol...


Much cleaner without that "C" loop.







But why do I get the feeling you're not done just yet lol... that flow bridge must have been a pain to get in...


----------



## jelly4ish

Quote:


> Originally Posted by *ljreyl*
> 
> For #4, thats monitoring GPU temps. I'm curious to see if your core clocks fluctuate when your frames drop. Can you check?
> Stock clocks too right?


How do I get the on-screen display of core usage/temps/frames whilst running Firestrike? I've not figured that out yet.


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> how do i get the on screen display of core usage/temps/frames whilst running fire strike? I've not figured that out yet


Afterburner > Settings > Monitoring > toggle the ones you want in the on-screen display.


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> Afterburner > Settings> Monitoring> toggle the ones you want in on screen display


Cheers,

OK, I'm considering (considering) adding a small loop for the graphics card, assuming the replacement pads don't instantly resolve the frame drops (nothing ever seems to go as planned).

I've found this:

http://www.overclockers.co.uk/showproduct.php?prodid=WC-140-XS&tool=3

which seems like it would fit in my case and make use of my unused drive bays. Is this a good solution?

I'm envisioning setup is:

res pump drive bay combo > water goes to inlet on card > water comes out of outlet on card > water goes into inlet of front mounted radiator > water comes out of outlet on front mount radiator > water goes back into pump res combo > repeat

If I'm correct, I would need to purchase:

1 x radiator
1 x pump res combo
1 x 2 meters tubing
compression fittings - do these usually come included with radiators etc.?

If I'm grossly misunderstanding this, I'd appreciate some input.


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> Cheers,
> 
> OK I'm considering (considering) adding in a small loop for the graphics card, assuming the replacement pads don't instantly resolve the frame drops (nothing ever seems to go as planned)
> 
> I've found this:
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-140-XS&tool=3
> 
> which seems like it would fit in my case and make use of my unused drive bays. Is this a good solution?
> 
> I'm envisioning setup is:
> 
> res pump drive bay combo > water goes to inlet on card > water comes out of outlet on card > water goes into inlet of front mounted radiator > water comes out of outlet on front mount radiator > water goes back into pump res combo > repeat
> 
> If i'm correct i would need to purchase:
> 
> 1 x radiator
> 1 x pump res combo
> 1 x 2 meters tubing
> compression fittings - do these come included usually with radiators etc?
> 
> If I'm grossly misunderstanding this Id appreciate some input


What case are you using?


----------



## Syceo

Quote:


> Originally Posted by *Syceo*
> 
> What case are you using ?


And no, you'd have to buy the compression fittings. Don't forget you'll need to add an additional bit of tubing that you can hide in the back of the case (and use as a drain when you want to empty the loop).


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> Cheers,
> 
> OK I'm considering (considering) adding in a small loop for the graphics card, assuming the replacement pads don't instantly resolve the frame drops (nothing ever seems to go as planned)
> 
> I've found this:
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-140-XS&tool=3
> 
> which seems like it would fit in my case and make use of my unused drive bays. Is this a good solution?
> 
> I'm envisioning setup is:
> 
> res pump drive bay combo > water goes to inlet on card > water comes out of outlet on card > water goes into inlet of front mounted radiator > water comes out of outlet on front mount radiator > water goes back into pump res combo > repeat
> 
> If i'm correct i would need to purchase:
> 
> 1 x radiator
> 1 x pump res combo
> 1 x 2 meters tubing
> compression fittings - do these come included usually with radiators etc?
> 
> If I'm grossly misunderstanding this Id appreciate some input


Oh, and one more thing: I personally would go for a non-bay solution. That way you're not restricted if, in the future, you want to change your case, layout or loop design.


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> Oh and one more thing, I personally would go for a Non-bay solution, that way your not restricted if in the future you want to change your case, layout or loop design


Quote:


> Originally Posted by *Syceo*
> 
> And no, you'd have to buy the compression fittings, dont forget you will need to add an additional bit of tubing that you can hide in the back of the case (and use it as a drain for when you want to drain your loop)


I'm using the Corsair CC-9011050-WW Carbide Series SPEC-01 Mid-Tower ATX as my case.

There really isn't any room for a non-bay solution; in fact, I'm not sure I'll even be able to fit a modest radiator at the front. I'll have to check.


----------



## electro2u

Quote:


> Originally Posted by *jelly4ish*
> 
> I'm using "Corsair CC-9011050-WW Carbide Series SPEC-01 Mid-Tower ATX " as my case
> 
> There really isn't any room for a none bay solution, in fact I'm wary if I will even be able to fit a modest radiator at the front I will have to check.


One piece of advice I wish I had taken seriously when I decided to go water:
Go slow. Very slow. On everything.
Make sure this is really necessary (it isn't).

Think about everything very, very carefully. Don't make any decisions on impulse. Don't order anything out of excitement.
I have a ton of stuff I misordered, changed my mind about, or flat-out destroyed out of stupidity and hurry.


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> One piece of advice I wish I had taken seriously when I decided to go water is:
> Go slow. Very Slow. On everything.
> Make sure this is really necessary (it isn't)
> 
> Think about everything very very carefully. Don't make any decisions on impulse. Don't order anything out of excitement.
> I have a ton of stuff I misordered, changed my mind about or flat out destroyed out of stupidity and hurry.


What this man said^

I misordered several parts and ended up shorting myself parts I needed when the time came to put it together. I ended up ordering stuff three times before I finally had the loop up and running in its current state.


----------



## magicdave26

Quote:


> Originally Posted by *magicdave26*
> 
> Selling my 295X2 after 2 weeks of owning it - drivers / crossfire support is non existent - no point in having a GPU like this when you can only use half of it in 70% of games


Sold; bought an EVGA SC 980. Thanks AMD, it was fun.


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicdave26*
> 
> Selling my 295X2 after 2 weeks of owning it - drivers / crossfire support is non existent - no point in having a GPU like this when you can only use half of it in 70% of games
> 
> 
> 
> Sold, bought an EVGA SC 980 - thanks AMD it was fun

Good thing you did the research before you pulled the pin huh?


----------



## magicdave26

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good thing you did the research before you pulled the pin huh?


Meaning the 295X2 pin?


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Good thing you did the research before you pulled the pin huh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meaning the 295X2 pin?

Of course. Still no idea what your problems were, as mine is working great in the games I've been playing.


----------



## magicdave26

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Of course, Still no idea what your problems were as mine is working great for me in the games i've been playing


Well, I didn't buy it; AMD gifted it to me, and I decided to pay it forward, so to speak, so I gave my 290 to someone who needed a new card. Then I discovered that CF has the same issues it has always had. I don't look at it as a bad thing though; the whole process made two people happy in the end. I don't have a bad thing to say about AMD, and I've enjoyed every day I've been with them, but they struggle with multi-everything. If you keep to a single GPU and a single LCD, it's all good; once you start adding to either, the problems start.


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicdave26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Of course, Still no idea what your problems were as mine is working great for me in the games i've been playing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well I didn't buy it, I was gifted it by AMD, and decided that I would pay it forwards so to speak, so gave my 290 to someone who needed a new card, then discovered that CF has the same issues it has always had - I don't look at it as a bad thing though, the whole process made two people happy in the end - I dont have a bad thing to say about AMD, enjoyed every day Ive been with them, but they struggle with Multi everything, if you keep with single gpu and single lcd, it's all good, once you start adding to either of them, the problems start

Wow... they gave it to you and then you sold it?

Fair enough man, but like I said before, I'm not having those issues and I'm running a very similar setup to yours.


----------



## magicdave26

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Wow......They gave it to you and then you sold it?
> 
> fair enough man, but i said before....I'm not having those issues and I'm running a very similar setup to you


I know, a lot of people with similar setups have a much better experience than me; not sure what's going on. But yeah, they gave it to me, and that gave me the opportunity to get someone else into 2014 gaming with the 290. My plan was that if CF was still broken I would have a 290X, but something wasn't right and I got worse performance than a single 290 most of the time. There was the odd game that worked flawlessly with CF, but only around 5% of everything I played worked; the rest was terrible. People suggested I was CPU-bottlenecked, but I don't think it was that, since usage was nowhere near 100% on any core in anything other than benches.

I didn't really come out ahead either; it only went for £490, and after eBay and PayPal fees plus delivery I got back around what the 290 cost me. Just pleased to be back with a single GPU again, really.


----------



## xer0h0ur

If I was going with a new single GPU solution I would be going with one of those new 8GB cards at least.


----------



## ljreyl

Quote:


> Originally Posted by *Syceo*
> 
> much cleaner without that "c" loop
> 
> 
> 
> 
> 
> 
> 
> but why do i get the feeling your not done just yet lol..... that flow bridge must have been a pain to get in.....


Massive pain, but it's a lot more visually appealing now


----------



## Orivaa

Noob question, but how do you set up a profile for a specific game and disable crossfire?


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> Noob question, but how do you set up a profile for a specific game and disable crossfire?




Open CCC, click the Gaming tab, then 3D Application Settings, click Add, and find and select the game's executable. Once the profile has been created it will be listed just below or above the Add button in that section. Click on the profile you made and, once it's selected, scroll to the bottom for the CrossFire setting.


----------



## magicdave26

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I was going with a new single GPU solution I would be going with one of those new 8GB cards at least.


Quite liked PhysX and TXAA last time I had an nVIDIA card and didn't want to be stuck with a 5450 until they release the Ti or TII


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> You have to open the CCC, click the Gaming tab, click 3D application settings, click add, find and select the executable for the game. Once the profile has been created it will be listed below or above the add button within this 3d application settings section. Click on the profile you made (below/above the add button) and once its selected scroll to the bottom for the crossfire setting.


What if it doesn't give you the option? Tried 3 different .exes, but they all just gave the "frame pacing" option on Crossfire and nothing else.


----------



## ljreyl

Quote:


> Originally Posted by *Orivaa*
> 
> What if it doesn't give you the option? Tried 3 different .exes, but they all just gave the "frame pacing" option on Crossfire and nothing else.


After you make the profile and you choose the .exe, you need to click the program on the left hand side of the window (below "System Settings" and "Application Settings"). Then, it'll show "CrossfireX Mode" under the "Frame Pacing" option.


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> What if it doesn't give you the option? Tried 3 different .exes, but they all just gave the "frame pacing" option on Crossfire and nothing else.


Are you not clicking on the created profile before you try modifying settings? Your listed options should be Default Crossfire settings, Optimize 1x1, AFR Friendly, Disabled, Use AMD Predefined Profile. Basically follow this guy's instructions on creating and modifying a CCC crossfire profile:


----------



## Mega Man

Quote:


> Originally Posted by *magicdave26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> If I was going with a new single GPU solution I would be going with one of those new 8GB cards at least.
> 
> 
> 
> Quite liked PhysX and TXAA last time I had an nVIDIA card and didn't want to be stuck with a 5450 until they release the Ti or TII
Click to expand...

PhysX - meh. It's simplistic and, tbh, doesn't add anything to the game (for me). I just LOLed when I saw it; I will never buy a video card for that crap.

Sad, though: the original code was quite optimized for multi-core, but NVIDIA stripped that out when they bought it.


----------



## Syceo

For real... lol, are you guys serious?

And Megaman, why are you listing everything you own? What's next? Houses, cars, bank accounts with savings?? C'mon guys, haha, there are bigger things going on right now than the need to insult one another over such petty things... just saying.


----------



## boredmug

Aye! Some men are bigger than others. .


----------



## electro2u

Quote:


> Originally Posted by *boredmug*
> 
> Aye! Some men are bigger than others. .


Must be weird for girls having everyone be able to see the size of your video cards.

Let's try to stay on topic and out of measuring contests, boys.


----------



## axiumone

Ahem... well, that escalated quickly.

Taking the conversation back on topic a little: after having owned AMD hardware for the last 5 generations, I think I'm going for big Maxwell next. Now that NVIDIA supports 5x1 display configurations, I'm really curious to see if their drivers have improved for Surround.

I think AMD still has a lot of work to do as far as XDMA goes; that's very evident when using more than 2 cards in crossfire. Plus, the fact that they're severely limiting the number of display outputs on their cards is a major drawback. My ideal setup is 5x1 + 1 @ 120 Hz; however, there's no way to do that on a 295X2, while it should technically be possible on the new NVIDIA cards.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you not clicking on the created profile before you try modifying settings? Your listed options should be Default Crossfire settings, Optimize 1x1, AFR Friendly, Disabled, Use AMD Predefined Profile. Basically follow this guy's instructions on creating and modifying a CCC crossfire profile:


Oh yeah, I didn't click on the profile afterwards. Silly me.
However, it didn't fix my issue.
After I had set up and installed everything, I wanted to see how it performed in a couple of games. (Or rather, I wanted to see if there were any obvious bugs or issues.)
First I tried Assassin's Creed: Revelations.
It crashed in the first second of the intro logo.
Then I tried Assassin's Creed: Brotherhood.
It crashed in the first second of the logo.

That's when I tried disabling Crossfire and failed. Now I've done it properly, though, and it still crashes. I also tried deleting the intro videos, and it still crashes. This didn't happen on my old AMD card. Any ideas?


----------



## xer0h0ur

What are you doing? Why are you deleting videos and why are you not able to do something as simple as disabling crossfire for a specific game? Are you even sure you pointed it to the correct executable?


----------



## Mega Man

I seriously don't see these issues you all talk about. OK, issues with Valley in DX9, but how do you know those are driver issues?

Beyond that, I play DX9 games without issue. But if you have the option of DX9 vs DX11, which would you choose?

None of the 3 above-mentioned setups have given me problems.

And all are quadfire systems.

As to the rest: he stated I was behind; I showed I was not. I never listed everything I have, by far. But once again, he cannot even respond without acting like a child, changing letters to numbers to make "words" the community has decided it did not want.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> What are you doing? Why are you deleting videos and why are you not able to do something as simple as disabling crossfire for a specific game? Are you even sure you pointed it to the correct executable?


Was that meant for me...?
Silly me, of course it was.

I deleted the intros because the crash happened during the logo intro for both games, so I just wanted to give it a shot.
I _did_ disable crossfire for the game, finally, but it made no difference. And yes, it was pointed at the correct .exe.

I did mostly know how to disable crossfire before I asked here; I just didn't know that Catalyst doesn't automatically select the new profile after you make it, so the fact that the option didn't show up confused me. Figured I might as well ask here to be sure.


----------



## xer0h0ur

I don't believe anyone else here is deleting game videos. Are you using the 14.11.1 beta?


----------



## xer0h0ur

There's a reason a beta was released literally right after 14.9 dropped: it's buggy. 14.9 sucked for me and 14.9.1 was equally bad. 14.9.2 was fine, and the new 14.11.1 beta was released specifically because of AC Unity, so it makes no sense that you're not using it.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is a reason a beta was released litetally after the 14.9 was dropped. Its buggy. 14.9 sucked for me and 14.9.1 was equally bad for me. 14.9.2 was fine and the new 14.11.1 beta was released as a beta because of AC Unity so it makes no sense you're not using it.


AC Unity is bugged as hell, regardless of drivers, though. ^__^
But fair enough, I'm installing the beta now.


----------



## xer0h0ur

Oh yeah I know, even the console version was buggy as hell. I have yet to get my code for the free copy of AC Unity from Amazon but I have played it on PS4 and there were some pretty bad bugs that made the game unplayable. Like getting stuck in the hay. Only something we do dozens of times throughout the game lolololol.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh yeah I know, even the console version was buggy as hell. I have yet to get my code for the free copy of AC Unity from Amazon but I have played it on PS4 and there were some pretty bad bugs that made the game unplayable. Like getting stuck in the hay. Only something we do dozens of times throughout the game lolololol.


At any rate, I went to the AMD driver page, and downloaded the 2nd link. However, the only thing it said wasn't up to date was Raptr.
Did I download the wrong one? 'Cause it's the newest one on the page.


----------



## xer0h0ur

The install file itself says the catalyst version it is.


----------



## xer0h0ur

This is the link for the 14.11.1 beta for 64 bit Win7/8.1: http://www2.ati.com/drivers/beta/amd-catalyst-14.11.1beta-64bit-win8.1-win7-nov9.exe

Edit: Apparently I can't direct link you. Site doesn't allow it. Read which version you're picking. The 14.11.1 is above the 14.9.1


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> This is the link for the 14.11.1 beta for 64 bit Win7/8.1: http://www2.ati.com/drivers/beta/amd-catalyst-14.11.1beta-64bit-win8.1-win7-nov9.exe
> 
> Edit: Apparently I can't direct link you. Site doesn't allow it. Read which version you're picking. The 14.11.1 is above the 14.9.1


That's the version I've been using the entire time.


----------



## StillClock1

I'm getting a little bit of fan noise under load; it sounds like a skipping sound. It goes away if I minimize the game (ACU) and comes back when I open it up. Overall not a huge deal, but in case one of you guys has dealt with it before, let me know how you handled it.


----------



## cennis

Has anyone run Far Cry 4?

It's sitting at 50% utilization and very stuttery, with 30-40 fps at 1440p.


----------



## ljreyl

Quote:


> Originally Posted by *cennis*
> 
> Anyone ran Far Cry 4?
> 
> it is sitting at 50% utlization and very stuttery with 30~40fps at 1440p


I don't think there's a driver out yet.
Try running it with crossfire disabled.


----------



## joeh4384

There is a driver for Far Cry 4 but in the release notes it said the crossfire profile was disabled and AMD was working on it with Ubisoft.

http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-11-2-BetaWINReleaseNotes.aspx

Important Notes
The AMD CrossFire™ profile for Far Cry 4 is currently disabled in this driver while AMD works with Ubisoft to investigate an issue where AMD CrossFire™ configurations are not performing as intended. An update is expected on this issue in the near future through an updated game patch or an AMD driver posting.


----------



## rakesh27

Guys,

For both AC Unity and FC4, AMD has released beta v14.11.2. Both games are playable and the stutters are gone.

AC Unity works perfectly in crossfire mode with the beta 2 driver, but for FC4 AMD has limited the beta 2 driver to one GPU to make it playable.

Well, at least now you can play both games; I'm sure AMD will rectify FC4 with crossfire support later.

Finally, they are now playable.


----------



## xer0h0ur

Unity on the 14.11.1 @ 4K ultra seemed pretty stuttery. I need to try the 14.11.2 tonight.


----------



## cennis

I have a spare sealed nickel/plexi waterblock for sale.


----------



## PINKTULIPS7

After I installed the card, it kept going through boot loops right before Windows loaded. It just wouldn't get into Windows regardless of which PCI-E slot was used, so I put the old Sapphire Tri-X 290X back and everything is fine. I really don't understand what's going on.


----------



## ColeriaX

Guys, is there any way to push more than +100 mV to the card? Best clocks for my Fire Strike benching runs were 1215 MHz core / 1725 MHz mem using AB 4.0. I feel like it still has more potential. The card doesn't get past 57°C, but any further and it either hard-locks or errors out.


----------



## xer0h0ur

I can confirm that 14.11.2 Cat works better with AC Unity than the 14.11.1. I get pretty good GPU usage across the board.


----------



## xer0h0ur

I take it back; as soon as I tri-fired the game, it's now non-stop BSODing when loading.

Edit: The 295X2 crossfired works fine. Tri-fire is what brings the BSODs.


----------



## rakesh27

I'm running tri-fire; there could be many things that cause a BSOD. I suggest you uninstall CCC properly, then reinstall it, install the MSI overclocking tool, make sure you disable the power saving (ULPS) for your tri-fire, and turn on unified GPU monitoring.

Also try disabling your internet security software when playing games. If you're overclocking, check all your settings; you might need to add a little more voltage.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> I take it back, as soon as I tri-fired the game its now non-stop BSODing when loading.
> 
> Edit: 295X2 crossfired works fine. Tri-fire is what brings the BSODs.


What settings are you running?
Did you do a clean install of the latest drivers?
What games make you BSOD?

In my experience, CCC offers the most stable method of overclocking and HWInfo64 offers the best monitoring.


----------



## xer0h0ur

Settings are maxed out @ 4K with no AA. I have already tried running AC Unity without any overclocks, and the BSOD is unavoidable so far. The driver was DDU'd and installed twice. Like I said before, it's only tri-fire that brings these demons out. The 295X2 crossfired works fine, and the 290X alone also works fine. It's only when tri-fired that I get this BSOD.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Settings are maxed out @ 4K with no AA. I have already tried running AC Unity without any overclocks and BSOD is unavoidable so far. Driver was DDUed and installed twice. Like I said before its only tri-fire that brings these demons out. The 295X2 crossfired works fine and the 290X alone also works fine. Its only when tri-fired I get this BSOD.


I've pretty much never had a situation outside of benchmarks where trifire worked out particularly well. I'd like to kick Kyle Bennett in the nuts for leading me to believe that was a good idea. I'm sure it works for some games, but for me it was a terrible mistake, along with the 295 in the first place. I have a new system I'm building around a single 980; interested to see how that compares.


----------



## rakesh27

OK, let's see if we can all help him solve this problem (295X2 + 290X BSOD, 4K gaming):

1) Do all games BSOD, or is it just AC Unity and maybe a few others?
2) Did you install the beta 2 driver, as suggested by AMD for AC Unity and FC4?
3) Have you disabled ULPS for your cards? Like I said, install the MSI overclocking tool and you can do it there.
4) When playing games, have you tried with all non-essential processes not running, e.g. AV, anti-spyware, internet, overclocking tools?
5) Are your temps OK?
6) I know you are running a 1250 W PSU, but have you used an online PSU calculator to check that it can handle everything you have under load?

That's all I can think of at the moment.


----------



## electro2u

Xer0h0ur is smart and rarely needs help with anything at all. The problem is the crossfire profile for AC Unity, I suspect. And/or the game itself.


----------



## Orivaa

Quote:


> Originally Posted by *electro2u*
> 
> Xer0h0ur is smart and rarely needs help w anything at all. The problem is the crossfire profile for ac unity I suspect. And/or the game itself.


Why anyone is even playing the game yet is beyond me. Wait a month or two for all the relevant patches to come out. Not like there aren't other new games one could play in the mean time.


----------



## xer0h0ur

Yes, I am using the 14.11.2 beta driver, and I disable ULPS on every driver install. I'm not even certain I can play offline, as it's a free copy of the game I got from Samsung and it forces me to use that Ubisoft launcher or whatever that connects to the net. I haven't managed to play any other games using this driver, other than Skyrim, which works fine even without disabling tri-fire as I normally do for DX9 games. I'm not really complaining, since I have been playing it on PS4 anyway; I figure it's going to take a few months to get the demons out of the game and drivers for PC. I was just curious what the PC game was like at 4K with tri-fire frame-rate-wise, since 295X2 crossfire gives me around 30-50 fps depending on how many NPCs there are. The PSU is not even remotely a factor here.


----------



## rakesh27

OK, to be honest I think AC Unity, FC4, COD: AW and DA:I are not perfected yet; as the other guy said, we need to wait for game patches and for the drivers to mature.

It's a shame that we can't use our rigs to their full potential on new games, but I think that's always going to be the case with PC gaming.

I'll tell you one thing though: PC gaming beats any console. On a PC the graphics, sound and gameplay are 10x better than on consoles; just be patient.

All those games will be fixed in time, and since we have choice there are many other games that work fine you could play. Maybe you should be posting your question on the Ubisoft PC gaming forum; someone there may be able to help you tweak the game so it runs fine.

Good luck.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> I've pretty much never had a situation outside of benchmarks where trifire worked out particularly well. I'd like to kick kyle bennett in the nuts for leading me to believe that was a good idea. I'm sure it works for some games but for me it was a terrible mistake along with the 295 in the first place. I have a new system I'm building around a single 980. Interested to see how that compares.


Interested to see how you get on with your new build Electro


----------



## xer0h0ur

I'm holding off on building an X99 rig based on a 5960X and an 8GB 390X. God help me if they make a 395X2, because I will be like a moth to the flame. Crossfire didn't let me down; tri-fire did.


----------



## crystaldark

I have 2 295x2s and they are both dead. My computer stopped booting with the first one installed a few days ago, and now it is the same story with the second one. They fail in two different ways. I am going to make a video about it tomorrow. I haven't ever had buyer's remorse this badly.


----------



## crystaldark

Quote:


> Originally Posted by *PINKTULIPS7*
> 
> After I install the Card it kept going through loops right before the Windows!!!I t just don't go to Windows regardless what Pci-E Slots is being used so put the old Sapphire Tri-X 290X back and everything is fine, I really don't understand what's going on?


I had the same issue for a while before my cards died. I have two possible solutions:

1. System restore. I had the same problem several times, and every time there was a "Windows Modules Update" restore point from the same day I started having the problem. I disabled the Windows Trusted Installer service (which effectively stops any Windows updates from applying) and haven't run into that particular issue again (it was far from the only issue I was having, though).

2. Enter the BIOS, change nothing, and then exit the BIOS. I read about this working on the ASUS forums. I wish I could have tried it myself instead of using system restore, but it might work for you.


----------



## xer0h0ur

Quote:


> Originally Posted by *crystaldark*
> 
> I have 2 295x2s and they are both dead. My computer stopped booting with the first one installed a few days ago, and now it is the same story with the second one. They fail in two different ways. I am going to make a video about it tomorrow. I haven't ever had buyer's remorse this badly.


Are you sure you weren't having power issues? Two 295X2s require a fair amount of amps, which most power supplies can't handle.


----------



## PINKTULIPS7

Quote:


> Originally Posted by *crystaldark*
> 
> I had the same issue for a while before my cards died. I have two possible solutions:
> 
> 1. System restore. I had the same problem several times and every time there was a "Windows Modules Update" restore point on the same day I started having the problem. I disabled the Windows Trusted Installer service (which effectively stops any Windows updates from applying) and hadn't run in to that particular issue again (it was far from the only issue I was having though)
> 
> 2. Enter the BIOS, change nothing, and then exit the BIOS. I read about this working on the ASUS forums. I wish I could have tried it myself instead of using system restore, but it might work for you.


I reset the BIOS three times, then restored factory defaults, but no go. I changed PCI-E slots, still no go, so it's going back to Newegg.


----------



## rakesh27

X has a point; you definitely need a powerful PSU to run 2x 295X2.

You need at least a 1500 W PSU to run them, not to mention some power left over for the rest of your rig.

The better the PSU, the fewer problems you'll have.
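To put rough numbers on that, here's a quick back-of-the-envelope sketch. The wattages are ballpark assumptions for illustration, not measured figures (the 295X2 is commonly quoted at around 500 W board power):

```python
# Rough PSU headroom estimate for a dual R9 295X2 rig.
# All wattages are ballpark assumptions, not measurements.
def psu_headroom(psu_watts, gpu_watts=500, gpu_count=2,
                 cpu_watts=150, rest_watts=100):
    """Return spare capacity (watts) after the estimated system load."""
    load = gpu_watts * gpu_count + cpu_watts + rest_watts
    return psu_watts - load

print(psu_headroom(1500))  # a 1500 W unit leaves ~250 W spare
print(psu_headroom(1250))  # a 1250 W unit has essentially no headroom
```

Which lines up with the advice above: on a 1250 W unit you're running flat out, while 1500 W leaves at least a little margin for overclocks and transient spikes.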


----------



## rakesh27

Quote:


> Originally Posted by *rakesh27*
> 
> X, Has a point , you definitely need a powerful PSU to run 2x295x2.
> 
> You have to or need a 1500watts psu to run them not to mention, some power left over to run your rig.
> 
> The better psu you have the less problems will have.
> 
> Check all your drives, and maybe try a Windows repair...


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> For real ... lol are you guys serious ,
> 
> and megaman why are you listing everything you own, what's next ? houses , cars, bank accounts with savings?? c'mon guys haha bigger things going on right now than the need to insult one another over such petty things... just saying


Quote:


> Originally Posted by *xer0h0ur*
> 
> If I was going with a new single GPU solution I would be going with one of those new 8GB cards at least.


Guys, I've finally sorted it. Those 1.5 mm pads did the trick; it was the VRAM throttling! Now getting a steady 120-130 fps during Fire Strike with no drops or stutter whatsoever.

Thanks to everyone who helped, especially Syceo for sending those pads gratis, and xer0h0ur for breaking the issue down into pictures that my simple mind could understand.









Think I'm going to swerve the water cooling for a bit and just enjoy leaving the side panel on my rig.


----------



## Syceo

Glad you sorted it, mate. Can I get the PSU cable back, please?


----------



## jelly4ish

Quote:


> Originally Posted by *Syceo*
> 
> Glad you sorted it mate. Can I get the psu cable back pls.


yeah mate I'll mail it Monday


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> yeah mate I'll mail it Monday


Cheers


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *crystaldark*
> 
> I have 2 295x2s and they are both dead. My computer stopped booting with the first one installed a few days ago, and now it is the same story with the second one. They fail in two different ways. I am going to make a video about it tomorrow. I haven't ever had buyer's remorse this badly.
> 
> 
> 
> Are you sure you weren't having power issues? Two 295X2s require a fair amount of amps, which most power supplies can't handle.
Click to expand...

This was going to be my question; when two things die like that, there is usually a "murderer".
Quote:


> Originally Posted by *rakesh27*
> 
> X has a point, you definitely need a powerful PSU to run 2x 295X2.
> 
> You need a 1500W PSU to run them, not to mention some power left over for the rest of your rig.
> 
> The better the PSU, the fewer problems you'll have.


I use 2x 1kW PSUs.


----------



## xer0h0ur

Quote:


> Originally Posted by *jelly4ish*
> 
> Guys, I've finally sorted it: those 1.5mm pads did the trick, it was the VRM throttling! Now getting a steady 120-130 fps during Firestrike with no drops or stutter whatsoever.
> 
> Thanks to everyone who helped, especially Syceo for sending those pads gratis, and xer0h0ur for breaking down the issue into pictures that my simple mind could understand.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Think I'm going to swerve the water cooling for a bit and just enjoy leaving the side panel on my rig.


Happy to help brother. Enjoy!


----------



## ImperialOne

I've not had any problems with Trifire. Coming from Nvidia, it has been a decent experience. I just wish their drivers were better.


----------



## electro2u

Quote:


> Originally Posted by *ImperialOne*
> 
> I've not had any problems with Trifire. Coming from Nvidia, it has been a decent experience. I just wish their drivers were better.


It's not going to work out very well if you play DX9 games like FFXIV. I do not get any performance gain from 3 cards vs 2. The utilization just drops and averages across the GPUs. Annoying.


----------



## UsTaS

Hi guys, I am new here and I have a problem with my Club3D AMD R9 295X2.

The problem is that when I play any game the FPS drops to 20-30, and when I play Guild Wars the FPS is 15 to 25.









I tried every CCC version from 14.4 to 14.11.2 with a clean install using DDU, but I get the same lag and low FPS.

I formatted my PC and did a fresh setup of all programs and a full Windows update, but the issue is still there.

Need help guys. My build is:

Intel Core i7 4790K @ 4.00 GHz

ASRock Fatal1ty Z97 pro

Club3D Radeon R9 295X2 8GB

16 GB RAM

windows 7

My PSU is an NZXT 1000W

And here is something that may help:











And thanks in advance for the help.


----------



## ImperialOne

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ImperialOne*
> 
> I've not had any problems with Trifire. Coming from nvidia, it has been a decent experience. I wish they did their drivers better.
> 
> 
> 
> It's not going to work out very well if you play DX9 games like FFXIV. I do not get any performance gain from 3 cards vs 2. The utilization just drops and averages across the GPUs. Annoying.
Click to expand...

I generally play games that use DX11 (or I select for it), though I don't doubt you'd have issues with DX9. I've heard of such issues.


----------



## arieldeboca

Hello. I have some questions:
1) Is there a difference between the different brands of this card? For example, a Sapphire 295X2 vs an ASUS 295X2?
2) Does the card have problems with VRAM temperatures? How is that solved?
3) Can you disable CrossFire on the card? If disabled, can the card use the full 8GB?


----------



## Orivaa

Quote:


> Originally Posted by *arieldeboca*
> 
> Hello. I have some questions:
> 1) Is there a difference between the different brands of this card? For example, a Sapphire 295X2 vs an ASUS 295X2?
> 2) Does the card have problems with VRAM temperatures? How is that solved?
> 3) Can you disable CrossFire on the card? If disabled, can the card use the full 8GB?


1) All the cards are manufactured by Sapphire, so you're just paying for the brand.
2) The heat issues are not solved. You CAN buy a cable to plug your VRM fan into your motherboard and thus gain control over the fan speed, which does seem to do the trick. But again, that requires you to buy the cable. (Or you can just go full water-cooling on it.)
3) You can disable CrossFire for any game. And no, it'll only use 4GB if CrossFire is disabled; the "8GB" is really 4GB per GPU, and in CrossFire the contents are mirrored.
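To put point 3 in concrete terms: with alternate-frame rendering, each GPU keeps its own full copy of every asset, so the usable pool is one GPU's 4GB no matter what the box says. A tiny illustrative sketch (the 4GB-per-GPU figure is the card's spec; the function itself is just mine for illustration):

```python
# In CrossFire's alternate-frame rendering (AFR), each GPU renders whole
# frames, so textures and buffers are duplicated into each GPU's memory.
def effective_vram_gb(per_gpu_gb, num_gpus, mirrored=True):
    """Usable VRAM for game assets, not the number on the box."""
    return per_gpu_gb if mirrored else per_gpu_gb * num_gpus

# R9 295X2: two Hawaii GPUs with 4 GB each -> marketed as "8 GB"
print(effective_vram_gb(4, 2))         # usable in CrossFire: 4
print(effective_vram_gb(4, 2, False))  # only if memory could be pooled: 8
```

The same logic explains why a trifire 295X2 + 290X setup is still effectively a 4GB configuration.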


----------



## arieldeboca

Seriously, all 295X2 cards are manufactured by Sapphire? I did not know that.


----------



## Santho

I need some help guys. I'm having some trouble with my 295X2 in BF4: my GPUs are not at 100% load, they are bouncing up and down. I'm playing fullscreen at 1920x1080 (I know, I know, 1080p on a 295X2, waste of money etc... but I'm planning to get two more monitors soon to try out surround gaming). I really want to get this problem fixed before I go and pull the trigger on two more monitors for surround. I guess if it is defective I'll just have to go back to my GTX 570...

My specs are:

MSI Z97m Gaming
i5 4690k (not overclocked)
r9 295x2(not overclocked)
Corsair AX 860w
Corsair H105 (front mounted Push/Pull)
HyperX Fury 16gb (Not overclocked)
Corsair 350d (Not really important but why not..)


----------



## Kraius

Quote:


> Originally Posted by *Orivaa*
> 
> 1) All the cards are manufactured by Sapphire, so you're just paying for the brand.


It's worth noting that the accessories will differ between brands. Some come with miniDP-to-HDMI adapters and others come with miniDP-to-DP adapters; you'd want the latter if hooking into a 4K monitor.


----------



## joeh4384

What kind of frames do you get? Also benchmark scores?


----------



## boredmug

Blah. Spoke too soon. Nm


----------



## Santho

Quote:


> Originally Posted by *joeh4384*
> 
> What kind of frames do you get? Also benchmark scores?


If you are talking to me, I'm getting everything from 60 to 100+, and I'll try running a Valley benchmark when I come home.


----------



## Cool Mike

This is crazy guys. Almost purchased a GTX980 and thought I would look at the 295x prices one more time.

To my surprise, the XFX 295X2 was $679.99 with a $30 rebate at Newegg. Grabbed it. They are still available!


----------



## joeh4384

I saw that. I would love to pick up another one if quadfire worked better; plus I think I would just run into VRM throttling issues again, like when I tried running a 290X below it.


----------



## Aznlotus161

I might just sell my 970s and hop on the deal posted above since I'm at higher 1440p resolutions.

Benchmarks tell me the R9 295X2 performs better than SLI 970s when you reach closer to 4K resolutions in most games.

With that said, will an EVGA G2 850W be enough to power the hungry beast?

Since my computer is on my desk 24/7, how is the noise at load? Expecting a tad louder than 970s which is fine to me.

I unfortunately got a dud 970 from Gigabyte so that's in Cali hopefully getting repaired at the moment.

Also selling my 970s and getting this will probably save me money actually.


----------



## PontiacGTX

The 295x2 uses 500w


----------



## Aznlotus161

Quote:


> Originally Posted by *PontiacGTX*
> 
> The 295x2 uses 500w


Yeah, I saw that, but it seems people like recommending 1000w PSUs for this card for some odd reason.

Guess they're overcompensating.

Just wanted to confirm.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aznlotus161*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PontiacGTX*
> 
> The 295x2 uses 500w
> 
> 
> 
> Yeah, I saw that, but it seems people like recommending 1000w PSUs for this card for some odd reason.
> 
> Guess they're overcompensating.
> 
> Just wanted to confirm.
Click to expand...

I'd rather have too much than too little when it comes to the PSU.

But you'll be fine, 850W is enough.


----------



## Scorpion49

Quote:


> Originally Posted by *Cool Mike*
> 
> This is crazy guys. Almost purchased a GTX980 and thought I would look at the 295x prices one more time.
> 
> To my surprise The XFX 295X2 was $679.99 and $30 rebate at NEWEGG. Grapped it. They are still available!


I actually picked one up; I got an email from Newegg today for 5% off my total purchase as well, so it will be right around $600 brand new. I'll sell my Titan and recoup some of that too, so I think it's a pretty good deal. Need more powah for DA:I.


----------



## Aznlotus161

Quote:


> Originally Posted by *Scorpion49*
> 
> I actually picked one up, I got an email from newegg today for 5% off my total purchase as well so it will be right around $600 brand new. I'll sell my Titan and recoup some of that as well, so I think its a pretty good deal. Need more powah for DA:I.


Yeah I was thinking the same thing.

However, sly as always, Newegg's game promos probably conflict with the 5%.

Also, I tried on mobile and apparently you can't get 2-day ShopRunner through their iOS app at least, so it looks like I'm ordering online.

*EDIT*: Might have second thoughts... a lot of the reviews on Newegg are 1 egg due to monitor issues or driver issues.

Any truth to that now?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150710

Also, has anyone mounted their own aftermarket fans with this GPU?

Might throw a Cougar Vortex on there.


----------



## Sgt Bilko

I've got a Noctua NF-F12 industrial 3000rpm fan on mine in push/exhaust and it's going great: cooler and quieter.


----------



## F4ze0ne

Quote:


> Originally Posted by *PontiacGTX*
> 
> The 295x2 uses 500w


That's only for the card and no system components right?

I'm thinking an AX650 wouldn't be enough to power this thing with all components. Would 750w be the minimum for the system and a 295x2?


----------



## joeh4384

I would at least go 850. 750 might work but Hawaii cards can draw a lot of power for moments at a time.


----------



## joeh4384

I use Corsair SP120 Performance Editions. I imagine some Noctua NF-F12s would be good for this too, or whatever model is good for static pressure.


----------



## Syceo

Quote:


> Originally Posted by *jelly4ish*
> 
> yeah mate I'll mail it Monday



Still waiting on that cable mate.. when do you think you can send it ?


----------



## PontiacGTX

Quote:


> Originally Posted by *F4ze0ne*
> 
> That's only for the card and no system components right?
> 
> I'm thinking an AX650 wouldn't be enough to power this thing with all components. Would 750w be the minimum for the system and a 295x2?


Just the card.

Yes, you need a minimum of 750-850W with a 32nm quad core, maybe 750W minimum with a 22nm quad core.
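For anyone sizing a PSU around this card, here's the back-of-the-envelope I'd use. The component wattages below are rough assumptions for illustration, not measurements; AMD's 500W board power for the card is the only spec'd number in the list:

```python
# Rough power-budget sketch; component figures are ballpark assumptions.
def recommended_psu_watts(component_watts, headroom=0.20):
    """Sum worst-case draw, then add headroom for transients and aging."""
    load = sum(component_watts.values())
    return load, load * (1 + headroom)

build = {
    "R9 295X2": 500,              # AMD's stated board power
    "quad-core CPU (OC'd)": 150,  # assumption
    "board/RAM/fans": 75,         # assumption
    "drives": 25,                 # assumption
}
load, psu = recommended_psu_watts(build)
print(load, round(psu))  # 750 900 -> why people land in the 850-1000W range
```

The headroom figure is the judgment call: Hawaii cards spike well above average draw, so 20-30% over the summed load is the usual rule of thumb.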


----------



## cz1g

Hey guys, I'm about to join the 295x2 club. I'm upgrading from a 560ti. I have a Coolmax ZU Series ZU-1000B. Will I be able to run this safely?


----------



## Syceo

Quote:


> Originally Posted by *cz1g*
> 
> Hey guys, I'm about to join the 295x2 club. I'm upgrading from a 560ti. I have a Coolmax ZU Series ZU-1000B. Will I be able to run this safely?


Doesn't look like your PSU is compatible:

http://www.pc-specs.com/gpu/ATI/R-200_Series/Radeon_R9_295X2/2101/Compatible_PSUs


----------



## xarot

Sorry if I'm asking the obvious questions, but I just ordered a Sapphire card. I already have an ARES III on water, but I am moving away from water cooling either completely or will keep only the ARES III on it. The ARES has been working very well for me, even with the ROG Swift at 120Hz. Great card.

However, about the 'stock' 295X2:

- Is the warranty still OK if I remove those cable ties from the hoses so I could change the stock fan on radiator to something else? A bit dumb if not.








- Is there another BIOS to raise the throttling temp?
- Or is the throttling mainly because of VRM temps and not the GPUs?
- Anything else I could do about the throttling if it happens?

I know Sleeping Dogs, for example, will cook any card, so that sounds like a tough one for this card...


----------



## TheOnlyDoor

So I've had my R9 295X2 since May. Not too many problems except for the constant throttling at 4K resolutions. Crysis 3 and Star Citizen seem to hit the 74C limit real quick. I've tried push/pull Noctuas and Corsair SP120s on the rad to no effect. Mine is the XFX version of the card, and I've actually RMA'd it to XFX because the stock rad fan stopped spinning.
Right now I'm using one ASUS 290X until XFX sends back my card.
My next question is this, for those who would know. Before I got the 295X2 I wasn't really up on VRAM amounts and their necessity for 4K, and simply took for granted that the 295X2 was perfect for 4K gaming, as that is how it was sold.
Now that I've researched it a bit and have been looking at the memory usage of my single 290X @ 4K, in BF4 and Star Citizen I'm seeing as high as 3800MB usage @ 4K.
I realize now that though the 295X2 was pitched as "8 gigs of VRAM!!!", it's only practically 4 gigs... unless there is something I'm missing???
So my guess is that real soon my uber 4K 295X2 is not going to be so uber @ 4K, considering I'm only 200MB from the 4GB ceiling of this card.
Since I bought this card in May (got my ASUS PB287Q 4K monitor in July), you can guess what I paid for it.
Can any tech heads (not guessers please) speak to this concern for me... thanks in advance!
Cheers.

PS, here is my rig BTW:
AMD FX-9590 (stock)
Crosshair V Formula-Z motherboard
Corsair H100i cooler
XFX R9 295X2 + ASUS 290X (trifire)
32GB Corsair Dominator Platinum @ 1866
EVGA 1600W P2 PSU
Samsung 840 Pro 512GB SSD
Thermaltake V71 full tower case


----------



## PontiacGTX

Quote:


> Originally Posted by *TheOnlyDoor*
> 
> So I've had my R9 295X2 since May. Not too many problems except for the constant throttling at 4K resolutions. Crysis 3 and Star Citizen seem to hit the 74C limit real quick. I've tried push/pull Noctuas and Corsair SP120s on the rad to no effect. Mine is the XFX version of the card, and I've actually RMA'd it to XFX because the stock rad fan stopped spinning.
> Right now I'm using one ASUS 290X until XFX sends back my card.
> My next question is this, for those who would know. Before I got the 295X2 I wasn't really up on VRAM amounts and their necessity for 4K, and simply took for granted that the 295X2 was perfect for 4K gaming, as that is how it was sold.
> Now that I've researched it a bit and have been looking at the memory usage of my single 290X @ 4K, in BF4 and Star Citizen I'm seeing as high as 3800MB usage @ 4K.
> I realize now that though the 295X2 was pitched as "8 gigs of VRAM!!!", it's only practically 4 gigs... unless there is something I'm missing???
> So my guess is that real soon my uber 4K 295X2 is not going to be so uber @ 4K, considering I'm only 200MB from the 4GB ceiling of this card.
> Since I bought this card in May (got my ASUS PB287Q 4K monitor in July), you can guess what I paid for it.
> Can any tech heads (not guessers please) speak to this concern for me... thanks in advance!
> Cheers.
> 
> PS, here is my rig BTW:
> AMD FX-9590 (stock)
> Crosshair V Formula-Z motherboard
> Corsair H100i cooler
> XFX R9 295X2 + ASUS 290X (trifire)
> 32GB Corsair Dominator Platinum @ 1866
> EVGA 1600W P2 PSU
> Samsung 840 Pro 512GB SSD
> Thermaltake V71 full tower case


The throttling can be caused by the VRM design on the 295X2, and the ASUS 290X has the worst cooling of any 290X.


----------



## cz1g

Quote:


> Originally Posted by *Syceo*
> 
> doesnt look like your PSU is compatible
> 
> http://www.pc-specs.com/gpu/ATI/R-200_Series/Radeon_R9_295X2/2101/Compatible_PSUs


Seems like that's the only Coolmax PSU listed on that site. Looking more into it, I found out that the ZU-1000B does not have two 8-pin connectors, so that is out of the question.

Another edit: but it has 2x 6+2-pin connectors, so I should be fine.


----------



## pompss

It's been a long time since I posted in this thread.


----------



## PontiacGTX

Quote:


> Originally Posted by *cz1g*
> 
> Seems like that's the only coolmax PSU listed on that site. Looking more into it, I found out that the ZU-1000b does not have 2 8pin connectors. So that is out of the question.
> 
> Another edit* But it has 2x 6+2 pins. So I should be fine.


That PSU should be replaced, it's not a quality unit. I saw a thread where one blew up running 2x 9800 GTX.


----------



## cz1g

Quote:


> Originally Posted by *PontiacGTX*
> 
> That PSU should be replaced, it's not a quality unit. I saw a thread where one blew up running 2x 9800 GTX.


So what's a good PSU for around 100-125 that will be good for the 295x2?


----------



## PontiacGTX

Quote:


> Originally Posted by *cz1g*
> 
> So what's a good PSU for around 100-125 that will be good for the 295x2?


120 USD? CAD? AUD? GBP?


----------



## cz1g

Quote:


> Originally Posted by *PontiacGTX*
> 
> 120usd?cad?aud?gbp?


Sorry about that. USD.


----------



## PontiacGTX

Quote:


> Originally Posted by *cz1g*
> 
> Sorry about that. USD.


$100 after rebate + code - 1000W:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817182239&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=3938566&SID=

$115 - 1000W:
http://www.amazon.com/dp/B00GMDENMA/

$130 after rebate - 1300W:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817182063&FM=1


----------



## Mega Man

EVGAs can be good ( DO NOT BUY THE NEX )

talk to @shilka


----------



## MapRef41N93W

Just bought the XFX 295x2 for $679. Never planned on it, but I picked up one of the Acer 32" 4k and I had basically no upgrade path with my setup (can't add a second GPU without skyrocketing temps) other than the 295x2. Hopefully this time my experience with AMD drivers won't be so bad. Still waiting for big Maxwell to drop to do my Quad SLI/HW-E build, but this should hold me over till then.


----------



## cz1g

Quote:


> Originally Posted by *PontiacGTX*
> 
> 100usd AR+code -1000w
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817182239&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=3938566&SID=
> 115usd-1000w
> 
> http://www.amazon.com/dp/B00GMDENMA/
> 
> 130usd AR -1300w
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817182063&FM=1


What about http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8669275&Sku= for the 295x2?


----------



## F4ze0ne

Quote:


> Originally Posted by *cz1g*
> 
> What about http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8669275&Sku= for the 295x2?


These are the ones I'm considering for the 295x2...

Corsair HX1050 ($119.99 after rebate)
http://www.newegg.com/Product/Product.aspx?Item=N82E16817139034

Thermaltake 1200w ($139.99 after rebate)
http://www.newegg.com/Product/Product.aspx?Item=N82E16817153207

Cooler Master V1000 ($184)
http://www.newegg.com/Product/Product.aspx?Item=N82E16817171078


----------



## Noyjitat

So what Mini DisplayPort cable should I buy for this card to get the most out of it on a 4K TV? I just bought a Samsung 6830.

I have a Mini DisplayPort to HDMI adapter, but I don't want to use that if it will reduce quality... will it? I had assumed I should just buy an HDMI cable with a DisplayPort connector on one end, but I don't know which one to buy.

I see all of these different speeds on cables now, and I really don't want to lose quality... that defeats the purpose of buying this card and a 4K TV.

The TV is 240Hz.
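While waiting on answers, I did some rough bandwidth math (ignoring blanking overhead, so real requirements run a bit higher). The short version as I understand it: a passive miniDP-to-HDMI adapter behaves as HDMI 1.4 and tops out at 4K@30Hz, while a native DisplayPort 1.2 link can carry 4K@60Hz; and a TV's "240Hz" is usually internal motion interpolation, with the input still capped at 60Hz. The link-rate figures below are the commonly quoted usable data rates:

```python
# Bits per second needed for a video mode (ignores blanking intervals,
# which add roughly 10-20% in real timings).
def min_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd60 = min_gbps(3840, 2160, 60)
uhd30 = min_gbps(3840, 2160, 30)

HDMI_1_4 = 8.16   # Gbps usable -> 4K only fits at 30 Hz
DP_1_2 = 17.28    # Gbps usable (HBR2) -> 4K at 60 Hz fits
print(round(uhd60, 1), uhd60 < DP_1_2, uhd60 < HDMI_1_4)  # 11.9 True False
print(round(uhd30, 1))  # 6.0 -> fits HDMI 1.4
```

So if the TV only has HDMI inputs, 4K@60Hz would need something like an active DP-to-HDMI 2.0 adapter; a straight DP cable only helps if the TV has a DisplayPort input.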


----------



## Syceo

Quote:


> Originally Posted by *cz1g*
> 
> Seems like that's the only coolmax PSU listed on that site. Looking more into it, I found out that the ZU-1000b does not have 2 8pin connectors. So that is out of the question.
> 
> Another edit* But it has 2x 6+2 pins. So I should be fine.


Quote:


> Originally Posted by *cz1g*
> 
> What about http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8669275&Sku= for the 295x2?


You are going to need a PSU with a single +12V rail, something along the lines of:

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4166605&CatId=2535

the cheapest is

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=332363&CatId=5436 but I truly would not recommend that PSU given the card you're trying to power. Unfortunately you're going to have to increase your budget and get a quality PSU (that's what I and many here would probably suggest).

Something along the lines of this is sufficient: http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8139567&CatId=2535

But if you really want to get the job done without any problems, I'd go 80 Plus Gold on something like this: http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4166605&CatId=2535

I have the 1200W version and it's a solid PSU.


----------



## SLADEizGOD

Question: I'm thinking of picking up the XFX 295X2 for $679 US, but I saw the XFX 290X for $329. Should I pick up two of those, or just the bigger card? I'm only running a QNIX 1440p screen, but I do want to run everything maxed out. Need some help on this one; it's just for gaming. And would a 1050W power supply handle it?


----------



## Scorpion49

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Question: I'm thinking of picking up the XFX 295X2 for $679 US, but I saw the XFX 290X for $329. Should I pick up two of those, or just the bigger card? I'm only running a QNIX 1440p screen, but I do want to run everything maxed out. Need some help on this one; it's just for gaming.


I got the 295X2 on that sale and it showed up today; so far I'm very happy with it. Much better quality than the reference 290s and 290Xs I've had. Runs nice and silent as well, and maintains its stock 1018MHz clock without even hitting 45°C.


----------



## SLADEizGOD

Quote:


> Originally Posted by *Scorpion49*
> 
> I got the 295X2 on that sale and it showed up today; so far I'm very happy with it. Much better quality than the reference 290s and 290Xs I've had. Runs nice and silent as well, and maintains its stock 1018MHz clock without even hitting 45°C.


But would my 1050W power supply work with it? I heard it draws a lot of power. But I only need one card, nothing too crazy.


----------



## Orivaa

Quote:


> Originally Posted by *SLADEizGOD*
> 
> But would my 1050W power supply work with it? I heard it draws a lot of power. But I only need one card, nothing too crazy.


As long as you're not using another GPU, then easily.


----------



## Scorpion49

Quote:


> Originally Posted by *SLADEizGOD*
> 
> But would my 1050W power supply work with it? I heard it draws a lot of power. But I only need one card, nothing too crazy.


Should be no problem; I'm using mine with a CM 1000W power supply, and my system's full load when benching the GPU is 645W read off of my battery backup.


----------



## F4ze0ne

Where's the best place to mount the rad for this card?

I have a very large heatsink on my cpu and a little under 3 inches of clearance above it.

Is it possible to mount it outside through the loop holes on the case?


----------



## electro2u

Quote:


> Originally Posted by *F4ze0ne*
> 
> Where's the best place to mount the rad for this card?


In the next room.









No but seriously, you can mount the radiator outside the case if you can fit it through the loop holes.









No, seriously, seriously, you can do a little case modding and make a couple cuts between the loop holes that are close to a fan mount, and cut the grill out of the fan hole. Then you can pass the rad through the fan hole and run the tubing through the loop holes.


----------



## Scorpion49

Well, I was happy with the new card before I tried to play some DA:I on it. This game is the whole reason I bought the stupid card, but I can't play the game because there seems to be a memory leak. I was running my R9 270 last night with the same 14.11.2 drivers with zero problems besides the terrible performance in such a demanding game, but now I can't even play on medium settings for more than 5 minutes with the 295x2 before I get an out of memory error. Mantle crashes right away upon loading into a save game, DX11 runs for ~5 minutes and then crashes anyways.

I'm so glad I spent this money to gain a couple of FPS over my Titan using vsync/60hz and I can't even play the game. Ha, post 6666.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I was happy with the new card before I tried to play some DA:I on it. This game is the whole reason I bought the stupid card, but I can't play the game because there seems to be a memory leak. I was running my R9 270 last night with the same 14.11.2 drivers with zero problems besides the terrible performance in such a demanding game, but now I can't even play on medium settings for more than 5 minutes with the 295x2 before I get an out of memory error. Mantle crashes right away upon loading into a save game, DX11 runs for ~5 minutes and then crashes anyways.
> 
> I'm so glad I spent this money to gain a couple of FPS over my Titan using vsync/60hz and I can't even play the game. Ha, post 6666.


Don't use Mantle? Mantle is still a buggy POS especially with multi-card setups.

Edit: wait I missed where you said DX11 does it as well.


----------



## Scorpion49

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Don't use Mantle? Mantle is still a buggy POS especially with multi-card setups.
> 
> Edit: wait I missed where you said DX11 does it as well.


Every setting does it, I'm about to move it over to my FX 6300 rig and see if it does it on that as well.

EDIT: well it works without crashing with a memory error on the FX rig, but I get the top 1/3 of the screen flashing bright green every 45 seconds or so. Maybe the card is bad after all.


----------



## cz1g

So I was looking at this http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4668507 for my 295x2. I prefer corsair, plus I can get that model for 109.99. Should be fine right? Or the second option I can do is EVGA Supernova http://www.ncixus.com/products/?usaffiliateid=1000031504&sku=94404&vpn=220-G2-0850-XR&manufacture=eVGA&promoid=1029 ?


----------



## Orivaa

Quote:


> Originally Posted by *cz1g*
> 
> So I was looking at this http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4668507 for my 295x2. I prefer corsair, plus I can get that model for 109.99. Should be fine right? Or the second option I can do is EVGA Supernova http://www.ncixus.com/products/?usaffiliateid=1000031504&sku=94404&vpn=220-G2-0850-XR&manufacture=eVGA&promoid=1029 ?


I'm using an EVGA Supernova gold 1300, and it's working wonderfully.


----------



## cz1g

Quote:


> Originally Posted by *Orivaa*
> 
> I'm using an EVGA Supernova gold 1300, and it's working wonderfully.


I was reading a lot of reviews and they said 850W is fine. Is that correct? I don't plan on overclocking or anything.

Nevermind. Decided to go with the SuperNOVA Gold 1000W.


----------



## Scorpion49

Well I fiddled some more with my card, installing the 14.9 driver and THEN the 14.11.2 over top without cleaning the drivers out completely seems to have fixed its issues. Works perfectly in both of my computers now!


----------



## electro2u

Quote:


> Originally Posted by *Scorpion49*
> 
> Well I fiddled some more with my card, installing the 14.9 driver and THEN the 14.11.2 over top without cleaning the drivers out completely seems to have fixed its issues. Works perfectly in both of my computers now!


That is downright fascinating. If you check your software version in catalyst control center I believe you'll see the version listed as 14.9. I think this tells us which version is actually running.


----------



## ImperialOne

Unless you have had problems, you do not need to completely remove old drivers when updating AMD drivers anymore.


----------



## Scorpion49

Quote:


> Originally Posted by *electro2u*
> 
> That is downright fascinating. If you check your software version in catalyst control center I believe you'll see the version listed as 14.9. I think this tells us which version is actually running.


No its listed as 14.11.2 Beta.
Quote:


> Originally Posted by *ImperialOne*
> 
> Unless you have had problems, you do not need to complete remove old drivers when updating AMD drivers anymore


It had Nvidia drivers on it previously. I wiped the Nvidia drivers and went straight to the beta driver.


----------



## Sgt Bilko

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Well I fiddled some more with my card, installing the 14.9 driver and THEN the 14.11.2 over top without cleaning the drivers out completely seems to have fixed its issues. Works perfectly in both of my computers now!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is downright fascinating. If you check your software version in catalyst control center I believe you'll see the version listed as 14.9. I think this tells us which version is actually running.
Click to expand...

Actually that's what i do with all my driver installs.

I do a DDU wipe, then install the latest WHQL driver, then the beta over the top, and I don't have any issues.


----------



## Mirdain

I had a heat issue, so I upgraded to a 240mm radiator (amazing, right? LOL).

Have two 20mm-thick and one 25mm-thick 120mm fans in a push config (the only exhaust in the case).

Seems one core is cooler than the other by about 5 degrees.

I think it's that heat transfer into the water is much quicker, so the coolant reaching the second core is already carrying the first core's heat.

I could optimize the cooling a bit more but I wanted to maintain positive pressure in the case.

Just water with a kill coil.

This was a cheap stopgap, as the total components cost around 130-150 bucks and I didn't have to go "full" water.

LOL, yeah, a $1500 graphics card and I cheap out on cooling... but I didn't have room for a full water system and I think this is pretty reliable.

I really have to push it to see one core throttle down, and it's negligible as it rebounds very fast and doesn't throttle much (the other core is around 69C when GPU2 hits 74C, and that's 100% on both cores for 15min+).

I can now beat the card to death and it doesn't have a problem.

I can overclock to 1120MHz core and 1625MHz on the RAM all day, but I really don't need to, as stock is just barely enough to max out (60fps) the game I play.


----------



## xer0h0ur

Yeah pushing high resolution really kicks these cores in the nuts temperature-wise. I just removed the waterblock and re-pasted my GPUs and the PLX chip last night. I still have roughly a 5C difference between one core and the other as you describe but at least my temps dropped back down to mid to high 60's. Granted I also changed the waterblock on my 4930K as well as the TIM on it so I am not entirely sure if that was the reason for the drop. I still need to add a bracket mounted radiator on the rear to truly drop the temps to overclock.net respectable levels


----------



## boredmug

Good Lord that's the biggest heat sink I've ever seen on a cpu!
Quote:


> Originally Posted by *Mirdain*
> 
> I had a heat issue so upgraded to a 240mm radiator (amazing right LOL?).
> 
> Have two 20mm and one 25mm 120mm fans in a push config (only exhaust in case).
> 
> Seems one core is cooler than the other by about 5 degrees.
> 
> I think it's the heat transfer on water is much quicker so 2nd core is saturated more.
> 
> I could optimize the cooling a bit more but I wanted to maintain positive pressure in the case.
> 
> Just water with a kill coil.
> 
> this was a cheap stop gap as the total components cost around 130-150 bucks and I didn't have to go "full" water.
> 
> LOL ya 1500 dollar graphic card and i cheap out on cooling....... but i didn't have room for a full water system and I think this is pretty reliable.
> 
> I really have to push it to see one core throttle down but it's negligible as it rebounds very fast and doesn't throttle down much (other core would be around 69ish when gpu2 hits 74 and that is 100% on both cores for 15min+).
> 
> I can now beat the card to death and it doesn't have a problem.
> 
> I can overclock to 1120MHz core and 1625MHz on the RAM all day, but I really don't need to, as stock is just barely enough to max out (60fps) the game I play.


----------



## Mirdain

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah pushing high resolution really kicks these cores in the nuts temperature-wise. I just removed the waterblock and re-pasted my GPUs and the PLX chip last night. I still have roughly a 5C difference between one core and the other as you describe but at least my temps dropped back down to mid to high 60's. Granted I also changed the waterblock on my 4930K as well as the TIM on it so I am not entirely sure if that was the reason for the drop. I still need to add a bracket mounted radiator on the rear to truly drop the temps to overclock.net respectable levels


That's comforting, since I was a bit worried the water might not play well; the stock fluid is much more viscous.

Almost seemed like a mineral oil.

How much radiator do you have?

I had to go with a 30mm 240mm due to case limitations.

As for the heat sink it never really gets hot but i have an intake blowing into it (140mm) and the radiator pulling off.

I'm running 4.8ghz and 4.4ghz cache on a 4790k (1.325v/1.226v) and if i stress test the cache/cpu she gets toasty around 80C (with GPU stressed to saturate case temps).

De-lidding the cpu would probably do more than water on it.

I really like the heat sink - I built it to be quiet (in a 550D case).


----------



## xer0h0ur

Quote:


> Originally Posted by *Mirdain*
> 
> - That's comforting since I was a bit worried the water might not play well since the stock fluid is much more viscous.
> 
> Almost seemed like a mineral oil.
> 
> How much radiator do you have?
> 
> I had to go with a 30mm 240mm due to case limitations.
> 
> As for the heat sink it never really gets hot but i have an intake blowing into it (140mm) and the radiator pulling off.
> 
> I'm running 4.8ghz and 4.4ghz cache on a 4790k (1.325v/1.226v) and if i stress test the cache/cpu she gets toasty around 80C (with GPU stressed to saturate case temps).
> 
> De-lidding the cpu would probably do more than water on it.
> 
> I really like the heat sink - I built it to be quiet (in a 550D case).


Right now I am under the calculated minimum for my setup, since the 295X2 has a 500W TDP and the 4930K a 130W TDP. I am using a 144mm Thermaltake aluminum radiator and a 120mm copper Alphacool NexXxoS Monsta, each with a Noctua 2000 RPM NF-F12 iPPC fan in a push configuration. I am going to end up slapping a 240 or 360 Monsta on the rear of the case, mounted to a Koolance radiator bracket, with a Koolance PCI pass-thru for the lines. If that isn't enough then dammit, I give up.
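That radiator-sizing reasoning can be sketched as a quick calculation. The ~100 W dissipated per 120 mm of radiator is a common forum rule of thumb at moderate fan speeds, an assumption here rather than a measured figure:

```python
# Rough radiator-capacity check for the loop described above.
# Assumption: ~100 W dissipated per 120 mm of radiator at moderate
# fan speed -- a rule of thumb, not a datasheet number.
W_PER_120MM = 100

heat_load_w = 500 + 130          # 295X2 TDP + 4930K TDP
radiators_mm = [144, 120]        # Thermaltake 144 mm + Monsta 120 mm

capacity_w = sum(r / 120 * W_PER_120MM for r in radiators_mm)
deficit_w = heat_load_w - capacity_w
print(f"load {heat_load_w} W, capacity ~{capacity_w:.0f} W, deficit ~{deficit_w:.0f} W")
```

Under that assumption the two small radiators cover only about a third of the combined TDP, which is consistent with the plan to bolt on another 240 or 360.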


----------



## Mirdain

Your temps are quite low from just the two radiators imo.

I would think you should be at around 360mm total, so another 240 would set you up nicely, especially externally mounted.

Did you stress test the cpu/gpu at the same time to see what your system can support?

If that temp holds when stressed for a long period I would be happy with 60-70's.......... there is a point of diminishing returns LOL.


----------



## Mega Man

a day late but here is my opinion
Quote:


> Originally Posted by *Noyjitat*
> 
> So what minidisplay cable should I buy for this card to get the most out of it on a 4k tv? I just bought a samsung 6830.
> 
> I have a mini displayport to hdmi adapter but I don't want to use that if it will reduce quality... will it? I had assumed that I should just buy
> an hdmi cable with a displayport connector on one end but I don't know which one to buy.
> 
> I see all of these different speeds now on cables and I really don't want to lose quality... it defeats the purpose of buying this card and buying a 4k tv.
> 
> The tv is 240 hz.


you have the choice of ANYTHING to hdmi.

why spend the extra money? if your card comes with a dvi to hdmi adapter you will be fine; if not, whatever is cheapest is fine

it will all convert to the same signal

you will never get 240 frames out of that tv, as hdmi is capped at 120
Quote:


> Originally Posted by *Syceo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cz1g*
> 
> Seems like that's the only coolmax PSU listed on that site. Looking more into it, I found out that the ZU-1000b does not have 2 8pin connectors. So that is out of the question.
> 
> Another edit* But it has 2x 6+2 pins. So I should be fine.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cz1g*
> 
> What about http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8669275&Sku= for the 295x2?
> 
> Click to expand...
> 
> you are going to need a PSU with a single +12v rail something along the lines of:
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4166605&CatId=2535
> 
> the cheapest is
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=332363&CatId=5436 but i truley would not recommend that psu given the card your trying to power. Unfortunately your going to have to increase your budget and get a quality PSU ( thats what i and many here would probably suggest)
> 
> something along the lines of this is sufficient http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8139567&CatId=2535
> 
> but if you really want to get the job done without any probs id go 80 plus gold on something like this http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4166605&CatId=2535
> 
> I have the 1200w version and its a solid PSU
Click to expand...

although from what i can see it is not a bad psu. please dont recommend any psu because of its 80+ rating; an 80+ rating means _*absolutely*_ nothing



that EVGA 1kw is more than enough. i run 2 off of 2 ( ONE PER PSU ) of the superflower ( the oem ) equivalent
Quote:


> Originally Posted by *SLADEizGOD*
> 
> Question. I'm thinking of picking up the XFX 295x2 for $679 US. But I saw the XFX 290x for $329. should I just pick up 2 or just the bigger card. Im just running a qnix 1440p screen but I do want to run everything maxed out. Need some help on this one. just for gaming. And would a 1050 power supply take it.


personally i think you should go for the 2x 290Xs

single-gpu cards are nice for certain things; if you can go single gpus, i have never seen a time where they do not perform better
Quote:


> Originally Posted by *cz1g*
> 
> So I was looking at this http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4668507 for my 295x2. I prefer corsair, plus I can get that model for 109.99. Should be fine right? Or the second option I can do is EVGA Supernova http://www.ncixus.com/products/?usaffiliateid=1000031504&sku=94404&vpn=220-G2-0850-XR&manufacture=eVGA&promoid=1029 ?


they are nice but way, way overpriced. go for that evga 1kw g2


----------



## Scorpion49

Okay, now I have a different problem. Sometimes when I turn my monitor off with this 295x2, it won't get signal back when I turn it back on (pc still running). It's using the DVI-D port on a P2414H because I don't have the miniDP cable, but I haven't had this issue with any other card. It has happened a few times and I have to unplug the DVI cable and plug it back in to get the signal to come back.


----------



## m00ter

Picked up an MSI 295x2 on black Friday for £470 which seems to be a good price! Should be with me Monday.

Currently running 2 x 280x in crossfire so will be interesting to compare.


----------



## doctakedooty

Anyone got any idea why I am occasionally getting a hard pc reset with my 295x2 running at 4k res? When I was running my 1440p monitor I never had hard resets. Could it be because it's maxing out the vram, or just a defective gpu? Not sure, any ideas? The psu is a Seasonic 1250w. During games, with a Kill A Watt, my 4790k and gpu are pulling max 810w from the wall under load.


----------



## Mega Man

Quote:


> Originally Posted by *Scorpion49*
> 
> Okay, now I have a different problem. Sometimes when I turn my monitor off with this 295x2, it won't get signal back when I turn it back on (pc still running). Its using the DVI-D port on a P2414H because I don't have the miniDP cable, but I haven't had this issue with any other card. It has happened a few times and I have to unplug the DVI cable and plug it back in to get signal to come back.


under power options, change "turn off the display" to never

i only have problems when stressing, but i manually shut off my monitors out of habit as i spend too much on them anyway haha


----------



## Scorpion49

Quote:


> Originally Posted by *Mega Man*
> 
> under power options, change "turn off the display" to never
> 
> i only have problems when stressing, but i manually shut off my monitors out of habit as i spend too much on them anyway haha


I've never had monitor turn-off enabled. Ever. That option drives me nuts. I have ULPS disabled as well, but it still does it. Probably 1 out of 3 times, if I shut the screen off it just stays black when I turn it back on; it doesn't do it with any other GPU I have, including my R9 270.


----------



## Mega Man

weird


----------



## xer0h0ur

Quote:


> Originally Posted by *Mirdain*
> 
> Your temps are quite low from just the two radiators imo.
> 
> I would think you should be around 360mm so another 240 would set you plenty especially externally mounted.
> 
> Did you stress test the cpu/gpu same time and see what your system can support?
> 
> If that temp is stressed for a long period I would be happy with 60-70's.......... there is a point of diminishing returns LOL.


Under consistent gaming loads it completely depends on how much the game utilizes the GPUs.

For instance, last night I found that if I manually created an application profile for AC Unity and then set CrossFireX mode to AFR-friendly, I got fantastic, near-consistent full utilization from both GPUs on the 295X2. This boosted my framerate from a seemingly choppy 30-50 FPS to a buttery-smooth, near-consistent 60 FPS. This also made my temps shoot up, though, and one GPU was throttling since it was reaching 74C within 15 minutes of play. Mind you, I am playing at 4K with everything maxed out, except I am not using AA or the Nvidia GameWorks soft shadows.

CPU remains in the high 60's under load with occasional spikes into the low 70's on random cores.
Quote:


> Originally Posted by *Scorpion49*
> 
> Okay, now I have a different problem. Sometimes when I turn my monitor off with this 295x2, it won't get signal back when I turn it back on (pc still running). Its using the DVI-D port on a P2414H because I don't have the miniDP cable, but I haven't had this issue with any other card. It has happened a few times and I have to unplug the DVI cable and plug it back in to get signal to come back.


From what I can tell this is a driver issue that some people claim has been a problem with all of the 14.X drivers. I have no idea, since I only came to the game when the 295X2 launched, so my first driver was 14.4 and it also gave me that problem. I did the same thing Mega Man suggested. It's no big deal for me to manually power off my monitor when I want to.


----------



## xer0h0ur

Quote:


> Originally Posted by *Scorpion49*
> 
> I've never had monitor turn off enabled. Ever. That option drives me nuts. I have ULPS disabled as well, but it still does it. Probably 1 out of 3 times if I shut the screen off it just stays black when I turn it back on, doesn't do it with any other GPU I have including my R9 270.


Oh, nevermind then. That isn't what most people experience. Sounds like a different issue entirely.


----------



## electro2u

Quote:


> Originally Posted by *doctakedooty*
> 
> Anyone got any idea I am occasionally getting a hard pc reset with my 295x2 running 4k res. When I was running my 1440p monitor I never had hard resets. Could it be because it's maxing out the Vram or just a defective gpu not sure any ideas. The psu is a seasonic 1250 w. During games with kill a watt my 4790k and gpu are pulling max 810 w from the wall under load.


Heya Doc! Say man, I have the same psu and I get shutdowns when I push the power % in my Afterburner too high. Is this happening at stock voltage and power?


----------



## doctakedooty

Quote:


> Originally Posted by *electro2u*
> 
> Heya Doc! Say man, I have the same psu and I get shutdowns when I push the power % in my Afterburner too high. Is this happening at stock voltage and power?


Yeah, stock power and everything; nothing has been increased, not voltage or clocks. Everything is bone stock.


----------



## Mirdain

Quote:


> Originally Posted by *xer0h0ur*
> 
> Under consistent gaming loads it completely depends on how much the game utilizes the GPUs.
> .


Oh - I was referring to a program like AIDA64 that stresses them to 100% constantly.

I ran this 3DMark test to see how the 295x2 did and looks like it did pretty good.



Also got that magical 99% with a bit of OC


----------



## xer0h0ur

None of the FireStrike tests will ever cause throttling on my GPUs. They never get hot enough. The only way I would be able to get them to throttle would be running FireStrike combined test looping over and over again. Like I said, it needs prolonged and high gpu usage to get them to 74C and throttling. The last time I benched these were my scores:

295X2 alone: 17619

295X2 + 290X: 22153

It's been a while since I benchmarked only the 295X2 by itself to see how much I could get out of the latest drivers, and I have since raised my CPU overclock another 100MHz. I never had much success overclocking past 1075MHz on the GPUs, since I didn't know back then that AfterBurner doesn't stick the power limit and mV increases. Either way it's for show, since I can't sustain those overclocks unless I expand my loop like I mentioned before. My 24/7 OC that most games can handle without throttling is a mere 1050MHz GPU and 1500MHz vRAM.

FireStrike Ultra would be the closest thing to getting my GPU's nice and toasty: 7180


----------



## MapRef41N93W

Quote:


> Originally Posted by *doctakedooty*
> 
> Anyone got any idea I am occasionally getting a hard pc reset with my 295x2 running 4k res. When I was running my 1440p monitor I never had hard resets. Could it be because it's maxing out the Vram or just a defective gpu not sure any ideas. The psu is a seasonic 1250 w. During games with kill a watt my 4790k and gpu are pulling max 810 w from the wall under load.


Wait, are you sure? That doesn't sound right. You shouldn't be seeing more than 700-ish watts at stock with a 295x2 + 4790k (unless your 4790k has a massive OC).


----------



## doctakedooty

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Wait are you sure? That doesn't sound right. You shouldn't be seeing more than 700ish watts at stock with a 295x2+ 4790k (unless your 4790k has a massive OC).


4790k is at 4.8 ghz at 1.255 volts and ram at 1.5 v


----------



## MapRef41N93W

Quote:


> Originally Posted by *doctakedooty*
> 
> 4790k is at 4.8 ghz at 1.255 volts and ram at 1.5 v


Sounds like your Kill-A-Watt is malfunctioning. 295x2 should draw 500-520 watts stock and 4790k at 1.25 volts should draw 120-130. 50 watts left for everything else puts you at about 700.
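That component budget sums roughly like this; the per-component wattages are the estimates quoted in the post, not measurements:

```python
# Stock DC power budget estimate for the system in question,
# using the figures from the post above (estimates, not measurements).
COMPONENTS_W = {
    "R9 295X2 (stock)": 510,      # quoted 500-520 W range, midpoint
    "i7-4790K @ 1.25 V": 125,     # quoted 120-130 W range, midpoint
    "rest of system": 50,         # board, drives, fans
}

total = sum(COMPONENTS_W.values())
print(f"estimated DC load: {total} W")
```

That lands around 685 W, consistent with the "about 700" figure above.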


----------



## electro2u

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Sounds like your Kill-A-Watt is malfunctioning. 295x2 should draw 500-520 watts stock and 4790k at 1.25 volts should draw 120-130. 50 watts left for everything else puts you at about 700.


The Kill-A-Watt is measuring power draw from the wall, after taking into account the efficiency of the Gold PSU... it's probably correct. Just sayin'


----------



## MapRef41N93W

Quote:


> Originally Posted by *electro2u*
> 
> The Kill-A-Watt is measuring power draw from the wall, after taking into account the efficiency of the Gold PSU... it's probably correct. Just sayin'


Efficiency loss on a Gold unit is 8-10%, so you'd still be looking at 760-770 watts of DC load.

I bring this up because it's a bit concerning. Everything I've seen points to a 760 watt PSU being fine for a stock 295x2 + a normal Haswell i7 OC. That's what I will be using, and I don't want to have to switch to my V1000, as I spent $100 extra on braided cables with this build.


----------



## xer0h0ur

I wouldn't even bother using anything under 1000W, honestly. Peak power draw has been measured at 507W with literally zero overclock on it. Bone stock. The lowest recommended PSU I have seen, assuming no overclock on this card, has been an 850W unit, but personally I wouldn't be comfortable with that.


----------



## electro2u

Quote:


> Originally Posted by *doctakedooty*
> 
> 4790k is at 4.8 ghz at 1.255 volts and ram at 1.5 v


I've been trying to decide if I think it's the PSU, the GPU, or the motherboard. I think the GPU would be easiest to RMA?
I'm suspicious of my PSU. I think it would go up in smoke if I put a 1250W load on it. The firesale prices they've put on the X-1250 Golds didn't help.

@MapRef I think Gold efficiency is technically 87% at full load (and 90% at half load). That puts his load at about 705W. I think using a 750W PSU *should* be fine.
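The wall-to-DC conversion being debated here works out as follows. The 87% figure is the 80 Plus Gold full-load efficiency spec; the readings are doctakedooty's numbers from earlier in the thread:

```python
# Convert a Kill-A-Watt wall reading into the estimated DC load the
# components actually put on the PSU. An 80 Plus Gold unit is ~87%
# efficient at full load, so DC load ~= wall draw * efficiency.
def dc_load(wall_watts: float, efficiency: float = 0.87) -> float:
    return wall_watts * efficiency

peak = dc_load(810)   # peak wall reading reported above
avg = dc_load(750)    # typical wall reading reported above
print(f"peak DC load ~{peak:.0f} W, average ~{avg:.0f} W")
```

So an 810 W wall spike corresponds to roughly 705 W of DC load, which is why a quality 750 W unit sits right on the edge for this setup.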


----------



## xer0h0ur

Whoa der, I don't know how accurate they are in this statement, but they say the 295X2 has a peak power draw of 833W. http://www.xbitlabs.com/articles/graphics/display/amd-radeon-r9-295x2_9.html

Edit: Misread that; they were talking about the whole system's power consumption.


----------



## MapRef41N93W

Edit: That's with an i7 4960x at 4.2GHz so figure 150-170 watts for that.


----------



## pompss

anyone have two 295x2s in crossfire for 4k gaming?
Would like to know the average fps, since I'm really tempted to buy two 295x2s at $649.99 each instead of two gtx 980s for the same price (got two Strix).
I want a min of 60 fps in any game at max settings (no max antialiasing).


----------



## xer0h0ur

Dear lord. With as much as you complained about a single 295X2 I don't even want to see what its going to look like when you run into the problems of having two of them quad-fired.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dear lord. With as much as you complained about a single 295X2 I don't even want to see what its going to look like when you run into the problems of having two of them quad-fired.


Ahahaha








I've grown to love Pompss. Look at that build he did. Much respect


----------



## xer0h0ur

Oh no doubt, the build is sweet. I just think he should look into the problems people have experienced while having dual 295X2s before he takes that leap.


----------



## pompss

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dear lord. With as much as you complained about a single 295X2 I don't even want to see what its going to look like when you run into the problems of having two of them quad-fired.


Thanks for the help, like always


----------



## doctakedooty

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Leakage on a gold is 8-10% so you'd still be looking at 760-770 watts.
> 
> I bring this up because it's a bit concerning. Everything I seen points to a 760 watt PSU being fine for a stock 295x2 + normally Haswell i7 OC. That's what I will be using and I don't want to have to switch to my V1000 as I spent $100 extra for braided cables with this build.


Like it was said before, 810 watts was the draw from the wall, so not the actual draw of the components after calculating the efficiency. The max draw I stated was the highest I saw it spike to; the average was around 750w from the wall.
Quote:


> Originally Posted by *electro2u*
> 
> I've been trying to decide if I think it's the PSU, the GPU, or the motherboard. I think the GPU would be easiest to RMA?
> I'm suspicious of my PSU. I think it would go up in smoke if I put a 1250W load on it. The firesale prices they've put on the X-1250 Golds didn't help.
> 
> @MapRef I think Gold efficiency is technically 87% at full load. and 87% at half load. That puts his load at 705W. I think using a 750W PSU *should* be fine.


I tried leaving everything stock again and just increasing the power target to 50%, without increasing voltage or clocks. I played multiple maps for about 2 hours earlier with no hard reboot like I have been having. I will test some more tonight, hopefully, and see if that was the issue. I know that at 4k the game is pushing a 295x2 to the max and really stretching its legs, so maybe the power target was being exceeded and caused the graphics card or psu to OVP itself.


----------



## MapRef41N93W

Quote:


> Originally Posted by *doctakedooty*
> 
> Like it was said before 810 watts was from the wall draw so not including what the actual draw was on the components after calculating the efficency. The max draw I was stating was the highest I saw it spike to average was around 750 w from the wall.
> I tried leaving everything stock again and just increasing power target to 50% without increasing voltage or clocks I played multiple maps for about 2 hours earlier with no hard reboot like I have been having I will test some more hopefully tonight see if that was the issue that the card in 4k I know is pushing a 295x2 to the max and really stretching it's legs so maybe power target was being exceeded and caused the graphics card to psu to ovp itself.


Ok great, sounds like 760 watts should be OK then.


----------



## kalijaga

Pompss, nice rig. I have just made a new rig based on the baby Haswell-E, and a Stacker 935 + 915r. Firestrike score increased from my usual 20k to about 25k with temperature to spare. Definitely happy with the scaling based on the CPU performance, and running games at 4k with no problem. Throttling is still there, as the 120 rad on the 295x2 is really not up to the task for long-term heavy load. If only I had the courage to mod the 295x2 to a thick 360 rad per card, I think it still has some more to go.


----------



## chudlin

Quote:


> Originally Posted by *pompss*
> 
> anyone have two 295x2 in crossfire for 4k gaming?
> Would like to know the average fps since i really tempted to buy two 295x2 at 649.99 each instead two gtx 980 for the same price (got two strix)
> I want min 60 fps with any games at max settings (no max antialiasing)


On Metro Last Light I get between 90 and 120 fps on full settings, 0.25 SSAA. Check my YouTube vids on Metro Last Light. Search Chris Hudlin.


----------



## doctakedooty

So, update: sadly I didn't get my issue resolved. I am going to try new pcie wires, then a new set of ram which I haven't wanted to open (a Corsair Dominator Platinum 4 x 4gb kit, though I need a 2 x 8gb kit), and after that I will rma the card. It could be the psu, but an rma to Seasonic seems like a pita. If it might be the psu, I will get it to my buddy's shop and try his 1000w on my system; if that fixes it, I will probably go Corsair for the next psu, or EVGA, as their new G2 line is nice.


----------



## xer0h0ur

Ugh, I think my water pump is nearing death. It's making low-pitched whining noises and my temperatures take longer than before to normalize once loads end. I am thinking my flow rate has gone down.


----------



## MapRef41N93W

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ugh, I think my water pump is nearing death. Its making low pitch whining noises and my temperatures take longer than before to normalize once loads end. I am thinking my flow rate has gone down.


That's Asetek for you.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Ugh, I think my water pump is nearing death. Its making low pitch whining noises and my temperatures take longer than before to normalize once loads end. I am thinking my flow rate has gone down.
> 
> 
> 
> That's Asetek for you.
Click to expand...

He's running a custom loop.....


----------



## chudlin

Quote:


> Originally Posted by *doctakedooty*
> 
> So update sadly I didn't get my issue resolved. I am going to try new pcie wires, a new set of ram which I haven't wanted to open that are corsair dominator platinum 4 x 4 gb kit because I need some 2 x 8 gb kit and after that I will rma the card. It could be the psu but rma to seasonic seems like a pita if it is a psu I will get it to my buddy's shop try his 1000 w on mine and if it is probably go corsair for the next psu or evga as there new g2 line is nice.


For power to the graphics card, are you using an 8-pin splitter cable? That's the problem I had when using a splitter cable. You have to use two separate cables for power to the card. Also, it may be your psu. What's the amperage output on the 12v rail? You need a minimum of 30 amps. If your psu can't handle that, well, it's just gonna shut down.

I've heard also that Seasonic isn't all that either. I use a Corsair AX1500i for my cards and I'm using a 4k monitor with an oc without any issues.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ugh, I think my water pump is nearing death. Its making low pitch whining noises and my temperatures take longer than before to normalize once loads end. I am thinking my flow rate has gone down.


I've been wondering how things were going in your loop... It's bothered me that you left that alu rad in. Can you see any of the coolant? I can't remember your tubing type. I've torn down my loop several times over the past few months. Most recently I discovered a bunch of white powdery junk that was coming from one of my Alphacool radiator's ports. One of the first times I took it apart I discovered my Raystorm block had a bunch of black junk stuck in the water channels. Still trying to learn how to make a long-term loop that won't get clogged or turn nasty.


----------



## chudlin

Don't use coloured coolant; use distilled water. Or I'd look at researching oil cooling in a custom loop.


----------



## Mega Man

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Ugh, I think my water pump is nearing death. Its making low pitch whining noises and my temperatures take longer than before to normalize once loads end. I am thinking my flow rate has gone down.
> 
> 
> 
> That's Asetek for you.
Click to expand...

yea well you will never have the best if all you can do is patent troll, have you seen some of their patents? they think they have invented pc cooling
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Ugh, I think my water pump is nearing death. Its making low pitch whining noises and my temperatures take longer than before to normalize once loads end. I am thinking my flow rate has gone down.
> 
> 
> 
> That's Asetek for you.
> 
> Click to expand...
> 
> He's running a custom loop.....
Click to expand...

still he is right about trolltek

but that TT AIO "openloop" is not the greatest pump. with that in mind you may want to clean your blocks and see if the pins are clogged. also gpus are one of the 2 most restrictive blocks ( cpu/gpu ), and with that said i dont know if the TT pump can take both cpu and gpu (* please note i am not saying it can not, i am saying i dont know if it can )
Quote:


> Originally Posted by *chudlin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *doctakedooty*
> 
> So update sadly I didn't get my issue resolved. I am going to try new pcie wires, a new set of ram which I haven't wanted to open that are corsair dominator platinum 4 x 4 gb kit because I need some 2 x 8 gb kit and after that I will rma the card. It could be the psu but rma to seasonic seems like a pita if it is a psu I will get it to my buddy's shop try his 1000 w on mine and if it is probably go corsair for the next psu or evga as there new g2 line is nice.
> 
> 
> 
> For power to the graphics card, are you using an 8 pin splitter cable? That the problem I had when using a splitter cable. You Have to use two separate cables for power to the card. Also it may be your psu. What's the amperage output on the 12v rail? You need a minimum of 30amps. Of your psu can't handle that well it's just gonna shut down.
> 
> I've heard also that sessonic isn't all that either. I use a corsair ax 1500I for my cards and I'm using a 4k monitor with an oc without any issues
Click to expand...

what is up with people trying to make these "great" recommendations / generalizations

corsair psus are generally overpriced for what you get and there are several to avoid.

seasonic has good and bad PSUs

like all OEMs


----------



## doctakedooty

Quote:


> Originally Posted by *chudlin*
> 
> For power to the graphics card, are you using an 8 pin splitter cable? That the problem I had when using a splitter cable. You Have to use two separate cables for power to the card. Also it may be your psu. What's the amperage output on the 12v rail? You need a minimum of 30amps. Of your psu can't handle that well it's just gonna shut down.
> 
> I've heard also that sessonic isn't all that either. I use a corsair ax 1500I for my cards and I'm using a 4k monitor with an oc without any issues


Not using a splitter cable and my 12v rail is a single rail with around 120 amps.


----------



## pompss

Quote:


> Originally Posted by *chudlin*
> 
> On metro last light I get between 90 and 120 fps on full settings 0.25 ssa. Check my YouTube vids on metro last light. Search Chris hudlin


Did you test it with some news games like assassins creed unity or dragon age , far cry 4 ??

I really appreciate your time thanks

did you experience some issues like stuttering or any kind of issues ??

what psu did you use for quad fire??


----------



## doctakedooty

Quote:


> Originally Posted by *Mega Man*
> 
> yea well you will never have the best if all you can do is patent troll, have you seen some of their patents? they think they have invented pc cooling
> still he is right about trolltek
> 
> but that TT AIO "openloop" is not the greatest pump, with that in mind you may want to clean your blocks see if the pins are clogged also gpus are one of the 2 most restrictive blocks ( cpu/gpu ) and with that said i dont know if the TT pump can take both cpu and gpu (* please note i am not saying it can not, i am saying i dont know if it can )
> what is up with people trying to make these "great" recommendations / generalizations
> 
> corsair psus are generally overpriced for what you get and there are several to avoid.
> 
> seasonic has good and bad PSUs
> 
> like all OEMS


I have to agree. Corsair does not make their own psus; like most companies, they purchase them from another psu company and put their brand on them with their warranty. Seasonic used to be the company that produced Corsair's psus. With that, it's usually a lot cheaper to buy directly from the company who makes it than to buy it rebranded.

PROBLEM SOLVED FOR THE SEASONIC X-1250 PSU SHUTTING DOWN WITH 295x2

Anyways, last night I was searching the web for solutions to my problem. Although the tag on the psu says it's a single-rail unit, Jonnyguru reviewed the XFX 1250w, which is the same psu I have, just rebranded as XFX. Even though it is labeled single rail, there are multiple rails, and the pcie power is split across 30 amp and 45 amp rails. So I plugged one 8-pin into one of the rails and the other 8-pin into another rail, and lo and behold, no more shutdowns. I was always getting a shutdown in Valley, and now I was able to run Valley 5 times in a row with no shutdown. Hopefully this helps anyone with a Seasonic X-1250 power supply. Below is the review that shows it has multiple rails instead of one single rail.
http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story3&reid=273
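The per-rail arithmetic behind that fix can be sketched as follows. The 30 A / 45 A ratings come from the linked review; the ~250 W per 8-pin connector is an assumption, splitting the card's ~500 W stock draw evenly across its two connectors:

```python
# Why splitting the two 8-pin leads across rails helps: each 12 V rail
# only supplies so many amps, and the whole card on one rail can trip
# its over-current protection.
RAIL_AMPS = {"pcie_rail_1": 30, "pcie_rail_2": 45}   # per the linked review
CONNECTOR_DRAW_W = 250   # assumed: ~500 W card split over two 8-pins

for rail, amps in RAIL_AMPS.items():
    headroom = amps * 12 - CONNECTOR_DRAW_W
    print(f"{rail}: {amps * 12} W available, ~{headroom} W headroom with one 8-pin")

# Both connectors on the 30 A rail alone would ask ~500 W of a 360 W rail.
print("one-rail load:", 2 * CONNECTOR_DRAW_W, "W vs", 30 * 12, "W limit")
```

Under these assumptions, both connectors on the 30 A rail exceed its 360 W capacity, while one connector per rail leaves comfortable headroom on each.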


----------



## Scorpion49

Well, I figured out my monitor-not-coming-on problem: the DVI cable I was using was very old. I switched to a new one I had in a box and it seems to have stopped. I turned the monitor on and off about 30 times and it worked each time.


----------



## chudlin

Quote:


> Originally Posted by *pompss*
> 
> Did you test it with some news games like assassins creed unity or dragon age , far cry 4 ??
> 
> I really appreciate your time thanks
> 
> did you experience some issues like stuttering or any kind of issues ??
> 
> what psu did you use for quad fire??


I use an AX1500i from Corsair. As for the games you mentioned, I haven't got them, lol. I have Watch Dogs, which runs at around 100fps, and CoD: Advanced Warfare, and that's around 100 too.

At first I really struggled to get Metro to run right; it was really bad. But I fixed it by deleting the cards in Device Manager and reinstalling the drivers.


----------



## chudlin

I'm having issues overclocking my R9 295X2 quadfire setup. I can OC both cards to 1100MHz core and 1500MHz memory, but I get artifacts in 3DMark. I have also noticed that 3DMark only uses GPUs 1 and 4, or 3 and 4, or 2 and 3; it never uses all four GPUs.


----------



## chudlin

Quote:


> Originally Posted by *Mega Man*
> 
> yea well you will never have the best if all you can do is patent troll, have you seen some of their patents? they think they have invented pc cooling
> still he is right about trolltek
> 
> but that TT AIO "openloop" is not the greatest pump, with that in mind you may want to clean your blocks see if the pins are clogged also gpus are one of the 2 most restrictive blocks ( cpu/gpu ) and with that said i dont know if the TT pump can take both cpu and gpu (* please note i am not saying it can not, i am saying i dont know if it can )
> what is up with people trying to make these "great" recommendations / generalizations
> 
> corsair psus are generally overpriced for what you get and there are several to avoid.
> 
> seasonic has good and bad PSUs
> 
> like all OEMS


Slow down, Mary. I'm going by what I KNOW, not what I'm guessing. I've fried 850W PSUs before just from playing games, let alone benchmarking. Of course you can put together parts from different makers that do just fine; I stated what I was using, it wasn't an advert.


----------



## electro2u

Quote:


> Originally Posted by *doctakedooty*
> 
> I have to agree. Corsair does not make there own psu just like most companys they purchase them from another psu company and put there brand on it with there warranty. Seasonic used to be Corsairs psu company who produced there psu. With that its usually a lot cheaper to buy it directly from the company who makes it then buy it rebranded.
> 
> PROBLEM SOLVED FOR THE SEASONIC X-1250 PSU SHUTTING DOWN WITH 295x2
> 
> Anyways last night I was searching the web on solutions to my problem and although the tag on the psu says its a single rail psu Johnnyguru did the XFX 1250w which is the same psu I have just not rebranded XFX but even though it was labeled a single rail there was multiple rails and the pcie power is done in 30 amp and 45 amp rails so I had to plug one 8 pin to one of the rails and the other 8 pin to another rail and low and behold no more shut downs. I was always getting a shut down in valley and I was able to run valley 5 times in a row and no shut down. Hopefully this helps anyone with a seasonic X-1250 power supply. Below is the review that shows it has multiple rails instead of one single rail.
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story3&reid=273


I KNEW it. My shutdowns definitely had me suspicious of that PSU (same PSU here). Dang, dude, that is pretty serious. Glad you got it worked out, and thanks much for sharing your finding!


----------



## SLADEizGOD

I purchased the XFX R9 295X2 on Friday, and after reading all these posts I bought the EVGA 1600W G2 PSU, just in case. Not taking any chances.


----------



## Orivaa

Quote:


> Originally Posted by *SLADEizGOD*
> 
> I purchased the XFX R9 295x2 on friday. And after reading all these post. Just in case. I bought the EVGA 1600w G2 PSU. Not taking any chances.


Wow. You could safely quadfire the 295x2 with that PSU.


----------



## SLADEizGOD

Quote:


> Originally Posted by *Orivaa*
> 
> Wow. You could safely quadfire the 295x2 with that PSU.


On JayzTwoCents they said it could power a city... lol


----------



## Mega Man

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I figured out my monitor not coming on problem, the DVI cable I was using was very old, I switched to a new one I had in a box and it seems to have stopped. I turned the monitor on and off about 30 times and it worked each time.


awesome glad to hear!

I also found out one of my problems. It was not anything to do with my video card. I usually keep a 1GB page file on my SSD, and I kept getting forced out of certain games (the newest being Shadow of Mordor). So far I am up to a 26GB page file and still sometimes get shutdowns (of the program, not the PC); turns out it is using all 16GB of RAM and all 26GB of page file....

Seriously? WTH is up with WB?


----------



## Scorpion49

Quote:


> Originally Posted by *Mega Man*
> 
> awesome glad to hear!
> 
> I also found out one of my peeps. . It was not anything to do with my video card. I usually keep 1 gb page file on my sd.... I kept getting forced out of certain games ( newest being shadow of Moridor ) so far I am up to 26 gb and still some times get shutdowns ( program not pc) found it is using all 16GB of ram and all 26 gb page file ....
> 
> Seriously? WTH is up with wb.


Yeah, I noticed I had to up my page file to 25GB for Dragon Age; it was set at 10GB, and it was filling up and then giving me an out-of-memory error. It's using 7GB out of 8 on the system, plus a 19GB page file, plus 3800MB of VRAM... holy crap.
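The out-of-memory behavior described above is about the Windows commit limit rather than physical RAM alone; roughly, commit limit ≈ physical RAM + total page file size. A quick sketch with the numbers from this post (treating the ~26GB in use as the game's commit charge, which is my simplification):

```python
# Windows refuses allocations once total committed memory would exceed
# the commit limit, which is roughly physical RAM + total page file size.
def commit_limit_gb(ram_gb: float, pagefile_gb: float) -> float:
    return ram_gb + pagefile_gb

game_commit_gb = 7 + 19  # ~26 GB in use per the post (RAM use + page file use)

# 10 GB page file: 8 + 10 = 18 GB limit, below ~26 GB -> out-of-memory errors.
print(commit_limit_gb(8, 10) >= game_commit_gb)  # False
# 25 GB page file: 8 + 25 = 33 GB limit -> the game runs.
print(commit_limit_gb(8, 25) >= game_commit_gb)  # True
```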


----------



## Mega Man

I think from now on I will only build PCs with a minimum of 32GB of 2400MHz RAM.

Also thanks makes me feel good I thought I was going crazy


----------



## axiumone

So, I've recently switched from a 295x2 crossfire to a 4 way sli gtx 980. I haven't had nvidia cards in a while, and they just recently started supporting 5x1 display configurations, so it was the right time to try team green.

For anyone running 295x2 in crossfire, or quadcrossfire in general thinking that the grass may be greener so to speak, it isn't. Both of the companies have some serious work to do with their 4 gpu configurations.

I really don't want to make a lengthy post going into all the details. However, if anyone is considering a similar move and I can be of any help, don't hesitate to PM me.


----------



## joeh4384

Quote:


> Originally Posted by *axiumone*
> 
> So, I've recently switched from a 295x2 crossfire to a 4 way sli gtx 980. I haven't had nvidia cards in a while, and they just recently started supporting 5x1 display configurations, so it was the right time to try team green.
> 
> For anyone running 295x2 in crossfire, or quadcrossfire in general thinking that the grass may be greener so to speak, it isn't. Both of the companies have some serious work to do with their 4 gpu configurations.
> 
> I really don't want to make a lengthy post going into all the details. However, if anyone is considering a similar move and I can be of any help, don't hesitate to PM me.


Do you think it is more the game developers or AMD/Nvidia?


----------



## axiumone

Quote:


> Originally Posted by *joeh4384*
> 
> Do you think it is more the game developers or AMD/Nvidia?


Very much both. Some game engines have memory leaks or game-breaking bugs with more than two GPUs, and driver development lags behind for the very high-end configurations.

Funny thing is both camps are awesome with 2 gpu configurations.


----------



## Mega Man

More people use 2-way than 3/4-way. Also, last I heard, Nvidia does not support 4-way anymore?


----------



## axiumone

Quote:


> Originally Posted by *Mega Man*
> 
> More people use 2 way vs 3/4 way. Also last I heard nvidia does not support 4 way any more?


You're absolutely right. Way more people use 2 way and most of the time it's with a single display. Then, why even allow 4 way configurations? If the support is so limited, it will just frustrate your consumers.

Also, the nvidia not supporting 4 way is only partially true. You have to have the top end gpu in the line up in order to get 4 way. For instance, gtx 980 supports 4 way, but gtx 970 is limited to 3 way maximum.


----------



## Mega Man

Ah.

They do support it because they can.

Personally I have very few problems with quadfire. (See above example) most of mine are not even related to drivers.

Another problem I had was that Nvidia/WB forced Shadow of Mordor to 16:9 max, so I had to change that in a text config file... I mean, really? Why would you do that?


----------



## eqc6

Hey all, first timer here so please take it easy on me









Have a couple of questions about the 295x2. After debating for days whether to get 295x2 or sli'd 980s, I finally pulled the trigger on the 295x2.

First here are my current specs:

i7 2600k
Asrock p67 gen3 extreme4
Corsair vengeance 1600 16gb
Corsair 105 cpu cooler
Crossfire'd 7970s reference cards
30" apple cinema display
3x 250gb ssds and 1x 500gb 7200 hd
Corsair 850w psu

1. Is my corsair 850w psu good enough to run the 295x2? Here's the exact psu I have http://www.amazon.com/gp/product/B005E98EI2/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
2. I currently play Dragon Age: Inquisition at 2560x1600, all ultra settings except for AA being disabled. Game runs at 55-60fps (vsync on) and will dip around 45-50 on big battles. Now, I plan on upgrading to 4k display. Can I expect similar game performance with the 295x2 at 4k?

I'd appreciate any help or feedback/suggestions.

Thanks!


----------



## Mega Man

1 Should be ok but it may be close.

2 no idea


----------



## xer0h0ur

Yeah, I am more than likely going to use a Koolance external mounting bracket, a Koolance PCI pass-through, and a to-be-determined dual 5.25" bay pump/reservoir combo to replace that Thermaltake pump/res/rad combo. I am probably going to use an Alphacool NexXxos Monsta 240 rad back there, if not a 360.

It's not the blocks that are the problem; the CPU block was upgraded to a Koolance CPU-380I with a Coollaboratory Liquid MetalPad, and the EKWB block is still fine with no fin obstructions. I took advantage of having the loop apart to drain it and replace the clear EK Koolant, since I was already somewhere in the neighborhood of six months on that coolant.


----------



## MapRef41N93W

Ran through a bunch of testing with my 295x2 on my AX760 and didn't get any shutdowns. Highest I saw pulling from the wall was 720 watts. Running a pair of CM Jetflows in P/P and my temps were maxing at 68c at max fan with no added voltage (1040/1400 was what I was running at). At medium fan speed I was seeing 73/74c. Do these temps sound right? By max/medium I mean the speeds on the Node 804 fan controller.

Biggest difference though is my CPU temps have plummeted. With my GTX 970 in the same chamber as my CPU rad I was regularly seeing in the 60s during Valley benching. Now with the 295x2 rad in the other chamber I was hardly ever seeing 50c.


----------



## joeh4384

Those temps sound right. I get 68-70 with corsair SP120 performance editions at 1030. I do not have any special fan config, one runs from the card and one off the board at a standard profile.


----------



## doctakedooty

Quote:


> Originally Posted by *eqc6*
> 
> Hey all, first timer here so please take it easy on me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have a couple of questions about the 295x2. After debating for days whether to get 295x2 or sli'd 980s, I finally pulled the trigger on the 295x2.
> 
> First here are my current specs:
> 
> i7 2600k
> Asrock p67 gen3 extreme4
> Corsair vengeance 1600 16gb
> Corsair 105 cpu cooler
> Crossfire'd 7970s reference cards
> 30" apple cinema display
> 3x 250gb ssds and 1x 500gb 7200 hd
> Corsair 850w psu
> 
> 1. Is my corsair 850w psu good enough to run the 295x2? Here's the exact psu I have http://www.amazon.com/gp/product/B005E98EI2/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
> 2. I currently play Dragon Age: Inquisition at 2560x1600, all ultra settings except for AA being disabled. Game runs at 55-60fps (vsync on) and will dip around 45-50 on big battles. Now, I plan on upgrading to 4k display. Can I expect similar game performance with the 295x2 at 4k?
> 
> I'd appreciate any help or feedback/suggestions.
> 
> Thanks!


First, the PSU: it should be able to handle it, but remember that 4K is going to push the cards to the max. If for some reason the PSU doesn't hold up, I would look into a 1200W unit. I see draws of up to 810W from the wall with no overclock on the card, with a 4790K OC'd to 4.8GHz at 1.255V. My personal preference is not to push a PSU near its maximum wattage for long periods, so a 1200W unit would sit at a little over 60% load while gaming; again, this is just my preference. Also, if I remember correctly, PSUs lose something like 10% of their capacity a year as the capacitors age.

As for performance at 4K: I don't play the games you do, but in BF4 at 4K with everything on ultra and 2x AA I still stay well above 60fps, so I am going to say you should be good and will thoroughly enjoy this GPU. The higher the resolution, the better these cards perform; 4K, 1600p, etc. is where they start to shine. At 1080p you probably wouldn't like the card as much, since you wouldn't see its full potential the way you do at higher resolutions.
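The sizing logic above works out as a quick back-of-envelope. The 810W wall draw is from the post; the ~90% efficiency figure is my assumption for a decent unit at this load, not a measured value.

```python
# Back-of-envelope PSU sizing using the numbers from the post above.
wall_draw_w = 810.0      # measured at the wall, card at stock
efficiency = 0.90        # assumed PSU efficiency at this load
psu_capacity_w = 1200.0  # suggested PSU size

# The PSU's rating is for DC output, so convert wall draw to delivered power.
dc_load_w = wall_draw_w * efficiency
load_pct = 100 * dc_load_w / psu_capacity_w

print(f"{dc_load_w:.0f} W DC load -> {load_pct:.0f}% of a 1200 W unit")
# -> 729 W DC load -> 61% of a 1200 W unit
```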


----------



## xarot

Quote:


> Originally Posted by *xarot*
> 
> Sorry if I'm asking the obvious questions, but I just ordered a Sapphire card. I already haven an ARESIII on water, but I am moving away from water cooling either completely or only use the ARESIII on water only. ARES has been working very well for me even with the ROG Swift at 120 Hz. Great card.
> 
> However, about the 'stock' 295X2:
> 
> - Is the warranty still OK if I remove those cable ties from the hoses so I could change the stock fan on radiator to something else? A bit dumb if not.
> 
> 
> 
> 
> 
> 
> 
> 
> - Is there another BIOS to raise throttling temp
> - Or is the throttling mainly because of VRM temps only and not GPUs?
> - Anything else I could do for the throttling if it happens?
> 
> I know Sleeping Dogs for example will cook any card, so it sounds like a tough one for this card...


Bump. Anyone?







Eagerly awaiting for my Sapphire card.


----------



## ocvn

Does anyone have a UEFI BIOS with EFI 1.3? For Dell monitors, to fix the black screen / screen wake-up issue, we need a UEFI BIOS with EFI 1.3. Can someone post it, please?


----------



## ColeriaX

Well, I took the plunge when CoolMike posted the deal on the XFX 295X2 and added a second card to my PowerColor 295X2. Unfortunately the XFX card is not nearly as strong an overclocker.







Oh well, here are some pics I took this morning. I had to daisy-chain another power supply (Corsair AX850), as I was getting shutdowns with both cards OC'd using just the EVGA SuperNOVA 1300! Pretty crazy. I was able to mod the back side of my 540 Air to support the other PSU (don't mind the mess back there, lol). I also cut out the bottom panel so I can fit a 240 or 360 rad when I move from CLC to open-loop watercooling, but for now I stuck some Corsair 140mm fans in to exhaust some of the heat out of the case. Budget is tight, so that's going to be a while, I guess. Hope you enjoy.


Can't tell if he was enthused or not...


----------



## SLADEizGOD

Quote:


> Originally Posted by *xarot*
> 
> Bump. Anyone?
> 
> 
> 
> 
> 
> 
> 
> Eagerly awaiting for my Sapphire card.


I'm waiting for my card too...


----------



## zilchstar

Would it be possible to flash my 295x2 with the bios of a 290x?

I know that sounds like a crazy idea but I'd love to have this card running in my Hackintosh rig for Final Cut Pro X but it's just not possible because OS X doesn't allow crossfire and there's no way to disable it on this card.

The 290x is fully supported though, so I'd love to switch to that BIOS for work and flip back to the full 295x2 goodness for gaming.

Thoughts?


----------



## joeh4384

I do not think it would work. The card essentially has two BIOSes on it: one master and one slave.


----------



## Mega Man

Quote:


> Originally Posted by *ColeriaX*
> 
> Well I took the plunge when CoolMike posted the deal on the XFX 295x2 and added a 2nd Card to my Powercolor 295X2. Unfortunately that XFX card is not nearly as strong as an overclocker
> 
> 
> 
> 
> 
> 
> 
> Oh well heres some pics I took this morning. I had to daisy chain another power supply (corsair ax850) as I was getting shutdowns with both cards OC'd using just the EVGA supernova 1300!!! Pretty crazy. I was able to mod the back side of my 540 air to support the other PSU (dont mind the mess back there lol). I also cut out the bottom panel so that I can put in a 240 or 360 rad for when I move from CL to OP watercooling, but for now i stuck some Corsair 140mm fans to exhaust some of the heat out of the case. Budget is tight so thats going to be a while I guess. Hope you enjoy.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> Can't tell if he was enthused or not...


nice welcome to quadfire !!!!
Quote:


> Originally Posted by *zilchstar*
> 
> Would it be possible to flash my 295x2 with the bios of a 290x?
> 
> I know that sounds like a crazy idea but I'd love to have this card running in my Hackintosh rig for Final Cut Pro X but it's just not possible because OS X doesn't allow crossfire and there's no way to disable it on this card.
> 
> The 290x is fully supported though, so I'd love to switch to that BIOS for work and flip back to the full 295x2 goodness for gaming.
> 
> Thoughts?


nope


----------



## tsm106

Is that quad on a P8 board with an SB chip? That's gonna be running x4 per GPU at PCIe 2.0... whoa, dude!


----------



## F4ze0ne

Quote:


> Originally Posted by *ColeriaX*
> 
> Well I took the plunge when CoolMike posted the deal on the XFX 295x2 and added a 2nd Card to my Powercolor 295X2. Unfortunately that XFX card is not nearly as strong as an overclocker
> 
> 
> 
> 
> 
> 
> 
> Oh well heres some pics I took this morning. I had to daisy chain another power supply (corsair ax850) as I was getting shutdowns with both cards OC'd using just the EVGA supernova 1300!!! Pretty crazy. I was able to mod the back side of my 540 air to support the other PSU (dont mind the mess back there lol). I also cut out the bottom panel so that I can put in a 240 or 360 rad for when I move from CL to OP watercooling, but for now i stuck some Corsair 140mm fans to exhaust some of the heat out of the case. Budget is tight so thats going to be a while I guess. Hope you enjoy.


Looks great and I like the fans on the bottom of the case.

Btw... how are you supporting the card's weight on the end to relieve stress on the PCI-E slots?

I tried putting my card in last night to check spacing in my case, but it's so heavy I'm afraid to let it just hang without some type of support on the end of the card, even with the screws tightened.


----------



## ColeriaX

Quadfire compatible nonetheless, and still serviceable! Check it out...

Firestrike


FS Ultra


Quote:


> Originally Posted by *F4ze0ne*
> 
> Looks great and I like the fans on the bottom of the case.
> 
> Btw... How are you supporting the cards weight on the end to relieve stress on the PCI-X slots?
> 
> I tried putting my card in last night to check spacing in my case. But, it's so heavy I'm afraid to let it just hang without some type of support on the end of the card even with the screws tightened.


Thanks bud. Lift the card up in the slot as much as you can before you tighten the thumbscrews. I also used the tubing and pcie power cables with zipties pulling up to help support the cards. I've had the 295x2 since launch day and no noticeable damage to my pcie slots.

Tsm, this P67 Deluxe board is definitely a few gens old, but it's still surprising me. I'm waiting for tax time to move to another platform. All cards run at x8 2.0, but I'd still hardly say I'm bandwidth-starved. I've looked at newer Z87 and Z97 platforms with Ivy and Haswell quad-core i7s, and I'm still roughly as fast or faster than those in synthetics, which shocks me. My 2600K can bench at 5.2GHz and daily-drive at 5.1GHz with an H100i. But yeah, enough rambling from me; thanks for checking out my cube (as the girly calls it, lol).


----------



## boredmug

Nice. Good to know the 290x crossfire I have planned won't perform too poorly with my 2600k until I can upgrade.


----------



## F4ze0ne

Quote:


> Originally Posted by *ColeriaX*
> 
> Thanks bud. Lift the card up in the slot as much as you can before you tighten the thumbscrews. I also used the tubing and pcie power cables with zipties pulling up to help support the cards. I've had the 295x2 since launch day and no noticeable damage to my pcie slots.


Nice thanks. I figured from the pic that you had used the power cables. I did something similar with my 7950s which also needed support.


----------



## Mega Man

@tsm106 isn't it x8/x8?


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> @tsm106 isnt it 8x/8x ?


How many gpus are on each card? The quad FS is 4K lower than the fastest 7970 btw, including the gscore.


----------



## Mega Man

Hmm, I never thought it did that. I thought (not that I am necessarily right) that the PLX chips switched so they could get the full available bandwidth. Not that I know anything about the inner workings of GPUs.


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> Hmm I never thought that it did that. I thought ( not that I am right ) the plx chips switched for then to get full avail bandwidth. Not that I know anything about the interwoven of this


Nah, the PLX just connects the GPUs together and then to the PCIe bus. There is no free lunch, no extra magic bandwidth: at the end of the day, he is squeezing FOUR GPUs through x16 at PCIe 2.0.
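A rough sketch of the ceiling being described here: PCIe 2.0 moves roughly 500 MB/s per lane per direction (that figure is from the spec); the worst-case "everyone saturates the bus at once" sharing model is my simplification.

```python
# Rough bandwidth math for four GPUs behind PLX switches on a PCIe 2.0 board.
# A PLX gives each GPU a fast link *to the switch*, but everything upstream
# still funnels through the slot's lanes.
PCIE2_PER_LANE_GBS = 0.5  # PCIe 2.0: ~500 MB/s per lane, per direction

def upstream_per_gpu(slot_lanes: int, gpus_on_card: int) -> float:
    """Worst-case upstream bandwidth per GPU (GB/s) if all GPUs hit the bus at once."""
    return slot_lanes * PCIE2_PER_LANE_GBS / gpus_on_card

# Two 295X2s in x8/x8: each card shares 4 GB/s between its two GPUs.
print(upstream_per_gpu(8, 2))   # 2.0 GB/s per GPU, worst case
# One 295X2 in a full x16 slot: 4 GB/s per GPU.
print(upstream_per_gpu(16, 2))  # 4.0
```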


----------



## Orivaa

Quote:


> Originally Posted by *Mega Man*
> 
> nope


Can't one just make a crossfire profile for Final Cut Pro like one would for a game, then disable Crossfire for it?


----------



## xer0h0ur

Since, for lack of a better term, **** got real. I was forced to buy something that would provide support for the 295X2 to stop the sagging on the PCI-E slot. I have been getting boot errors unless I wiggle the card or re-seat it so after 6 months of avoiding the issue I bought a PowerColor Power Jack from a seller on eBay. Should do the trick.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Since, for lack of a better term, **** got real. I was forced to buy something that would provide support for the 295X2 to stop the sagging on the PCI-E slot. I have been getting boot errors unless I wiggle the card or re-seat it so after 6 months of avoiding the issue I bought a PowerColor Power Jack from a seller on eBay. Should do the trick.


It's a cool product. I've been trying to come up with a home-brew design for a multi-GPU jack. I'm very tired lately, so it's going slowly.


----------



## xer0h0ur

LOL, believe it or not the only other worthy suggestion I came across was using legos. Yes. Legos.


----------



## StillClock1

I'm a bit late: but for the PSU discussion here are my 2 cents.

I got a LEPA 1600W and have no issues. That said, my rental arrangement is such that electricity is a flat fee, and I would not get such a beefy PSU if I had to think about the running costs.

Secondly, and more importantly: the cables look like crap. I wish I had gotten a Corsair PSU so I could have used braided cables.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, believe it or not the only other worthy suggestion I came across was using legos. Yes. Legos.


I've been using a stop fitting wrapped in electrical tape to anchor my 295 against the SATA ports of my motherboard. It works surprisingly well... but with the 295X2 sitting straight, the 290 below is obviously out of alignment, so I have gone looking for a "frame" design that will attach to the case itself.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> It's a cool product. I've been trying to come up with a home-brew design for a multi GPU jack design. I'm very tired lately so it's going slow.


When I had the 295X2, I used a bit of hardline tubing, cut to size and sprayed; it did the trick just fine.


----------



## electro2u

That's genius Syceo! I really wanted to do something "stealth" so I'm considering the frame idea because the GPU shelves will be almost totally invisible and the frame portion that connects the shelves to the chassis will be covered by a reservoir and cabling. Anchoring the 295 to the SATA connectors on the mobo was hidden as well.

Your solution is very very good.


----------



## Syceo

Quote:


> Originally Posted by *electro2u*
> 
> That's genius Syceo! I really wanted to do something "stealth" so I'm considering the frame idea because the GPU shelves will be almost totally invisible and the frame portion that connects the shelves to the chassis will be covered by a reservoir and cabling. Anchoring the 295 to the SATA connectors on the mobo was hidden as well.
> 
> Your solution is very very good.












Just out of curiosity, did you get around to doing the 980 build?


----------



## silencespr

Just got an XFX R9 295X2 off Newegg. Is it worth adding it to my other two R9 290Xs?
Should I run one of each, or should I just run the R9 295X2 alone?

also will Enermax Platimax 1350W be able to handle the cards?

Thank you.


----------



## silencespr

Quote:


> Originally Posted by *SLADEizGOD*
> 
> I purchased the XFX R9 295x2 on friday. And after reading all these post. Just in case. I bought the EVGA 1600w G2 PSU. Not taking any chances.


did you get it yet ? i picked mine up today waiting on shipping now.


----------



## Rohandy

The card that keeps on giving! Went back to Micro Center and got another $350 credit with the recent price drop.







That's $700 total so far! Great store; keep it coming, AMD.


----------



## joeh4384

Quote:


> Originally Posted by *silencespr*
> 
> just got XFX R9 295X2 off newegg is it worth it to add it to my other two R9 290X ?
> should i do one of each ? or should i just run the R9 295X2 alone?
> 
> also will Enermax Platimax 1350W be able to handle the cards?
> 
> Thank you.


A 1350W can run the 295X2 plus one 290X, but not all four GPUs.


----------



## Sgt Bilko

Quote:


> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *silencespr*
> 
> just got XFX R9 295X2 off newegg is it worth it to add it to my other two R9 290X ?
> should i do one of each ? or should i just run the R9 295X2 alone?
> 
> also will Enermax Platimax 1350W be able to handle the cards?
> 
> Thank you.
> 
> 
> 
> 1350 can run the 295x2 and a 290x not all 4 gpus though.
Click to expand...

Eh?

I can run my 295X2 + 2 R9 290s on my AX1200i... it doesn't leave a lot of wiggle room, but you can do it.


----------



## SLADEizGOD

I just got my XFX R9 295X2 today, and I also have a QNIX 27" 1440p monitor. I can't figure out how to crank it up to 96Hz; I need some help with this.


----------



## joeh4384

http://www.overclock.net/t/1384767/official-the-qnix-x-star-1440p-monitor-club


----------



## MapRef41N93W

So far my 295X2 is chewing up everything at 4K with some slight tweaks. I am noticing something strange, though: some specific things cause a "flicker" effect. Most notably, in the first scene of Firestrike Ultra the lava flickers. I also found this in Tomb Raider with TressFX enabled; Lara's hair will flicker. It only happens with my 4K panel (I tested on 1440p as well).


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Hmm I never thought that it did that. I thought ( not that I am right ) the plx chips switched for then to get full avail bandwidth. Not that I know anything about the interwoven of this
> 
> 
> 
> Nah, the PLX just connect the gpus together and then to the pcie bus. There is no free lunch no extra magic bandwidth. He is squeezing FOUR gpus thru x16 at pcie 2.0 at the end of the day.
Click to expand...

Another big autocorrect fail: "interwoven" should have been "innards".
Quote:


> Originally Posted by *Orivaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> nope
> 
> 
> 
> Can't one just make a crossfire profile for Final Cut Pro like one would for a game, then disable Crossfire for it?
Click to expand...

You can't; the BIOS controls everything on the GPU, i.e. the VRMs, the fans, etc. For a BIOS to work on another card, the cards have to be identical.
Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Since, for lack of a better term, **** got real. I was forced to buy something that would provide support for the 295X2 to stop the sagging on the PCI-E slot. I have been getting boot errors unless I wiggle the card or re-seat it so after 6 months of avoiding the issue I bought a PowerColor Power Jack from a seller on eBay. Should do the trick.
> 
> 
> 
> It's a cool product. I've been trying to come up with a home-brew design for a multi GPU jack design. I'm very tired lately so it's going slow.
Click to expand...

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, believe it or not the only other worthy suggestion I came across was using legos. Yes. Legos.


Haha, nice. My idea is extruded aluminum with T-channel (8020/8010), with the part touching the card Plasti Dipped.
Quote:


> Originally Posted by *Hoff248*
> 
> I'm a bit late: but for the PSU discussion here are my 2 cents.
> 
> I got a LEPA 1600W and have no issues. That said, my rental arranagement is such that electricity is a flat fee and I would not get such a beefy PSU if I had to think about the running costs.
> 
> Secondly, and more important: The cables look like crap, I wish I would have gotten a corsair PSU such that I could have used braided cables.


Meh, I like them, but I am making my own.
Quote:


> Originally Posted by *silencespr*
> 
> just got XFX R9 295X2 off newegg is it worth it to add it to my other two R9 290X ?
> should i do one of each ? or should i just run the R9 295X2 alone?
> 
> also will Enermax Platimax 1350W be able to handle the cards?
> 
> Thank you.


At stock, maybe! As to whether you should do quadfire, it depends on what you want.


----------



## electro2u

Quote:


> Originally Posted by *Syceo*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just out of curiosity did u do get around to doing 980 build ?


Working on it right now







Love the 980. The EK block is gorgeous, but I don't understand why they aren't making any full-cover plexi blocks.


----------



## SLADEizGOD

Coming from Nvidia Precision, this AMD CCC is confusing. I'm having issues while playing Warframe; will enabling Graphics OverDrive fix the problem?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Coming from Nvidia Precision. This AMD CCC is confusing. having Issue's while playing warframe. Is enabling Graphics OverDrive fix the problem?


I haven't played Warframe in a while, but try running it in borderless windowed mode with DX11 and multicore rendering enabled. That works alright for me.

But tbh, Warframe is just never going to make sense to me... closed beta had Eyefinity support but open beta doesn't?


----------



## SLADEizGOD

Welp, I noticed some issues with the XFX 295X2. I got the colorful lines with a black screen, so I guess it's a bad card. I did my research on here and deleted all the ATI files, thinking it was a software issue. Hope Newegg can RMA it for a new one.


----------



## zilchstar

No, Final Cut Pro is a Mac-only program with no Windows version, and Mac OS X doesn't support CrossFire at all, under any circumstances.


----------



## xer0h0ur

Quote:


> Originally Posted by *zilchstar*
> 
> No, Final Cut Pro is a Mac only program with no Windows version. And mac OS X doesn't support crossfire at all, under any circumstances


You do realize you can create an application specific profile within the catalyst control center so that crossfire is disabled as soon as the application's executable loads right?


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> You do realize you can create an application specific profile within the catalyst control center so that crossfire is disabled as soon as the application's executable loads right?


Which is exactly what I said earlier.


----------



## zilchstar

Quote:


> Originally Posted by *xer0h0ur*
> 
> You do realize you can create an application specific profile within the catalyst control center so that crossfire is disabled as soon as the application's executable loads right?


Yes, I realize that. But I think you missed the part about Final Cut being a Mac-only program, which means I'm trying to boot into Mac OS X. Unfortunately, there is no Catalyst Control Center for Mac.


----------



## zilchstar

Quote:


> Originally Posted by *Orivaa*
> 
> Which is exactly what I said earlier.


I appreciate your attempt to help!

But I'm talking about Mac OS X and there is no equivalent of catalyst control center for Mac. So I need to disable crossfire before the OS loads, which is why I asked about bios options.


----------



## xer0h0ur

Well crap, missed that detail.


----------



## Orivaa

Aren't there programs that can emulate different OSes and launch programs as if they were running on the specified OS? Like AppLocale, but with OSes.


----------



## m00ter

Add me to the list!

Totally awesome card. Min 92fps on Dirt 3 Ultra settings and AA turned up to max! (6100x1080p)


----------



## xer0h0ur

I need suggestions for a single power supply to power dual 295X2s in a water-cooled system with a 3930K. I know a hell of a lot of the time PSUs are just rebranded and you pay the price for it, so preferably the actual manufacturer instead of the reseller, if possible.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> I need suggestions for a single power supply to power dual 295X2's in a water cooled system with a 3930K. I know a hell of a lot of the time PSUs are just re-branded and you pay the price for it so preferably the actual manufacturer instead of the re-seller if possible.


That's a tall order. I would suggest the EVGA 1600W line. With a mild overclock on the cards of 1050 core and the CPU at 4.5-4.6, you _should_ be around 1600W on full load.

I would suggest staying away from the Corsair AX1500i, as it has built-in overcurrent protection which the 295x2 trips on load. It's very difficult to disable that safety feature and it's extremely annoying.
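For anyone wondering where a 1600 W figure comes from, here is a rough back-of-envelope budget for this kind of setup. All the wattage figures below are ballpark assumptions for illustration, not measurements of any specific rig:

```python
# Rough DC power budget for dual 295X2s plus an overclocked 3930K.
# Every figure here is an assumption for illustration, not a measurement.
loads_w = {
    "295X2 #1 (mild OC)": 550,
    "295X2 #2 (mild OC)": 550,
    "3930K @ 4.5 GHz": 200,
    "board / RAM / drives / pumps / fans": 100,
}

dc_total = sum(loads_w.values())                       # total DC output the PSU must supply
gpu_amps_12v = (550 + 550) / 12                        # +12V current for the two cards alone
efficiency = 0.90                                      # assumed Platinum-ish efficiency at high load
wall_draw = dc_total / efficiency                      # what the wall outlet actually sees

print(f"DC load: {dc_total} W")
print(f"GPU +12V draw: {gpu_amps_12v:.0f} A")
print(f"Estimated wall draw: {wall_draw:.0f} W")
```

With these assumptions the DC load lands around 1400 W, which is why a 1600 W unit leaves only modest headroom and anything smaller is asking for trouble.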


----------



## xer0h0ur

I know, man. I told him it's a tall order to draw ~100A and 1000W just for the two video cards while keeping up with the power demands of everything else.


----------



## m00ter

I'm running one card and a Furmark burn in peaked at 900w draw.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> I need suggestions for a single power supply to power dual 295X2's in a water cooled system with a 3930K. I know a hell of a lot of the time PSUs are just re-branded and you pay the price for it so preferably the actual manufacturer instead of the re-seller if possible.


Honestly, several times this has been proven not true (rebrands are actually cheaper at times).
Quote:


> Originally Posted by *axiumone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I need suggestions for a single power supply to power dual 295X2's in a water cooled system with a 3930K. I know a hell of a lot of the time PSUs are just re-branded and you pay the price for it so preferably the actual manufacturer instead of the re-seller if possible.
> 
> 
> 
> That's a tall order. I would suggest the EVGA 1600w line. With a mild overclock on the cards of 1050 core and the cpu at 4.5-.6, you _should_ be around 1600w on full load.
> 
> I would suggest staying away from the corsair ax1500i as it has build in over current protection which the 295x2 trips on load. It's very difficult to disable that safety feature and it's extremely annoying.
Click to expand...

Agreed, or if you have them in your area / can import them, the Super Flower Leadex 1600 (same PSU).


----------



## Stevesousan

Hi Everyone,

I just got this card and I'm having a hard time supplying power. I have a Corsair AX1200 PSU, which should be fine with this card.
The PSU has eight 6+2 PCI-E connectors: two horizontal [upper and lower] rails with 4 connectors on each rail. I watched some videos about how to install this card, and they recommended using a 6+2 connector from separate rails. So I did that; I used one upper and one lower rail. But when I tested the card under load, my circuit breaker tripped. I do believe I have problems with the lower rail from previous experience, so I just switched the cable to the upper rail. Now both are taking power from the upper rail, and under load I have no problems with my circuit breaker. It's just that when I play Titanfall on high (but recommended) settings, the frame rate drops, and it starts to get very annoying and unplayable.
Can someone tell me what the problem is, or if I have done something wrong in my setup?

Thanks
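For the curious, here is a quick sanity check on why a high-draw rig can trip a household breaker when the circuit is shared. The breaker rating and the load figures below are all illustrative assumptions (a typical North American 15 A / 120 V circuit; adjust for your region):

```python
# Why a shared household circuit can trip: a quick sanity check.
# Assumes a typical North American 15 A / 120 V breaker; all loads are illustrative.
breaker_amps = 15
mains_volts = 120
circuit_capacity_w = breaker_amps * mains_volts   # 1800 W total for the circuit

pc_peak_w = 900       # roughly what a 295X2 rig can pull from the wall under load
monitor_w = 60        # hypothetical monitor draw
other_loads_w = 900   # e.g. a heater or AC elsewhere on the same circuit

total_w = pc_peak_w + monitor_w + other_loads_w
status = "TRIPS" if total_w > circuit_capacity_w else "OK"
print(f"Capacity: {circuit_capacity_w} W, demand: {total_w} W -> {status}")
```

The point is that the PC alone is usually fine; it is the combination with other loads on the same circuit that pushes past the breaker, which is why moving the rig to a different circuit is a sensible test.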


----------



## joeh4384

Quote:


> Originally Posted by *Stevesousan*
> 
> Hi Everyone,
> 
> I just got this card and I'm having a hard time with suppling power. I have a corsair ax1200 PSU, which should be fine with this card.
> The PSU has eight 6+2 PCI-E connection. Two horizontal [upper and lower] rails with 4 connections in each rail. I watched some videos about how to install this card, and they recommended using a 6+2 connection from separate rails. So i did that, i used one upper and one lower rail. But when i tested the card under pressure, my circuit breaker trips. I do believe i have problems with the lower rail from previous experience. So i just switched the cable to the upper rail. Now both are taking power from the upper rail, and under pressure, i have no problems with my circuit breaker. Its just when i play TitanFall under high, but recommended settings, the frame rate drops, and it starts to get very annoying and unplayable.
> Can someone tell me what the problem is? or if have dome something wrong in my setup?
> 
> Thanks


Have you tried moving the PC to a different breaker?


----------



## Stevesousan

Quote:


> Originally Posted by *joeh4384*
> 
> Have you tried moving the PC to a different breaker?


No, not yet. But it's a good point you made. The only way I can do that is by moving the computer outside my room, which is not an option for me.
So you're saying that the upper rail is not providing enough power?


----------



## joeh4384

Wondering if it is your outlet instead of the PSU.


----------



## Stevesousan

Quote:


> Originally Posted by *joeh4384*
> 
> Wondering if it is your outlet instead of psu.


Yes, I agree. I guess I have to settle that. I will test it on a different circuit breaker.


----------



## xer0h0ur

Well, I don't know about how many things you have plugged in or how your PSU is connected to your wall outlet but in the case of my 1250W PSU it states you must plug it in directly to the wall outlet. Not into a power strip sharing power with other things. It will act up otherwise.


----------



## Stevesousan

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, I don't know about how many things you have plugged in or how your PSU is connected to your wall outlet but in the case of my 1250W PSU it states you must plug it in directly to the wall outlet. Not into a power strip sharing power with other things. It will act up otherwise.


I will try that, thank you. But isn't it best to have it plugged into a surge protector?


----------



## xer0h0ur

Sure you can use a surge protector in between. Just try it without other things connected to the same surge protector. Keep everything else on a separate connection. If that doesn't help then really I don't know what else to suggest.


----------



## Mega Man

Quote:


> Originally Posted by *Stevesousan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Well, I don't know about how many things you have plugged in or how your PSU is connected to your wall outlet but in the case of my 1250W PSU it states you must plug it in directly to the wall outlet. Not into a power strip sharing power with other things. It will act up otherwise.
> 
> 
> 
> I will try that, Thank you. But isn't it best to have it plugged in to a surge protector?
Click to expand...

Honestly, I would be willing to bet the PSU has better surge protection than the strip!


----------



## Lordevan83

Hey, the stock fan on the radiator for 295x2 is set to push. Is it worthwhile to add a second fan to do push/pull?


----------



## electro2u

Quote:


> Originally Posted by *Lordevan83*
> 
> Hey, the stock fan on the radiator for 295x2 is set to push. Is it worthwhile to add a second fan to do push/pull?


Sure it helps a little


----------



## Mega Man

It depends. The rad has pretty high FPI; you can replace the original fan with a higher-quality rad fan or go push/pull, but whether you like the result will be up to you.


----------



## xer0h0ur

Adding a second fan does make a small difference. A few degrees C.


----------



## joeh4384

Quote:


> Originally Posted by *Lordevan83*
> 
> Hey, the stock fan on the radiator for 295x2 is set to push. Is it worthwhile to add a second fan to do push/pull?


I think it is worth it, especially if you are creeping into the low 70s and close to the 75 degree throttle point. For me replacing the fan with Corsair SP120s knocked load temps to 68 when it cranks.


----------



## Lordevan83

How do you guys hook up the second fan? Do you use a Y-cable and have the card control both fans' speed?


----------



## xer0h0ur

No. Since I have no idea how much power can be drawn directly from the PCB, I opted to use two PWM fans connected to a motherboard fan header instead of drawing extra power from the card for a second fan.


----------



## electro2u

I used a 3 pin splitter cable and added a noctua fan.


----------



## MapRef41N93W

So I started playing my first real games at 4k with my 295x2 and unfortunately got shutdowns... apparently the ax760 isn't enough after all. I had to totally switch over my build to my leftover V1000 which sucks because I wasted $100+ on stupid braided cables for the ax760.


----------



## F4ze0ne

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I started playing my first real games at 4k with my 295x2 and unfortunately got shutdowns... apparently the ax760 isn't enough after all. I had to totally switch over my build to my leftover V1000 which sucks because I wasted $100+ on stupid braided cables for the ax760.


I'm in the same boat. I have an AX650 with braided cables and I'll have to get rid of it now because it can't power the 295x2 either.


----------



## Mega Man

Quote:


> Originally Posted by *Lordevan83*
> 
> how do you guys hook up the second fan? do you guys use a y cable and have the card control both fan's speed?


Aquaero !


----------



## electro2u

Quote:


> Originally Posted by *Mega Man*
> 
> Aquaero !


Dang it... now I remember what I'm missing for the new build. Going to look for an alternative. Love my AQ6 but it's ridiculously expensive.


----------



## Mega Man

IMO it is priced just right. NOTHING on the market is close; JVC's one was (on OCN), but he disappeared.


----------



## electro2u

Quote:


> Originally Posted by *Mega Man*
> 
> IMO it is price just right NOTHING on the market is close, JVCs one was ( ON OCN ) but he disappeared ; ;


You're totally right if you use a lot of its functions but I really don't...








It's definitely the sexiest fan controller on the market though


----------



## Mega Man

have you looked into the aq5lt ?


----------



## Lordevan83

Just finished building this today. The Corsair 780T turned out to be much bigger than I expected, and formatting the HD takes SO LONG. The build is also louder than I expected; not sure if it's the GPU/CPU fan or the power supply (LEPA 1600W) making the noise. Next step would be to install a liquid CPU cooler and maybe add some fans to run a push/pull setup with the 295x2 cooler. What 120mm fans do you guys recommend?


----------



## Orivaa

Quote:


> Originally Posted by *Lordevan83*
> 
> 
> 
> Just finished building this today. Corsair 780t turned out to be much bigger that I expected. Formatting HD takes SO LONG. The build is also louder than I expected. No sure if it is the GPU/CPU fan or Power supply (lepa 1600w) making the noise. Next step would be to install a liquid CPU cooler and maybe adding some fan to run push pull setup with the 295x2 coolers. What 120mm fans do you guys recommend?


That's not gonna work. The VRMs on the top card are gonna get totally cooked.


----------



## eqc6

Running 4K with my 295x2 and 850W PSU and everything's going great!

But I do want to start looking around for a higher-wattage PSU and another 290X for trifire.

**Any suggestions on the best futureproof PSU I can buy today?**

**And what about the best brand of 290X to trifire with my 295x2?**

Again, thanks for your help fellas!


----------



## xer0h0ur

I regret buying a 290X to tri-fire with the 295X2, just sayin'. Damn near nothing works properly with it, and oftentimes I am better off just using crossfire instead of tri-fire, since games won't even load or will underperform.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> I regret buying a 290X to tri-fire with the 295X2. Just sayin. Damn near nothing works properly with it and often times I am better off just using crossfire instead of tri-fire since games won't even load or underperform.


I've been messing about with an R9 290 + R9 295x2 lately and I gotta say I'm hooked. These 290s are going to the wife soon, so I might start looking into a 290X for cheap somewhere.









I've been playing some Tomb Raider and Sleeping Dogs, but mainly Dragon Age: Inquisition; there are a couple of hiccups, but overall I'm impressed...

Have you got the "Force Crossfire" option ticked in CCC?


----------



## eqc6

Quote:


> Originally Posted by *xer0h0ur*
> 
> I regret buying a 290X to tri-fire with the 295X2. Just sayin. Damn near nothing works properly with it and often times I am better off just using crossfire instead of tri-fire since games won't even load or underperform.


After reading this review http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/#.VISP_THF9rU , it makes me more interested in going for trifire, which is why I brought it up. Besides, I don't have an extra fan slot for another 295x2, plus the scaling is not as good as trifire (based on the review).
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've been messing about with an R9 290 + R9 295x2 lately and i gotta say i'm hooked, these 290's are going to the Wife soon so i might start looking into a 290x for cheap somewhere
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been playing some Tomb Raider, Sleeping Dogs but mainly Dragon Age: Inquisition and there are a couple of hiccups but overall i'm impressed....
> 
> Have you got the "Force Crossfire" option ticked in CCC?


What psu and 290x do you have? And how does DA:I perform in trifire?


----------



## Sgt Bilko

Quote:


> Originally Posted by *eqc6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I regret buying a 290X to tri-fire with the 295X2. Just sayin. Damn near nothing works properly with it and often times I am better off just using crossfire instead of tri-fire since games won't even load or underperform.
> 
> 
> 
> After reading this review http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/#.VISP_THF9rU , it makes me more interested in going for a trifire, which is why I brought it up. Besides, I don't have any more extra fan slot for another 295x2 plus the scaling is not as great as the trifire (based on the review)
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I've been messing about with an R9 290 + R9 295x2 lately and i gotta say i'm hooked, these 290's are going to the Wife soon so i might start looking into a 290x for cheap somewhere
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been playing some Tomb Raider, Sleeping Dogs but mainly Dragon Age: Inquisition and there are a couple of hiccups but overall i'm impressed....
> 
> Have you got the "Force Crossfire" option ticked in CCC?
> 
> Click to expand...
> 
> What psu and 290x do you have?
Click to expand...

AX1200i, and I'm using an XFX DD R9 290 (non-X) at 1018/1250 to match the 295x2's clocks.


----------



## eqc6

Quote:


> Originally Posted by *Sgt Bilko*
> 
> AX1200i and i'm using an XFX DD R9 290 (non X) at 1018/1250 to match the 295x2's clocks


Cool, thanks. I edited my post, so you probably missed my question regarding DA:I's performance in trifire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *eqc6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> AX1200i and i'm using an XFX DD R9 290 (non X) at 1018/1250 to match the 295x2's clocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> cool thanks. I edited my post so you probably missed my question regarding DA:I's performance in trifire.
Click to expand...

Yep, I did miss it.

The benchmark option in it is a bit broken and not realistic, but with tri-fire at 1440p and the game maxed out (not just the Ultra preset but with 4xMSAA, Ultra post-processing, etc.) I'm getting around 50 fps min and 90-100 max, with it sitting around 80 fps for most of it.


----------



## xer0h0ur

I don't know. I guess I can give tri-fire another chance, but I had literally removed the 290X from the system altogether since it was causing immediate BSODs loading AC: Unity, and performance issues in other games like Thief, where forcing full usage on a single core of the 295X2 gave a better framerate than three GPUs with pitiful usage. It didn't matter if it was DX11 or Mantle. In reality, the only things that have ever gotten full usage out of tri-fire for me have been benchmarks, which makes it pointless to me.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know. I guess I can give tri-fire another chance but I had literally removed the 290X altogether from the system since it was causing immediate BSOD loading AC: Unity and performance issues in other games like Thief where forcing full usage on a single core on the 295X2 was better framerate than three GPUs with pitiful usage. Didn't matter if DX11 or Mantle. In reality the only things that ever got full usage out of tri-fire for me so far have been benchmarks which make it pointless to me.


I have Thief but not Unity.....first AC game i didn't preorder









EDIT: DX in Thief is a bit wonky for me; Crossfire is worse than single GPU, but with Mantle, Crossfire is better. I'm only at 1440p, and I think you're at 4K, IIRC?

I'm getting decent usage across the GPU's for Tri and Crossfire but not 100% all the time and it wasn't stuttering or being weird for me


----------



## xer0h0ur

You know it's bad when DX9 games like Counter-Strike: Source get better GPU usage across all three cores than modern games. That is pretty pathetic. I still need to try Tomb Raider, Hitman: Absolution and Sleeping Dogs, since I have not played them @ 4K yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> You know its bad when DX9 games like Counter Strike Source get better GPU usage across all three cores than modern games. That is pretty pathetic. I still need to try Tombraider, Hitman Absolution and Sleeping Dogs since I have not played them @ 4K yet.


Tomb Raider and Sleeping Dogs should work great, as they do for me at 1440p. I haven't tried Hitman yet, as I don't think I have it installed.


----------



## Orivaa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tomb Raider and Sleeping Dogs should work great as they do for me at 1440p, i haven't tried Hitman yet as I don't think i have it installed


Hitman worked wonderfully for my 295x2, but I can't speak for TriFire.


----------



## MapRef41N93W

I'm still getting crashes in Shadow of Mordor even after switching from a 970 to this 295x2. Is this game just super prone to crashes or something? The worst part is they only seem to happen after 2+ hours, and they result in all progress being lost (even if the game auto-saved). This is why I can never keep playing this game.

Something else I am noticing is black flickering lines near the spawn towers and this weird artifacting only in the controls menu.


----------



## Orivaa

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I'm still getting crashing in Shadow of Mordor even after switching from a 970 to this 295x2. Is this game just super prone to crashes or something? The worst part is they only seem to happen after 2+ hours and they result in all progress being lost (even if the game auto saved). This is why I can never continue playing this game.
> 
> Something else I am noticing is black flickering lines near the spawn towers and this weird artifacting only in the controls menu.


I never had any issues with Shadow of Mordor with my old AMD card. Have you tried completely uninstalling the graphics drivers and then re-installing them?
Speaking of, which version are you using?

Also, if your old GPU had the same issue, it might be some other component in your system that's at fault. Does the same issue occur in different games?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Orivaa*
> 
> I never had any issues with Shadow of Mordor with my old AMD card. Have you tried completely uninstalling the graphics drivers and then re-installing them?
> Speaking of, which version are you using?
> 
> Also, if your old GPU had the same issue, it might be some other component in your system that's at fault. Does the same issue occur in different games?


Yes, the drivers have already been uninstalled. I'm on the latest drivers.

No, no issues in any other game. SoM is the only game I am getting this issue in.


----------



## Orivaa

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Yes drivers have already been uninstalled. I'm on the latest drivers.
> 
> No no issues in any other game. SoM is the only game I am getting this issue in.


Did you try uninstalling and reinstalling SoM? And are you using the latest update?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Orivaa*
> 
> Did you try uninstalling and reinstalling SoM? And are you using the latest update?


No, it takes something like 20 hours for me to download the game from Steam on my internet. I have verified game cache and I assume I'm running the latest update since Steam is set to auto-update it.


----------



## Orivaa

Quote:


> Originally Posted by *MapRef41N93W*
> 
> No, it takes something like 20 hours for me to download the game from Steam on my internet. I have verified game cache and I assume I'm running the latest update since Steam is set to auto-update it.


Did you try Google?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Orivaa*
> 
> Did you try Google?


All I found was some users claiming crashes caused them to lose their entire save file. They were from a few months ago.

Something else I should note: I am getting a weird error from CCC on startup. It claims there is a "DisplayPort link failure" and that it "can't run the resolution being attempted". It seems to be a false error, as my setup is running at the correct resolution and refresh rate when I check. I wonder if this has any correlation with the weird flickering I am getting in certain areas.


----------



## wulfie

Hi all, I just joined the forums and noticed this thread ^.^

I've got an MSI R9 295x2 in my rig now.


----------



## Mega Man

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I'm still getting crashing in Shadow of Mordor even after switching from a 970 to this 295x2. Is this game just super prone to crashes or something? The worst part is they only seem to happen after 2+ hours and they result in all progress being lost (even if the game auto saved). This is why I can never continue playing this game.
> 
> Something else I am noticing is black flickering lines near the spawn towers and this weird artifacting only in the controls menu.


All normal excluding the menu, and I keep my saves. I get "lightning artifacts" when it rains; IMO it's like it is trying to use PhysX.

I think it is the game, I really do.

It just goes super slow, then a black screen, but it pulls up at ~9% CPU usage and then states the driver crashed.

It may be the driver, but I just don't think it is.

I used to go 2+ hours; then I installed the HD pack, and now it is every 30 min.
Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Orivaa*
> 
> Did you try Google?
> 
> 
> 
> All I found was some users claiming crashes caused them to lose their entire save file. They were from a few months ago.
> 
> Something else I should note, I am getting this weird error from CCC on startup. It claims there is a "displayport link failure" and that it "can't run the resolution being attempted". It seems to be a false error as my setup is running in the correct resolution and refresh rate when I check. I wonder if this has any correlation at least to the weird flickering I am getting in certain areas.
Click to expand...

This is what the game does to mine. Normally I would have you check your DP cable, but mine are good (I own 9+ DP cables) and I still get them.
Quote:


> Originally Posted by *wulfie*
> 
> hi all i just joined the forums noticed this thread ^.^
> 
> i got a msi r9 295x2 in my rig just now
> 
> .


Welcome!


----------



## RagingCain

Is the PowerColor Devil 13 Dual Core R9 290X welcome here?

If so, may I impart some wisdom about non-reference 295X2s: install your GPU with just one monitor connected. The reason not to have a second (3rd or 4th) monitor connected before installation is that I was not getting Crossfire options, as it was auto-creating a semi-Eyefinity setup.

Post-14.3 drivers all seemed to suffer the issue. Just figured it out today.

(Using DVI-D and HDMI, in case this is also necessary info.)


----------



## xer0h0ur

Out of curiosity do you know if anyone makes a waterblock for that Devil 13 yet? I had suggested to the one other guy who owns a Devil 13 around here to submit his card to Alphacool for a free custom block. I don't know if they ever did make one or not.


----------



## electro2u

No block for the devil 13. Alphacool is unlikely to ever do anything that doesn't benefit them in some way. They won't even honor legit RMAs for their disgusting black quick disconnects that turn blue from being in contact with coolant.


----------



## xer0h0ur

It's a damn shame. It's kind of pointless to have better power delivery and then not be able to waterblock it to exploit the design improvement.


----------



## Lordevan83

Quote:


> Originally Posted by *Orivaa*
> 
> That's not gonna work. The VRMs on the top card are gonna get totally cooked.


what do u suggest here?


----------



## ChampN252

I'm really thinking about getting one of these. What are your thoughts on them? I have two 290s now, but one is on the way out. I doubt I'll be able to afford two of the 3xx series, and the 290s have proven to me that two 290Xs can handle 4K. Now, I play mostly Dragon Age: Inquisition and BF4. Any issues? I may also buy Titanfall and Elite: Dangerous later. Witcher 3 too.


----------



## F4ze0ne

Quote:


> Originally Posted by *Lordevan83*
> 
> what do u suggest here?


The cards should be separated and not sandwiched. Try and use the next available PCIe slot on your motherboard. Ideally, you want to have space for air to flow between them or else the fan on the top card will not be effective.


----------



## xarot

Hey guys, it seems like my hoses are about to come off the radiator barbs out of the box; the tubes sit at the very, very ends of the barbs, I assume. Normal? :O

Comparing with my Thermaltake Water 3.0, the hoses go much deeper onto the barbs.

Edit: added a picture.


----------



## ColeriaX

Shadow of Mordor crashes constantly on my machine too, so don't freak out. It's pretty frustrating, tbh. Needs a serious patch. It's happening on Nvidia cards too, FWIW.


----------



## silencespr

ehhh the wait is killing me! should be here tomorrow.


----------



## joeh4384

Nice, hopefully AMD releases those new Omega drivers tomorrow too. It always seems like when I have something cool like a 295x2 arriving, I am the last house on the FedEx/UPS route.


----------



## Fritata

Hey guys, just bought an MSI R9 295x2 and decided to go mini-ITX. This is what happened next...








The rest of the rig is: Asus Maximus VI Impact, 4690K i5 with an H80i for cooling, running at 4.2; 16GB Corsair Vengeance Pro 2100 (I think); Corsair CX750 PSU; and a Samsung 840 EVO 1TB. All in the Cooler Master Elite 130, with some light surgery with the Dremel.

Let me know what you think.


----------



## ChampN252

Very slick. Sounds 4K ready


----------



## joeh4384

I like it but do you have enough power with that PSU?


----------



## Fritata

I have issues if I overclock the CPU; otherwise it's running this Asus PB297Q nicely.


----------



## Fritata

*PB287Q


----------



## MapRef41N93W

Quote:


> Originally Posted by *Fritata*
> 
> Hay guys, just bought a MSI R9295x2 and decided to go mini-itx. This is what happend next......
> 
> 
> 
> 
> 
> 
> 
> 
> The rest of the rig is, Asus Maximus VI impact, 4690K i5 with H80i for cooling running at 4.2, 16GB Corsair Vengeance pro 2100(i think), Corsair CX 750 PSU and samsung revo 840 1TB. All in the cooler master elite 130 and some light surgery with the dremel.
> 
> Let me know what you think.


You're going to get shutdowns. I had them on my rig with an AX760, which carries more on the +12V.


----------



## Fritata

Haven't had any issues since I left the CPU to do its own Turbo Boost; gaming at 4K, no problems. If I do get issues, I will upgrade to a 1000W modular... the current one has cables everywhere! Ridiculous!


----------



## F4ze0ne

I wouldn't run anything less than 850W with this card. I benched on Valley and was pulling around 800W from the wall last night.
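Worth remembering when comparing numbers like this: the wattage on the PSU label is DC output, while a wall meter reads AC input including conversion losses. A quick sketch of the difference (the efficiency figure is an assumption, roughly Platinum-class at this load, not this unit's spec):

```python
# Wall draw vs. PSU rating: the 850 W label is DC output; a wall meter reads AC input.
# The efficiency figure is an assumption, not a measured value for any specific unit.
wall_draw_w = 800
efficiency = 0.92

dc_load_w = wall_draw_w * efficiency   # power actually delivered to the components
headroom_w = 850 - dc_load_w           # remaining margin on an 850 W unit

print(f"DC load ~ {dc_load_w:.0f} W, headroom on an 850 W unit ~ {headroom_w:.0f} W")
```

Under these assumptions, 800 W at the wall is roughly 736 W of DC load, so an 850 W unit is closer to its limit than the wall reading alone suggests, but not over it.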


----------



## Sploosh

I'm starting to see graphical glitches in Firefox (see image), but in nothing else:


I've updated to all the most recent drivers and the BIOS for my system, yet I still see this from time to time. I can also run Fire Strike with no issues, and both cores are working on the card.

Is this a hardware issue of some sort? I doubt that the monitor is going anytime soon.


----------



## Mega Man

Quote:


> Originally Posted by *ColeriaX*
> 
> Shadow of mordor crashes constantly on my machine too so don't freak out it's pretty frustrating Tbh. Needs a serious patch. Its happening on nvidia cards too fwiw.


Once I uninstalled the HD pack it got better, but it still crashes every 30 min to 2 hours.


----------



## MapRef41N93W

So I just loaded up SoM again and wow, things got even worse. Now the menus all shift around while flashing, and there is clear texture flickering in the distance while playing.


----------



## kalijaga

Welcome to the club.

Try using 2 PSUs like I did; it's safer, I think. But the problem is getting a case with support for 2 PSUs.

Cheers.


----------



## xer0h0ur

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I just loaded up SoM again and wow things got even worse. Now the menus all shift around while flashing and their is clear texture flickering in the distance while playing.


Did you DDU your driver installations? I would run DDU for Nvidia then DDU for AMD and then finally re-install the driver. If you didn't completely wipe out your Nvidia installation then there can possibly still be remnants of that 970 left over.


----------



## MapRef41N93W

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you DDU your driver installations? I would run DDU for Nvidia then DDU for AMD and then finally re-install the driver. If you didn't completely wipe out your Nvidia installation then there can possibly still be remnants of that 970 left over.


Yes.

Also, I recorded it and played it back; the recording showed exactly what I saw.

Edit: Hmm, okay, after some reading this is apparently a known issue.


----------



## Mega Man

So how are people liking the Omega driver?


----------



## ramos29

I will get my 295X2 this weekend, so I was wondering if I made a good choice, as I had to choose between the GTX 980 and this one. Do you experience stutter or freezes...? That's what I am afraid of; it looks like it differs from one person to another.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ramos29*
> 
> i will get my 295x2 this weekend, so i was woundering if i made a good choice as i had to choose between gtx 980 and this one, do you experience stutter freezes ...? thats what i am afraid of, looks like it differs from person to another


Overall I've had a good experience with mine; others have had some issues, but mine's been solid.


----------



## ramos29

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Overall ive had a good experience with mine, others have had some issues but mines been solid


So you had no stutter? And what about periodic FPS drops and things like that? And thanks for your quick reply.


----------



## ramos29

I had a GTX 780. At first I did not want to switch to AMD GPUs, since I thought they lacked support and solid drivers, but what annoyed me the most was the absence of super resolution (my monitor is 1080p only).
So when I saw VSR among the Omega driver features, I decided to go for this beast. I hope I will enjoy it and not regret it.


----------



## joeh4384

I think you made the right choice. I have had no issues with stuttering or crossfire besides having to be patient on new releases.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ramos29*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Overall ive had a good experience with mine, others have had some issues but mines been solid
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so you had no stutter? and what about periodic fps drop and stuff like that, and thx for your quick reply
Click to expand...

I've had some stuttering in Dragon Age: Inquisition, but that's mainly in cutscenes; from what I understand it's because I'm running them at 70fps instead of the standard 30fps, and it's only some scenes, not all.

Sleeping Dogs plays well, same with Tomb Raider. I haven't played much else, I'm afraid, and I'm away from my main rig for a month, so I can't run through anything else to check for you, sorry.


----------



## ramos29

OK, thanks to you all.


----------



## ramos29

No one has tested the Omega driver yet?

I don't know if I am allowed to post this, but here is a review of the Omega driver:
http://wccftech.com/amd-catalyst-omega-driver-officially-launched-brings-performance-improvement-gpusapus-4k-vsr-tressfx-30-5k-monitor-support-freesync/


----------



## electro2u

Quote:


> Originally Posted by *ramos29*
> 
> no one tested the omega driver yet?
> 
> i dont know if i am allowed to do that but this is a review about the omega driver
> http://wccftech.com/amd-catalyst-omega-driver-officially-launched-brings-performance-improvement-gpusapus-4k-vsr-tressfx-30-5k-monitor-support-freesync/


Pretty exciting stuff if true:


----------



## Orivaa

Wait, so there's still no CrossFire profile for Far Cry 4???


----------



## electro2u

Quote:


> Originally Posted by *Orivaa*
> 
> Wait, so there's still no CrossFire profile for Far Cry 4???


FC4 is so broken AMD hasn't bothered, I guess?


----------



## F4ze0ne

For some reason the Omega driver couldn't install correctly on my system. I tried twice and it never showed the actual driver under the install options. Really strange... went back to the beta for now.


----------



## Orivaa

Quote:


> Originally Posted by *electro2u*
> 
> FC4 so broken AMD hasn't bothered I guess?


They improved it another 50% for single GPUs, though.


----------



## axiumone

Quote:


> Originally Posted by *electro2u*
> 
> FC4 so broken AMD hasn't bothered I guess?


Oh don't worry, it's broken for nvidia multi gpu as well.


----------



## joeh4384

Quote:


> Originally Posted by *Orivaa*
> 
> Wait, so there's still no CrossFire profile for Far Cry 4???


I am guessing they are waiting on a patch from Ubisoft. I want to pick this game up but am going to wait.


----------



## xer0h0ur

Have not had a chance to install the Omega driver yet. I hope it brings tri-fire improvements.


----------



## silencespr

Ah, finally it's here!


----------



## Lordevan83

Will the 295X2 get bottlenecked by PCI-E 3.0 x8?

I currently run a 5820K with 28 lanes; the way my board is set up, I can get two PCI-E 3.0 x8 slots for my two 295X2s. Will upgrading to a 40-lane CPU make a difference?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Lordevan83*
> 
> will 295x2 get bottlenecked by pci-e 3.0 x8?
> 
> I currently run a 5820k with 28 lanes, the way my board is set up i can get 2 x pci-e 3.0 x8 lanes for my 2 295x2's. Will upgrading to a 40 lane CPU make a difference?


You will see a small difference, something like a 1% gain. AMD cards can run just fine even on PCI-E 3.0 x4, so I would say it's not really worth the CPU upgrade.
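For anyone curious about the raw numbers behind that, a quick back-of-the-envelope sketch of PCIe 3.0 link bandwidth (8 GT/s per lane with 128b/130b encoding; real-world throughput is a bit lower due to protocol overhead):

```python
# Theoretical usable bandwidth of a PCIe 3.0 link of a given width.
def pcie3_bandwidth_gbps(lanes):
    """Usable bandwidth in GB/s after 128b/130b line encoding."""
    per_lane = 8e9 * (128 / 130) / 8 / 1e9  # GB/s per lane
    return lanes * per_lane

print(round(pcie3_bandwidth_gbps(8), 2))   # ~7.88 GB/s at x8
print(round(pcie3_bandwidth_gbps(16), 2))  # ~15.75 GB/s at x16
```

x16 is exactly double x8, but games rarely saturate even the x8 link, which is why benchmarks show only ~1% between the two.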


----------



## xer0h0ur

It's such a marginal gain that it's virtually pointless, unless you just have your heart set on upgrading your CPU anyway. We have already compared it here; it's not much at all.


----------



## silencespr

Hmm, I'm scoring less in Heaven Benchmark 4.0 with tri-fire (R9 295X2 plus R9 290X) than I did with just two R9 290Xs in CrossFire... installed the Omega drivers with the new card.

R9 295X2 + R9 290X score: 1495, min FPS 21.1, max FPS 65.8

R9 290X CrossFire score: 1501, min FPS 24.6, max FPS 69.7

__________________________________________

Just added all the cards, two R9 290Xs and one R9 295X2. Heaven Benchmark 4.0 is choppy, while 3DMark ran great and scored 14673.


----------



## xer0h0ur

Something is wrong there ^


----------



## silencespr

Quote:


> Originally Posted by *xer0h0ur*
> 
> Something is wrong there ^


Tell me about it. Games and everything else run great in quad-fire, but not Heaven Benchmark 4.0.

During the Futuremark test I saw the card underclocking in GPU-Z... temps didn't get over 56°C.


----------



## xer0h0ur

Speaking of something not being right...ran into several issues after installing the Omega driver.

I had a weird issue when I rebooted after installation. Login took much longer than usual and when it finally got past the login screen I was greeted by a black screen, then an internet explorer window with some sort of registration which I immediately closed only to still have a black screen and a command prompt window. It wouldn't allow me to close this command prompt window and never finished loading windows. After ALT+CTRL+DELing my way into a restart windows loaded fine.

Another issue which I haven't the faintest idea if its related is that Afterburner now no longer shows all three GPUs. Its only showing two which freaked me out but after a couple of Firestrike runs all three are being used and GPU-Z still shows all three as well as AIDA64 Extreme. The missing GPU in Afterburner is one of the GPUs from the 295X2. Its reporting the 290X and one GPU from the 295X2. I did however swap the cards to opposite PCI-E slots so now the 290X is in the primary (bottom) PCI-E slot so I don't know if that is a part of why this is happening. Meh, whatever.

Another issue I am not completely certain is related is that 3dmark results were showing default GPU clock on my 290X when it was overclocked in Afterburner. The 295X2 was reporting the correct overclock in the 3dmark results. GPU-Z showed all three cores were running same overclock though which made even less sense. Again, meh.


----------



## electro2u

Try uninstalling and reinstalling Afterburner, without saving the info it asks about.


----------



## MapRef41N93W

Quote:


> Originally Posted by *xer0h0ur*
> 
> Speaking of something not being right...ran into several issues after installing the Omega driver.
> 
> I had a weird issue when I rebooted after installation. Login took much longer than usual and when it finally got past the login screen I was greeted by a black screen, then an internet explorer window with some sort of registration which I immediately closed only to still have a black screen and a command prompt window. It wouldn't allow me to close this command prompt window and never finished loading windows. After ALT+CTRL+DELing my way into a restart windows loaded fine.
> 
> *Another issue which I haven't the faintest idea if its related is that Afterburner now no longer shows all three GPUs.* Its only showing two which freaked me out but after a couple of Firestrike runs all three are being used and GPU-Z still shows all three as well as AIDA64 Extreme. The missing GPU in Afterburner is one of the GPUs from the 295X2. Its reporting the 290X and one GPU from the 295X2. I did however swap the cards to opposite PCI-E slots so now the 290X is in the primary (bottom) PCI-E slot so I don't know if that is a part of why this is happening. Meh, whatever.
> 
> Another issue I am not completely certain is related is that 3dmark results were showing default GPU clock on my 290X when it was overclocked in Afterburner. The 295X2 was reporting the correct overclock in the 3dmark results. GPU-Z showed all three cores were running same overclock though which made even less sense. Again, meh.


Disable ULPS.


----------



## xer0h0ur

I always disable ULPS after every driver install. I am going to remove and re-install Afterburner and see if that fixes it.

EDIT: You're a BOSS electro. That was it. All gravy now.


----------



## fat4l

Hi.
I just bought my 295X2 and I'm planning on adding it to my custom water cooling loop.

What do you believe is the max safe voltage for this card in a quality custom water loop?
Can it do +100mV?


----------



## ljreyl

Quote:


> Originally Posted by *fat4l*
> 
> Hi.
> I just bought my 295X2 and im planning on adding it to my custom wcooling.
> 
> What do u believe is the max safe voltage for this card in a custom(quality) waterloop?
> Can it do +100mV ?


I'd say that's about the max safe voltage you can run. The card heats up easily and the VRMs are closely packed together. If you're going with tons of radiator space, you might get away with +150mV.

But really, you can push clocks pretty high on stock voltage. I just recently hit 1070/1425 on stock volts.


----------



## silencespr

Quote:


> Originally Posted by *xer0h0ur*
> 
> Speaking of something not being right...ran into several issues after installing the Omega driver.
> 
> I had a weird issue when I rebooted after installation. Login took much longer than usual and when it finally got past the login screen I was greeted by a black screen, then an internet explorer window with some sort of registration which I immediately closed only to still have a black screen and a command prompt window. It wouldn't allow me to close this command prompt window and never finished loading windows. After ALT+CTRL+DELing my way into a restart windows loaded fine.
> 
> Another issue which I haven't the faintest idea if its related is that Afterburner now no longer shows all three GPUs. Its only showing two which freaked me out but after a couple of Firestrike runs all three are being used and GPU-Z still shows all three as well as AIDA64 Extreme. The missing GPU in Afterburner is one of the GPUs from the 295X2. Its reporting the 290X and one GPU from the 295X2. I did however swap the cards to opposite PCI-E slots so now the 290X is in the primary (bottom) PCI-E slot so I don't know if that is a part of why this is happening. Meh, whatever.
> 
> Another issue I am not completely certain is related is that 3dmark results were showing default GPU clock on my 290X when it was overclocked in Afterburner. The 295X2 was reporting the correct overclock in the 3dmark results. GPU-Z showed all three cores were running same overclock though which made even less sense. Again, meh.


funny thing i had all the same issues after omega...


----------



## xer0h0ur

Sometimes it feels like one step forward and two steps back with AMD drivers, but hell, they are certainly trying. I am leaving the Omega driver installed, though, as I have not run into anything I would consider a fatal flaw... yet. AC Unity now works in tri-fire instead of the immediate BSOD on loading the game that the 14.11.2 beta was giving me. The stuttering I used to get in Skyrim in enclosed spaces like caves or underground settings seems to be gone; it's never been this buttery smooth for me. CS:GO also works perfectly fine. I still need to test more games.


----------



## Juris

Guys I'm finally getting to use my 295x2 in an actual pc. 2 months after I bought it. I have an XFX 1250w Pro series Black edition PSU, this one http://xfxforce.com/en-us/products/all-previous-psus/pro-series-1250w-psu-black-edition-p1-1250-befx

I was wondering if I can use one of the 12-pin PCI-E cables supplied with it which splits off into 2 6-pins to power the 295x2 or do I need to use 2 12-pin cables with only 1 of each cables 6-pin heads plugged into the 295. Cheers.


----------



## xer0h0ur

You have me confused. The 295X2 uses two 8-pin cables. You can of course use 6+2 cables instead but it shouldn't work with 6-pin cables.


----------



## Juris

Quote:


> Originally Posted by *xer0h0ur*
> 
> You have me confused. The 295X2 uses two 8-pin cables. You can of course use 6+2 cables instead but it shouldn't work with 6-pin cables.


Oh sorry, I wasn't looking at them correctly (the fun of 1 hour of sleep in the last 30). It's 12-pin at the PSU end and 8-pin on each cable end, which is itself split into a 6-pin and a 2-pin piece that click together.


----------



## joeh4384

Use two separate cables from the PSU.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Juris*
> 
> Oh sorry I wasn't looking at them correctly (the fun of 1 hour sleep in last 30). Its 12 pin at the PSU end and 8 on each cable end which is split itself with a 6 pin and a 2 pin piece which can click together.


You can use one cable as long as you have a single-rail +12V PSU. If it's multi-rail, you will need to use multiple cables.


----------



## xer0h0ur

He can use a single 12-pin to dual 6+2 cable, since it's not daisy-chaining both connectors together and it's a single rail dishing out 104A. He is all good.

Edit: On the off chance you notice that cable getting very hot, it's best to connect a second 12-pin cable so you're feeding power to the card from one 6+2 of each cable.

Double Edit: Just use both cables so you're on separate rails, since it's a multi-rail PSU after all.


----------



## doctakedooty

Quote:


> Originally Posted by *Juris*
> 
> Guys I'm finally getting to use my 295x2 in an actual pc. 2 months after I bought it. I have an XFX 1250w Pro series Black edition PSU, this one http://xfxforce.com/en-us/products/all-previous-psus/pro-series-1250w-psu-black-edition-p1-1250-befx
> 
> I was wondering if I can use one of the 12-pin PCI-E cables supplied with it which splits off into 2 6-pins to power the 295x2 or do I need to use 2 12-pin cables with only 1 of each cables 6-pin heads plugged into the 295. Cheers.


No, you can't use a single PCIe cable to power this GPU. The XFX PSU you have is made by Seasonic; it's the X-1250 series. It's supposed to be a single-rail PSU, but it actually has multiple rails, and it will cause the PC to shut off because it draws too much amperage through one connector into the PSU. I did not know it was multi-rail and was going by the single-rail label. You will need to use two different PCIe connectors on the PSU, and with that power supply you can only power one 295X2, not two of them: it may have enough wattage, but not enough amps per rail.


----------



## xer0h0ur

Freakin' false advertising. It's sold as single-rail, 104A. I don't know how these people get away without getting sued for false advertisement.


----------



## electro2u

Quote:


> Originally Posted by *doctakedooty*
> 
> No you can't use a single pcie cable to power this gpu. The xfx psu you said you have is made my seasonic and is the x1250 series psu. It's suppose to be a single rail psu but it actually has multiple rails on it and will cause pc shut off because it will draw to much amperage my connector that goes into the psu I did not know it was multiple rail and was going off the single rail that the label says you will need to use two different pcie connectors on the psu and with that power supply you can only power 1 295x2 not 2 of them as it has enough wattage maybe to but not enough amps per rail.


It's weird, though, because I have the Seasonic X-1250 and I use one cable to power the 295X2. It only gives me trouble (instant power down, no message of any kind; must be OCP) if I throw the Power Limit % in AB too high. I have the 295X2 overclocked and at +30 power limit.

Check this out. I believe this is the issue:

There are 2 versions of this particular power supply:
SS-1250XM (mine)
SS-1250XM2 (doctakedooty's and the XFX 1250 OEM)

The design on one side of the PSUs is different:


And they come with (slightly) different cable combinations


----------



## ColeriaX

Quote:


> Originally Posted by *ljreyl*
> 
> I'd say that's the max safe voltage you can go. The card heats up easily and the VRMs are closely packed together. If you're going with tons of radiator space, you might get away with 150mV.
> 
> But really, you can push clocks out pretty high for stock voltage. I just recently hit 1070/1425 on stock volts.


The highest Afterburner allows is +100mV unless I'm missing something... your RAM should have no problem reaching 1600, btw. The highest I was able to reach on stock volts was 1115 or so on the core with 1625 mem. With +100mV and 50% power I can bench at 1215/1690 on one card, or 1160/1690 in quad-fire. They are begging for more volts; I just wish I knew how to give them moar!


----------



## Juris

Thanks for all the help and info, guys. Finally up and running on my 295X2 using 2 cables. That's the last time I go for XFX PSUs though, after that BS; if it's sold as single-rail, it should be single-rail. The 295X2 looks the business in my NZXT S340.







Now for a day of updates & downloads.


----------



## xer0h0ur

Enjoy. Don't forget to disable ULPS, either manually in the registry or through an application like Afterburner.
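For the manual registry route, the value is typically `EnableUlps` under the display-adapter class keys. A sketch of a `.reg` fragment; the `0000`/`0001` subkey numbering varies per system (there is one subkey per GPU), so verify yours in regedit first:

```
Windows Registry Editor Version 5.00

; Example only: set EnableUlps to 0 in EACH adapter subkey
; (0000, 0001, ...) that actually contains the value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Afterburner's "disable ULPS" option writes the same value for you, which is the safer route if you'd rather not touch the registry.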


----------



## ljreyl

Quote:


> Originally Posted by *ColeriaX*
> 
> The highest afterburner allows is +100 mv unless I'm missing something....your ram should have no problems reaching 1600 btw. Highest I was able to reach on stock volts was 1115 or so core 1625 mem. With +100mv and 50 % power I can bench at 1215 1690 on one card 1160 1690 with quadfire. They are begging for more volts just wish I knew how to give them moar !


The +150mV is if you flash to the Asus Bios and use GPUTweak. And I have "low clocks" because I'm running trifire and I have to go with my weakest links. The core goes up to 1070 stable for trifire on stock volts and my 290x can only get 1425 stable vs my 295 getting 1550 before crapping out during gaming. But yea. Good luck with custom water cooling. It's worth it!


----------



## ColeriaX

Quote:


> Originally Posted by *ljreyl*
> 
> The +150mV is if you flash to the Asus Bios and use GPUTweak. And I have "low clocks" because I'm running trifire and I have to go with my weakest links. The core goes up to 1070 stable for trifire on stock volts and my 290x can only get 1425 stable vs my 295 getting 1550 before crapping out during gaming. But yea. Good luck with custom water cooling. It's worth it!


Nice, I did not know that. I wonder if I can use GPUTweak to increase volts without flashing the BIOS...


----------



## xarot

Huh. I asked Sapphire to verify that the warranty won't be a problem if I change the stock radiator fan to something else. Well, they replied that I'll be out of warranty then. Great.


----------



## ColeriaX

Anyone have an ASUS 295X2 that can upload their BIOS? TPU doesn't seem to have an ASUS BIOS in their possession for the 295X2... I really wanna try out some more voltage.


----------



## fat4l

Quote:


> Originally Posted by *ColeriaX*
> 
> Anyone have an ASUS 295x2 that can upload their BIOS? TPU doesnt seem to have an ASUS BIOS in their possession for the 295x2...really wanna try out some more voltage.


That would be cool, if someone had it.







What about this one:
http://www.techpowerup.com/vgabios/157762/sapphire-r9295x2-4096-140414.html
It has improved clocks, but I'm not sure about the voltage as well.

As I've never flashed a GPU BIOS, a link to a nice guide would also be appreciated.








(Would it be possible to use the same guide as for flashing the 7990? http://forums.overclockers.co.uk/showthread.php?t=18558655 )

Apparently you can set the power target to 100% on the Asus card:
https://pcdiy.asus.com/2014/10/tested-ares-iii-water-cooled-r9-295-x2-review/
What influence does it have vs 50%?


----------



## electro2u

Quote:


> Originally Posted by *xarot*
> 
> Huh I asked Sapphire to verify that the warranty won't be a problem if I change the stock radiator fan to something else. Well they replied I'll be out of warranty then. Great.


Oh, that's nonsense. Whoever said that is just covering their butts. Just make sure you can put it back like it was.


----------



## ColeriaX

Quote:


> Originally Posted by *fat4l*
> 
> that would be cool if someone had it
> 
> 
> 
> 
> 
> 
> 
> What about this one :
> http://www.techpowerup.com/vgabios/157762/sapphire-r9295x2-4096-140414.html
> It has improved clocks, but not sure if volage as well.


Thanks, I am already using the Sapphire OC BIOS; I wanted to try the ASUS BIOS to see if I could use GPUTweak to give it more than +100mV.
Quote:


> as i never flashed a gpu bios, a link to some nice guide would be also nice.
> 
> 
> 
> 
> 
> 
> 
> 
> (would it be possible to use the same guide as for flashing 7990 ? http://forums.overclockers.co.uk/showthread.php?t=18558655 )


I used the same instructions that can be found in the R9 290--->r9 290X Unlock thread
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_50


----------



## xarot

Quote:


> Originally Posted by *electro2u*
> 
> Oh that's nonsense. They are just covering their butts whoever said that. Just make sure you can put it back like it was.


Well, there is one zip tie I need to cut if I want to take the fan off completely. I knew Sapphire wasn't very good with warranty, but their products have been maybe the best in the AMD camp over the years...


----------



## fat4l

One more question:
which waterblock would perform better,
the EK Water Blocks EK-FC R9-295X2 - Nickel or the Aqua Computer Kryographics Vesuvius for 295X2 Acrylic Edition - Nickel Plated?

http://www.overclockers.co.uk/showproduct.php?prodid=WC-567-EK&groupid=962&catid=1520&subcat=2765
http://www.overclockers.co.uk/showproduct.php?prodid=WC-274-AQ&groupid=962&catid=1520&subcat=2765


----------



## xarot

Quote:


> Originally Posted by *fat4l*
> 
> one more question,
> what waterblock would perform better ?
> EK Water Blocks EK-FC R9-295X2 - Nickel VS Aqua Computer Kryographics Vesuvius for 295X2 Acrylic Edition - Nickel Plated .
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-567-EK&groupid=962&catid=1520&subcat=2765
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-274-AQ&groupid=962&catid=1520&subcat=2765


Are you sure you'll want a nickel block? I think copper is always the best way to go, but that's just me...after seeing a failure with EK's nickel-plated 780 block myself.


----------



## ramos29

The 295X2 is already a beast; trying to overclock it and increase the voltage is just insane! I will never overclock my GPU, or my CPU if I had to change mine one day. (You will say: so what are you doing on overclock.net? XDDDD)


----------



## MapRef41N93W

Quote:


> Originally Posted by *ramos29*
> 
> 295x2 is already a beast, trying to overclock it and incresse voltage is just insane! i will never overclock my gpu or my cpu if i had to change mine one day ( you will say so what are you doing in overclock.net XDDDD)


Here you go, I think this site is more up your alley: http://www.tomshardware.com/forum/

Sorry, couldn't resist.


----------



## ramos29

Hahaha, I was looking for a forum where people talk about their 295X2s; I found many, but this one seemed to be the most populated, so I registered here.


----------



## doctakedooty

Quote:


> Originally Posted by *electro2u*
> 
> It's weird though because I have the Seasonic X-1250 and I use one cable to power the 295x2. It only gives me trouble (instant power down, no message of any kind-must be OCP) if I throw the Power Limit % in AB too high. I have the 295x2 overclocked and +30 power limit.
> 
> CHeck THIS out. I believe this is the issue:
> 
> There are 2 versions of this particular power supply.
> SS-1250XM (mine)
> SS-1250XM2 (DocstakeDooty's and XFX1250 OEM)
> 
> The design on one side of the PSUs are different:
> 
> 
> And they come with different cable combinations (slightly)


Mine is the 1250XM version, not the XM2. I didn't have a problem with instant reboots either until I ran 4K; as soon as I hooked up my 4K monitor it started tripping OCP. In fact, one of the stock PCIe connectors that plugs into the PSU melted, which could have turned out pretty badly. When I gamed at 1440p it never tripped the OCP, and I have never run my 295X2 overclocked; I always ran it at stock clocks and volts.
Quote:


> Originally Posted by *fat4l*
> 
> one more question,
> what waterblock would perform better ?
> EK Water Blocks EK-FC R9-295X2 - Nickel VS Aqua Computer Kryographics Vesuvius for 295X2 Acrylic Edition - Nickel Plated .
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-567-EK&groupid=962&catid=1520&subcat=2765
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-274-AQ&groupid=962&catid=1520&subcat=2765


I am an EK fan myself; I think they should perform about the same. I have the EK nickel block, and I recommend you buy the EK backplate too, as they give you more thermal pads, and unlike on my 780 Ti Classy or 780 they cover a lot more of the hot spots on the back. The nickel plating used to be an issue with EK, but they changed their nickel plating process, so there is no longer the flaking problem there used to be. The nickel is really just for looks anyway. As far as the blocks go, performance should be about equal; it just depends on which style you like better.


----------



## cennis

Quote:


> Originally Posted by *fat4l*
> 
> one more question,
> what waterblock would perform better ?
> EK Water Blocks EK-FC R9-295X2 - Nickel VS Aqua Computer Kryographics Vesuvius for 295X2 Acrylic Edition - Nickel Plated .
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-567-EK&groupid=962&catid=1520&subcat=2765
> http://www.overclockers.co.uk/showproduct.php?prodid=WC-274-AQ&groupid=962&catid=1520&subcat=2765


The main problem with the 295X2 is still the VRMs, even with full-cover waterblocks. If you overclock hard, the VRMs will still run hot. Core temps will be the same for those blocks, but on 290X cards the Aqua Computer ones have about a 20°C lead on VRM temps.

I have a nickel AC block, brand new in box, that needs a new home. I also have the active-cooling backplate, which may improve VRM cooling a little more.


----------



## ramos29

Does the 295X2 give 4 or 8GB of usable memory? I know that two R9 290Xs in CrossFire deliver 4GB of usable memory, but I saw some people talking about 8GB of RAM on the 295X2.


----------



## RagingCain

All 8GB of VRAM is physically there, but in CrossFire one GPU's 4GB is mirrored onto the other GPU's 4GB, so effectively you have 4GB.


----------



## fat4l

Quote:


> Originally Posted by *cennis*
> 
> They main problem with 295x2 is still the VRMs even with full cover waterblocks. If you overclock hard the vrm will still be hot. core temps will be the same for those blocks but 290x cards aquacomputer ones have -20c lead for vrms.
> 
> i have a nickel AC brand new in box that needs a new home. also i have the active cooling backplate which may improve vrm cooling a little more


So basically, is there any difference in VRM cooling between EK and Aqua Computer? If I'm buying, I wanna make sure I'm buying the best.








Quote:


> Originally Posted by *doctakedooty*
> 
> I am a ek fan myself. I think they should perform about the same. I do have the ek nickel block myself and I do recommend you buy the ek backplate as they give you more thermal pads that unlike my 780ti classy or 780 they cover alot more of the hot spots on the back. The nickel plating used to be a issue with ek but they changed there nickel plating manufacturing so there is not the problem of flaking like there used to be. The nickel is again just more for looks anyways. As far as the blocks go performance wise should be about equal just depends on which style you like better.


Yeah, that was a long time ago (2011, I think) and it's okay now. Also, if you don't use a silver coil it's all fine.
I will definitely buy a backplate for this one.









Still waiting for someone with the Asus 295X2 Ares BIOS (for both GPUs). Anyone?


----------



## xer0h0ur

I avoided the Aqua Computer block because the active backplate for the 295X2 didn't exist yet, and I wasn't about to leave the backside uncooled, so I just went with the EK nickel block. There was also some rumor about something being wrong with the Aqua Computer block, and I didn't feel like finding out firsthand whether it was BS or true.


----------



## evoll88

Does anyone know what the best deal on one of these is? I would like to buy 2 and put a W.B. on them.


----------



## Orivaa

Quote:


> Originally Posted by *fat4l*
> 
> Still waiting for someone with Asus 295x2 Ares bios(for both gpus). Anyone?


I think the Ares bios on a regular 295x2 would cause all kinds of trouble.


----------



## xer0h0ur

I certainly wouldn't be brave enough to try a non-reference card BIOS on a reference card.


----------



## RagingCain

If you guys wanted, I could look at possibly editing some reference BIOSes. I would need some uploads of reference BIOSes.


----------



## fat4l

Quote:


> Originally Posted by *Orivaa*
> 
> I think the Ares bios on a regular 295x2 would cause all kinds of trouble.


So which Asus BIOS enables adding +150mV instead of +100? The normal Asus 295X2 BIOS?
Quote:


> Originally Posted by *ljreyl*
> 
> The +150mV is if you flash to the Asus Bios and use GPUTweak. And I have "low clocks" because I'm running trifire and I have to go with my weakest links. The core goes up to 1070 stable for trifire on stock volts and my 290x can only get 1425 stable vs my 295 getting 1550 before crapping out during gaming. But yea. Good luck with custom water cooling. It's worth it!


hm ?


----------



## snow cakes

Can this card play BF4 at ultra settings on a single 4K monitor with no issues?


----------



## Asus11

Quote:


> Originally Posted by *ljreyl*
> 
> The +150mV is if you flash to the Asus Bios and use GPUTweak. And I have "low clocks" because I'm running trifire and I have to go with my weakest links. The core goes up to 1070 stable for trifire on stock volts and my 290x can only get 1425 stable vs my 295 getting 1550 before crapping out during gaming. But yea. Good luck with custom water cooling. It's worth it!


what is the reason for your tubing you did to connect to the 290x?


----------



## xer0h0ur

Quote:


> Originally Posted by *snow cakes*
> 
> can this card play bf4 ultra settings on a 4k single monitor with no issues?


I don't know about the performance aspect (framerates) since I don't own the game, but the last beta driver fixed the crashing users were getting; at least most report BF4 Mantle works perfectly fine now. I don't have a clue whether the Omega driver works just as well or whether the crashing persists in DX11.


----------



## ljreyl

Quote:


> Originally Posted by *Asus11*
> 
> what is the reason for your tubing you did to connect to the 290x?


I assume you're talking about my build. It's to have water flow from the 295x2 to the 290x. It's custom water cooling.
Not sure if you saw the latest picture, but here's the link.


http://imgur.com/5oWVo


----------



## MapRef41N93W

Quote:


> Originally Posted by *snow cakes*
> 
> can this card play bf4 ultra settings on a 4k single monitor with no issues?


Frame-rate-wise, yes. You can play any game at very high with 0x AA and get 60 fps. You don't need AA at 4K anyway; it makes almost no difference even if you put your face right up to the panel. Maybe at 40+ inches you would.


----------



## cennis

Quote:


> Originally Posted by *fat4l*
> 
> so basically, is there any difference in cooling VRM's between EK and AqCo ? If im buying i wanna make sure im buying hte best


I'm saying there's no difference for core temps; for VRMs the aquacomputer should do better if it follows the same trend as the 290 blocks. I also have the active-cooling VRM backplate, which should help. PM me if you're interested.


----------



## xer0h0ur

Forgetting looks and branding, the active backplate is enough to draw me towards that aquacomputer block simply because of the 295X2's hot VRMs. Taking looks into account I still like the aquacomputer block more.


----------



## Juris

Looks like I spoke too soon. I got the 295x2 to power up and was installing Windows 8 using the i7's onboard GPU when I posted. The bloody MSI Gaming 9 AC mobo's PCI-E slots are DOA: they don't register any PCI-E card as present, with either the 295 or an HP SAS card. Lights on, nobody's home. Looks like RMA time before 295 game time. It never rains but it pours.

While I'm waiting, can anyone recommend a good replacement backplate for the 295x2 (I'm keeping the card's air cooling) to complete the look of the build, or any company that makes backplates for other components like sound cards etc.? Cheers.


----------



## electro2u

Quote:


> Originally Posted by *ljreyl*
> 
> I assume you're talking about my build. It's to have water flow from the 295x2 to the 290x. It's custom water cooling.
> Not sure if you saw the latest picture, but here's the link.
> 
> 
> http://imgur.com/5oWVo


What are those attractive pedestal items you have the case sitting on? Very nice looking build.


----------



## ljreyl

Quote:


> Originally Posted by *electro2u*
> 
> What are those attractive pedastal items you have the case sitting on. Very nice looking build.


You're like the millionth person to ask that since I posted this build on reddit lol.
They're monitor stands. I repurposed them to be tower stands so it's not on my carpet.


----------



## fat4l

Quote:


> Originally Posted by *ljreyl*
> 
> You're like the millionth person to ask that since I posted this build on reddit lol.
> They're monitor stands. I repurposed them to be tower stands so it's not on my carpet.


Just wondering, are you in possession of the Asus 295x2 BIOS (the +150mV one)?

We need one!


----------



## ColeriaX

A trick that worked for me using the Sapphire OC BIOS was to open Trixx, set my VDDC and clocks, then open Afterburner while Trixx is still open, and voila: more than +100mV, as long as you set it higher in Trixx. I was able to bench at 1240/1725 in Fire Strike using +140mV.


----------



## ljreyl

Quote:


> Originally Posted by *fat4l*
> 
> just wondering, are u in possession of asus 295x2 bios ? (+150mV one)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We need one !


I actually used the Sapphire OC bios on mine. Then you can use Trixx and go up 150mV if you wanted.


----------



## remnant

I'm contemplating getting one of these bad boys for my mITX build. Are people experiencing problems playing games? old or new?


----------



## MapRef41N93W

Quote:


> Originally Posted by *remnant*
> 
> I'm contemplating getting one of these bad boys for my mITX build. Are people experiencing problems playing games? old or new?


Shadow of Mordor has distant texture flickering and menus that are basically unusable (they go spastic all over the screen when you roll over them) as well as massive screen tearing when switching out of wraith mode with crossfire enabled. Game is completely fine outside of that. Far Cry 4 has Crossfire disabled in the current build. Those are the main two games causing issues. Other than that I have been able to play just about everything else flawlessly at very high/ultra 0x AA 4k.

Also I don't know exactly what the cause is, but Unigine Valley does suffer very bad screen tearing when switching scenes. Could just be a 4k thing, but not like it really matters since the actual benchmark runs fine.


----------



## remnant

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Shadow of Mordor has distant texture flickering and menus that are basically unusable (they go spastic all over the screen when you roll over them) as well as massive screen tearing when switching out of wraith mode with crossfire enabled. Game is completely fine outside of that. Far Cry 4 has Crossfire disabled in the current build. Those are the main two games causing issues. Other than that I have been able to play just about everything else flawlessly at very high/ultra 0x AA 4k.
> 
> Also I don't know exactly what the cause is, but Unigine Valley does suffer very bad screen tearing when switching scenes. Could just be a 4k thing, but not like it really matters since the actual benchmark runs fine.


Any idea in terms of Skyrim (yup, still playing that), BioShock Infinite (and that), and Borderlands 2 (I need new games)?


----------



## MapRef41N93W

Quote:


> Originally Posted by *remnant*
> 
> Any idea in terms of skyrim ( yup still playing that
> 
> 
> 
> 
> 
> 
> 
> ) bioshock infinite ( and that ) and borderlands 2 ( I need new games )?


Nope sorry. Haven't played Bioshock or Skyrim in over a year. Don't even own Borderlands (not really a fan).


----------



## fat4l

Quote:


> Originally Posted by *ljreyl*
> 
> I
> I actually used the Sapphire OC bios on mine. Then you can use Trixx and go up 150mV if you wanted.


Quote:


> Originally Posted by *ColeriaX*
> 
> Thanks, I am already using the Sapphire OC BIOS, I wanted to try the asus BIOS to see if I could use GPUTweak to give it more than +100mV


Can you try whether it works?


----------



## ColeriaX

It does. Use the Sapphire OC BIOS and then use Trixx to set VDDC.


----------



## fat4l

Quote:


> Originally Posted by *ColeriaX*
> 
> It does...use the sapphire of bios and then use trixxx to set vddc.


That is cool.

Is there any way to make it work with Afterburner too?

Anyway, what clocks did you push it to with +150mV?


----------



## remnant

Can someone confirm whether these fit in a BitFenix Prodigy? Officially they fit lengthwise; it's the width I'm concerned about.


----------



## ColeriaX

Quote:


> Originally Posted by *fat4l*
> 
> that is cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> is there any way how to make it work with afterburner too?
> 
> Anyway, what clocks u pushed it to with +150mV?


It wouldn't take anything higher than +140 without constant black-screening. Not sure what that means; maybe not enough power available through the two 8-pins? But I was able to bench Fire Strike at 1240/1725 on my PowerColor 295x2.

Here was my 2nd-best run; the first one had CPU info missing (200 pts higher though): http://www.3dmark.com/fs/3447666 (sorry, on phone, can't post a screenshot)


----------



## ColeriaX

Quote:


> Originally Posted by *remnant*
> 
> can someone confirm if these fit in a Bitfenix Prodigy? officially they fit length wise it's the width I'm concerned about


From Tom's forums same question asked
There's no particular reason why not, it's probably the only ITX case which is suited to doing this. The 250D might work, but it wouldn't be as good as you'd be forced to put the 295x2 radiator on the front.

You are fine on graphics card length/width. It has suitable locations for the radiator(s) too.
I'd have two or three minor concerns-
1) Both the H100i and 295x2 have reasonably long pipes for larger cases. That means you'll have a lot of tucked/folded pipes. You should have the space in which to do it, but it'll be a bit of a mess and it won't do much for airflow either.
2) With all of those pipes, installation will be a nightmare.
3) Off the top of my head I only know of one PSU which fits in a Prodigy and will support this hardware config (Silverstone Strider Plus ST1000-P) Now I'm sure there's others which will, but the selection will be dramatically limited. It is possible to mod the case to extend the PSU area.


----------



## fat4l

Quote:


> Originally Posted by *ColeriaX*
> 
> Wouldn't take any higher than +140 without constant blackscreening.....not sure what that means maybe not enough power available through the two 8 pins?? But I was able to bench firestrike at 1240 1725 on my powercolor 295x2.
> 
> Here was my 2nd best run...first one had cpu info missing (200 pts higher though)http://www.3dmark.com/fs/3447666 (sorry on phone can't post a screenshot)


That's nice.

I'm trying to flash my BIOS to the Sapphire OC BIOS ( http://www.techpowerup.com/vgabios/157762/sapphire-r9295x2-4096-140414.html ) but before I do I need to ask something.
When I download the BIOS, it's just one file. This card is dual-GPU, so I need two BIOSes, one per GPU, right?
Did you use the same BIOS for both GPUs?

Here in this post about flashing the Radeon 7990 (a dual-GPU card too) ( http://forums.overclockers.co.uk/showthread.php?t=18558655 ) he says:
Quote:


> Included in step 7 is the download link to the two custom bios which you will flash. They are labeled Malta1 (master bios for gpu1) and Malta2 (slave bios for gpu2). Its very important you flash the master bios to gpu 1. If you flash the mater bios to the slave or the other way around the card will not function correctly.


How did you deal with it?
(The guy I quoted is using his own custom BIOS, so maybe it matters to use the right BIOS for each GPU, but maybe it doesn't matter for us since we're using an official BIOS and it's the same for both GPUs? This is just my assumption, so I'm waiting for some opinions on this.)


----------



## ColeriaX

Earlier in the thread (look for coolmikes posts) he uploaded both master and slave BIOSes for the Sapphire 295x2. I used the same process the 290-to-290x guys did; just search for the 290-to-290x unlock thread. The 295x2 has a master and a slave BIOS, and they are labeled as such in that post. I only flashed the master BIOS on both of my cards, but if you want to do the slave as well, it's as simple as sliding your BIOS switch over and repeating the process with the slave BIOS. Hope that clears it up.


----------



## fat4l

Quote:


> Originally Posted by *ColeriaX*
> 
> Earlier in the thread (look for coolmikes posts) he uploaded both master and slave bioses for the sapphire 295x2. I used the same process that the 290 to 290x guys did. Just search for the 290 to 290x unlock thread. The 295x2 has a master and slave bios and they are labeled as such in coolmikes post. I only flashed the master bios on both of My cards but if you want to do the slave as well it's as simple as sliding your bios switch over and repeating the process with the slave BIOS. Hope that clears it up.


Well, I'm still talking about BIOS 1.
As I understand it, you need two BIOSes for each position of the BIOS switch, so four in total for both switch positions, but two BIOSes are enough for the BIOS 1 position? Or am I just not getting it right?


----------



## ColeriaX

You can flash just the master...the slave does not need to be flashed unless you want too.


----------



## xer0h0ur

So a fellow Alienware Aurora owner was told by VisionTek that its CryoVenom 295X2 can only be paired with another identical card. I told him they flat-out lied to him. I am right, though, right? As far as I know you can mix and match a 295X2 with any 295X2, 290X, or 290, regardless of reference or non-reference design.


----------



## joeh4384

You need the master and slave BIOS for each switch position. I nearly had a heart attack almost ******* up my BIOSes until I flashed the master, then flashed the slave, in the same switch position. I accidentally flashed the master twice and it broke crossfire.


----------



## joeh4384

Quote:


> Originally Posted by *ColeriaX*
> 
> You can flash just the master...the slave does not need to be flashed unless you want too.


Don't you need to flash the slave so the 2nd GPU has the same BIOS parameters as the first one?


----------



## ColeriaX

Brain fart. I meant you don't need to slide the bios switch and do it again. Good catch.


----------



## fat4l

Quote:


> Originally Posted by *ColeriaX*
> 
> Brain fart. I meant you don't need to slide the bios switch and do it again. Good catch.


That's what I'm talking about.

So basically, we have to use the flash command twice (GPU1 = master BIOS, GPU2 = slave BIOS) for each position of the BIOS switch, right? Therefore we need two BIOSes, master and slave, for each switch position, right?


----------



## joeh4384

There are essentially two sets of two BIOSes selected by the switch. You can flash GPU1 with the master and GPU2 with the slave with the switch in one position, and just leave the switch alone unless you want the other BIOSes changed as well.
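For anyone nervous about the order of operations, here's a dry-run sketch of that sequence using AMD's atiflash utility. The adapter indices (0 = GPU1/master, 1 = GPU2/slave) and the .rom filenames are assumptions on my part, not the exact files from this thread; confirm yours with `atiflash -i` before flashing anything.

```shell
#!/bin/sh
# Dry-run sketch of the flash order described above. The run() wrapper only
# prints each command; swap the echo for the real atiflash when you're ready.
set -e
run() { echo "would run: $*"; }

run atiflash -s 0 backup_master.rom    # back up both current BIOSes first
run atiflash -s 1 backup_slave.rom
run atiflash -f -p 0 295x2_master.rom  # master BIOS to adapter 0 only
run atiflash -f -p 1 295x2_slave.rom   # slave BIOS to adapter 1 only
```

Leave the physical BIOS switch alone for the whole sequence, so the untouched switch position stays as a known-good fallback.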


----------



## ColeriaX

Quote:


> Originally Posted by *fat4l*
> 
> tahts what im talking about
> 
> 
> 
> 
> 
> 
> 
> 
> so basically, we have to use the flash command twice(gpu1-master bios, gpu2-slave bios) for each position of bios switch right ? Therefore we need 2 bioses; master and slave for each position of switch right ?


The other bios (sliding of the switch) is a failsafe in case of a bad flash or something else terrible happening. I didn't flash the other switch position on my cards.


----------



## fat4l

Yeah, I get it now.
I don't want to flash both BIOSes either, just BIOS #1.
TPU doesn't have the two BIOSes (master and slave). I'm searching for that OC BIOS in this forum now, like you suggested...









here,
http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/760


----------



## joeh4384

295x2OCBioses.zip 72k .zip file


----------



## joeh4384

Quote:


> Originally Posted by *fat4l*
> 
> yeah i got it now.
> I dont want to flash both bioses either, just bios#1.
> TPUp doesnt have 2 bioses;master and slave. Im searching for that OC bios now in this forum like u suggested ....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here,
> http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/760


Here you go. I downloaded these back on 9/28 when I flashed my card.


----------



## Mega Man

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *remnant*
> 
> I'm contemplating getting one of these bad boys for my mITX build. Are people experiencing problems playing games? old or new?
> 
> 
> 
> Shadow of Mordor has distant texture flickering and menus that are basically unusable (they go spastic all over the screen when you roll over them) as well as massive screen tearing when switching out of wraith mode with crossfire enabled. Game is completely fine outside of that. Far Cry 4 has Crossfire disabled in the current build. Those are the main two games causing issues. Other than that I have been able to play just about everything else flawlessly at very high/ultra 0x AA 4k.
> 
> Also I don't know exactly what the cause is, but Unigine Valley does suffer very bad screen tearing when switching scenes. Could just be a 4k thing, but not like it really matters since the actual benchmark runs fine.
Click to expand...

SoM: I have none of those problems in Eyefinity or at 1080p; it just freezes and BSODs (I have to bring up Task Manager and close the process; from time to time I see the driver crashed). It takes anywhere from 20-ish minutes to 2+ hours before it does it, and it seems to be worse in different areas of the map (I think the game needs a patch, not the drivers, as it happens on Nvidia too). Also, from time to time the rain slows to a stop and then looks like mini lightning.

Besides that, in Eyefinity (which is prone to certain issues, as most games are not widescreen-format friendly) the Esc menu is not in correct perspective, nor is the artifact menu; however, this does not affect playability.
Quote:


> Originally Posted by *remnant*
> 
> I'm contemplating getting one of these bad boys for my mITX build. Are people experiencing problems playing games? old or new?


I just bought Skyrim, but I can check BioShock Infinite and Borderlands for you.

Skyrim I'm holding off on; I want to play 1-4 first.


----------



## fat4l

Well, I updated the BIOS, and GPU-Z shows the clocks nicely.
However, Afterburner can't see the voltages (monitoring). Any fix? I put a tick in Settings > "enable voltage monitoring" but it still doesn't work.

Any ideas?

edit://
OK, fixed it.


----------



## remnant

Quote:


> Originally Posted by *Mega Man*
> 
> SoM i have none of these problems in eyefinity or 1080p it just freezes and BSOD ( have to bring up task manager and close the process ,from time to time i see driver crashed ) it will be 20ish min to 2+ hours befroe it does it, seems to be worse in different areas of the map ( i think the game needs patched not drivers as it happens on NVIDIA ) also from time to time rain slows to stop then it looks like mini lightnings
> 
> besides that in eyefinity ( which is prone to have certain issues as most games are not wide screen format friendly ) the esc menu is not in correct perspective nor is the artifact menu, however this does not affect playability
> i just bought skyrim but i can check bioshock infinite and boarderlands for you
> 
> skyrim i am holding off on i want to play 1 -4 first


I appreciate that. Oh wow, 1-4?!? I started at Morrowind and played some Oblivion; never played 1-2 and never beat the main story of 3-4. Good luck.


----------



## Mega Man

http://www.amazon.com/gp/product/B00E9I1FPI/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1

I got that on Black Friday :O ... for 10.


----------



## remnant

Quote:


> Originally Posted by *Mega Man*
> 
> http://www.amazon.com/gp/product/B00E9I1FPI/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
> 
> i got that on black friday :O ... for 10


warning

As a fan I really want that. Now why can't I again? Oh right, trying to buy an R9 295x2.


----------



## fat4l

I like the new Sapphire OC BIOS.







Thanks again, guys.

A few more questions...
Which BIOS-editing program is compatible with R9 2xx BIOSes? I wanted to check the default voltages of the Sapphire OC versus the MSI default BIOS.

What's the default core voltage for the 295x2? If I check with Afterburner it jumps all the time; GPU-Z too, sometimes it's 1.2, sometimes 1.139, etc.

A few notes...
Trixx allows me to use the voltage slider up to 1.3v (but it's actually more... almost 1.5v for GPU1, I believe).
Afterburner still only allows +100mV (max 1.285v for GPU1).










*edit1://*
It seems to me it's:
gpu1: ~1.175v
gpu2: ~1.135v

*edit2://*
So with Afterburner I can get to ~1.285v for GPU1 and ~1.240v for GPU2.

With Trixx you can get far more... I need to add this card to my custom WC loop.









I wish there were a way to add more voltage in Afterburner. :/ I like it much more than Trixx.

*Edit3://*
Quote:


> Originally Posted by *cennis*
> 
> I am saying theres no difference for the core temps, for vrm the aquacomputer should do better if it followed the same trend for 290 blocks. i also have the active cooling vrm backplate which should help. pm me if u are interested.


I would buy it, but I'm located in the UK, so...
I would also have to pay customs plus expensive delivery.


----------



## ramos29

I just got my Sapphire 295x2 and I don't know what's going on: the games crash when I launch Shadow of Mordor, Watch Dogs, and Call of Duty... they tell me insufficient memory! FPS is not as expected in the other games. Am I supposed to do some tweaks in AMD Catalyst or what? I am using the Omega driver, and I used DDU to delete the old drivers.
When I switched to the beta driver I still had the same problems.


----------



## Shaded War

Quote:


> Originally Posted by *ramos29*
> 
> i just got my saphire 295x2, i dont know whats going on: the game crashes when i launch shadow of mordor+watchdog call of duty... it tells me unsuffisent memory! fps is not as expected in the other games .... am i supposed to do some tweaks in the amd catalyste or what? i am using the omega driver, i used DDU to deleate old drivers
> and when i switched to the beta driver still have the same problemes


Interesting, I get this with my 7970 crossfire too. Just because crossfire is enabled, all of a sudden I get low-memory errors and it asks me to disable Aero on the desktop to save on memory usage, or it just kills my game and gives the same error you're describing. Even monitoring VRAM usage, it never maxes out. Both cards were tested individually and I maxed out the VRAM at 5760x1080 on ultra to see if it was caused by a dead memory chip on one of the cards, but both cards kept working with no errors even with truly maxed-out memory usage. I finally gave up, pulled my second card, and am not going to use crossfire again.

I was about to buy a 295x2 when they were on sale so I would have 4GB to fix this issue, but now I'm glad I didn't. It must be a driver issue or something.


----------



## ramos29

Tomorrow I will install a new copy of Windows on a separate hard drive and see if the problem persists. When I put everything on low the game starts, and most of the time I can then max the settings in-game without problems, but when I start the game again I get the same message. I am starting to regret my choice.


----------



## xer0h0ur

I don't know what you two have in common that is causing this issue, but it's not widespread; you two are the only ones I have heard mention it so far. The only other time recently I heard of a game giving out-of-memory errors it was BF4, I believe, which needed a combination of game and driver updates.


----------



## ramos29

Could this be related to old Nvidia drivers which may have persisted despite the DDU?


----------



## ramos29

I manually increased the virtual memory and now the problem is gone, but I am not satisfied with the FPS I am getting: 35 FPS in Watch Dogs at 1920x1080!! Bottleneck?


----------



## Mega Man

Quote:


> Originally Posted by *ramos29*
> 
> i just got my saphire 295x2, i dont know whats going on: the game crashes when i launch shadow of mordor+watchdog call of duty... it tells me unsuffisent memory! fps is not as expected in the other games .... am i supposed to do some tweaks in the amd catalyste or what? i am using the omega driver, i used DDU to deleate old drivers
> and when i switched to the beta driver still have the same problemes


Quote:


> Originally Posted by *Shaded War*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ramos29*
> 
> i just got my saphire 295x2, i dont know whats going on: the game crashes when i launch shadow of mordor+watchdog call of duty... it tells me unsuffisent memory! fps is not as expected in the other games .... am i supposed to do some tweaks in the amd catalyste or what? i am using the omega driver, i used DDU to deleate old drivers
> and when i switched to the beta driver still have the same problemes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Interesting, I get this with my 7970 crossfire also. Just because it's crossfire enabled all of a sudden I get low memory errors and it asks me to disable aero on the desktop to save on memory usage or it just kills my game and gives the same error you'r describing. Even monitoring vram usage it never maxes out. Both cards were tested individually and I maxed out the vram with 5760x1080 on ultra to see if it was caused by a dead memory chip on one of the cards, but both cards kept working with no errors even with truly maxed out memory usage. I finally gave up and pulled my second card and not going to use crossfire again.
> 
> I was about to buy a 295x2 when they were on sale so I would have 4GB to fix this issue, but now I'm glad I didn't. It must be a driver issue or something.
Click to expand...

Quote:


> Originally Posted by *ramos29*
> 
> i increased manually the virtual memorie now the problem is gone, but i am not satisfied with the fps i am getting, 35 fps in watchdog! 1920x1080!! bottelneck?


Yes, I was going to say I had to set a 35GB (not joking) pagefile to get rid of it with Mordor.

35! And that is with 16GB of RAM, which was full as well. Next I am getting 64GB of RAM (and 32GB for my 8350).

Which IMO would make it game-related, not driver-related?
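For anyone who wants to set a pagefile that large without clicking through System Properties, here's a dry-run sketch of doing it from an elevated Windows prompt with the legacy wmic tool. Sizes are in MB, and the 16384/35840 figures just mirror the 16GB RAM / ~35GB pagefile numbers above, not a recommendation.

```shell
#!/bin/sh
# Dry-run: prints the wmic commands rather than executing them, since they
# only make sense in an elevated cmd.exe session on Windows.
set -e
run() { echo "would run: $*"; }

# Stop Windows from managing the pagefile automatically, then set a fixed range.
run wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False
run wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=16384,MaximumSize=35840
```

A reboot is needed before the new pagefile size takes effect.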


----------



## ramos29

No, I tried with the beta one; same problem. Are my CPU and PCI-E 2.0 bottlenecking my GPU? My CPU seems to work at 80% max, with both cores at 80-85% and the frequency between 850 and 1018MHz.


----------



## ColeriaX

I'm running a 2600K with 16GB of RAM and two 295X2s, and I don't have any of those issues, running the Omega driver. Actually, I do get the crashes in SoM and the frozen rain, but so do guys from the green camp; the game just needs a patch. As for bottlenecking, that's probably not the case, as I see 100 percent GPU usage a lot of the time. However, in some unoptimized games I see awful GPU usage, e.g. Warframe, where I only get like 7% GPU usage in quadfire, but that's pretty much the only game other than FC4 that I've played with serious issues.


----------



## cz1g

I got my 295x2; now I just need to finish my build. I can get a Xeon 1230v2 for 150, which is pretty good. That shouldn't bottleneck the card, right? Right now I have an FX-6300, and I know for a fact that will bottleneck the 295x2.


----------



## xer0h0ur

Quote:


> Originally Posted by *ColeriaX*
> 
> I'm running a 2600k with 16 gb of ram and 2 295x2s....I don't have any of those issues. Running the omega driver. Actually I do get crashes in SoM and the frozen rain but so are guys from the green camp. Game just needs a patch. As for bottlenecking its probably not the case as I see 100 percent gpu usage a lot of the time however in some unoptimized games I see awful gpu usage e.g. Warframe. I only get like 7%gpu usage in quadfire but that's pretty much the only game other than fc4 that I've played with serious issues.


Well, according to his sig he's also running his 2600K at 3.4GHz and you're presumably running 5GHz+. Big difference there.


----------



## ramos29

Somehow everything is fixed now. Putting every game at 4K (without anti-aliasing), I have a solid 60 FPS, except for Watch Dogs, which seems to be screwed, and Far Cry 4, which does not support crossfire.


----------



## electro2u

Acrylic tri-fire 295x2 bridge I just made...


Pain in the butt. Didn't turn out as clean as I wanted but it will do for now.


----------



## xer0h0ur

Bruh, that is what? Your third bridge? You might be obsessed.

All ribbing aside, it looks great IMO.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Bruh, that is what? Your third bridge? You might be obsessed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All ribbing aside it looks great imo.










Just wanted to see if I could do it


----------



## xer0h0ur

Yeah man I want one like that for when I finally waterblock the 290X. This Omega driver has given me new hope with tri-fire. I am getting better performance than before, even in DX9.


----------



## electro2u

Yeah, Omega has me hurrying this up... er, getting off my butt to finally try it. I had this acrylic sitting in a corner for more than a month and kept avoiding giving it a go. I had to put the bridge between the two GPUs with them outside the case and then put the whole assembly in at once. It's doable with just two, but with three you'd need help.


----------



## xer0h0ur

I have a question for you electro. Obviously you're also tri-fired so I wanted to see if you also experience this issue. First, do you have your 295X2 or your 290X as the card in your primary slot? Second, if your 295X2 is in the secondary slot and you connect your monitor to it does your motherboard give you an audible error code/black screen?

In my system when I put the 295X2 in the top (secondary) PCI-E slot with the 290X in the bottom (primary) PCI-E slot I will get a 6 beep error code and black screen if I connect my monitor to the 295X2. Soon as I connect it to the 290X....boots up like nothing. I am only using displayport cables btw. Mini-DP to DP on 295X2 or DP to DP with 290X.


----------



## electro2u

I've tried it both ways. I prefer to have the 295x2 on the bottom. The only way I can see my BIOS screen is if I connect my monitor to the card in the primary slot.

Sorry, I don't have any better info for you.


----------



## xer0h0ur

Yeah, I suspect it's Alienware motherboard shenanigans, but I can't know for sure without someone else who's tri-fired replicating what I described. Either way it's not a big issue, since I am only using a single 4K monitor at the moment.


----------



## Mega Man

Quote:


> Originally Posted by *ColeriaX*
> 
> I'm running a 2600k with 16 gb of ram and 2 295x2s....I don't have any of those issues. Running the omega driver. Actually I do get crashes in SoM and the frozen rain but so are guys from the green camp. Game just needs a patch. As for bottlenecking its probably not the case as I see 100 percent gpu usage a lot of the time however in some unoptimized games I see awful gpu usage e.g. Warframe. I only get like 7%gpu usage in quadfire but that's pretty much the only game other than fc4 that I've played with serious issues.


Well, do you run Eyefinity? That is why (I think) I need a 35GB pagefile!


----------



## dumkopf604

Recently switched from 2x 780 Ti to a 295X2. I installed the Catalyst drivers and got rid of the Nvidia drivers. Everything worked fine and I went about overclocking this beast. Then I wanted to switch back to the 780 Tis just to compare numbers. Turns out I bricked the BIOSes on my 295X2, not just one but both. I have just finished flashing a stock BIOS back onto the 295X2, and now I'm only getting the performance one would expect out of a single 290X.

Turns out I was flashing a master BIOS to both GPUs, and now crossfire is broken. Please help?


----------



## xarot

After using a custom water-cooling loop, and now only the stock AIO on the 295X2, I can definitely see that a 120mm rad can't handle 500W. In Sleeping Dogs both cores throttle to around 938-968MHz and the thing is damn hot; even the hoses are burning. Sometimes the throttling goes as far down as 888MHz. It still works, though...


----------



## ramos29

I disabled crossfire in Watch Dogs and Assassin's Creed Unity in order to get more FPS (45 FPS in full HD -_-).
I saw some people putting together an AMD card as the primary GPU and an Nvidia one "for PhysX", they said. Can someone explain?


----------



## xer0h0ur

Quote:


> Originally Posted by *ramos29*
> 
> i disableb the crossfire in watchdog and assasin creed unity in order to get more fps ( 45 fps in full hd -_- )
> i saw some people puting together an amd card as primary gpu and an nvidia one " for physix " they said, can some one explain?


As far as I know you can't do it anymore without using ancient drivers making it useless.


----------



## remnant

Quote:


> Originally Posted by *xer0h0ur*
> 
> As far as I know you can't do it anymore without using ancient drivers making it useless.


Or by using some third-party patch that may or may not work.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ramos29*
> 
> I disabled CrossFire in Watch Dogs and Assassin's Creed Unity in order to get more FPS (45 FPS in full HD -_-).
> I saw some people putting an AMD card as the primary GPU together with an Nvidia one, "for PhysX" they said. Can someone explain?
> 
> 
> 
> As far as I know you can't do it anymore without using ancient drivers, which makes it useless.
Click to expand...

It's called hybrid PhysX.


----------



## xer0h0ur

Quote:


> Originally Posted by *Mega Man*
> 
> It's called hybrid PhysX.


So how's that work then?


----------



## joeh4384

Nvidia's modern drivers disable this when they detect an AMD card as the primary.


----------



## remnant

Slightly off topic, but are games still using PhysX? I know of Borderlands, but I didn't know other AAA games were.


----------



## Mega Man

There are a few, but after seeing it I was so unamazed; it was junk.

Batman (all 3), SoM, and a few others.


----------



## ramos29

Metal Gear Solid: Ground Zeroes and Elite Dangerous are out tomorrow, and Nvidia already released their Game Ready driver -_- I don't really care about Elite, but I want to play Metal Gear Solid the way it's meant to be played, on day one.


----------



## boredmug

Quote:


> Originally Posted by *Mega Man*
> 
> Well, do you run Eyefinity? That is why (I think) I need a 35 GB pagefile!


I run Eyefinity at 5760x1080 with 2x 7950s in Shadow of Mordor with an 8 GB page file, no problem. I know they aren't the same cards but ****, that's a big page file. Also, [email protected]


----------



## divotion

Hi all,

I have a problem with my setup. I have an Asus 295X2 with an XFX DD 290X. With the 14.9 driver CrossFireX has no problems, but with the 14.11 beta and the latest 14.12 Omega release CrossFire won't work: at boot the adapters show a yellow triangle in Device Manager. Then I disable the GPUs and re-enable them; one time the driver recognizes them, another time I get a BSOD. But on 14.9, no problem!

I can't understand it. I have tried fresh installs of Win 7/8.1.

I had thought about flashing the Asus card with the Sapphire ROM, but I don't know if that works, and I might brick something, or the issue could get worse.

Maybe one of you has an idea...

I need the new drivers for Dragon Age: Inquisition; with 14.9 it's horrible. I have reported that to AMD but have had no answer for 4 days.

Ty

Best regards
divo


----------



## electro2u

@divo, you can't really brick these cards with a BIOS flash, not permanently. Especially since you have 2 separate cards, you can always use one to reflash the other. Also, if you have onboard video from a processor like a Haswell or an AMD APU, you can use that output to reflash a bricked 290 or 295X2 BIOS.

First though, my advice would be to use Display Driver Uninstaller to clean out your previous drivers and retry the latest Omega release.

Hope this helps!


----------



## Mega Man

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Well, do you run Eyefinity? That is why (I think) I need a 35 GB pagefile!
> 
> 
> 
> I run Eyefinity at 5760x1080 with 2x 7950s in Shadow of Mordor with an 8 GB page file, no problem. I know they aren't the same cards but ****, that's a big page file. Also, [email protected]
Click to expand...

I tried with 30 GB and still ran out of memory; then 35 GB, and I've had no issues since.
Quote:


> Originally Posted by *divotion*
> 
> Hi all,
> 
> I have a problem with my setup. I have an Asus 295X2 with an XFX DD 290X. With the 14.9 driver CrossFireX has no problems, but with the 14.11 beta and the latest 14.12 Omega release CrossFire won't work: at boot the adapters show a yellow triangle in Device Manager. Then I disable the GPUs and re-enable them; one time the driver recognizes them, another time I get a BSOD. But on 14.9, no problem!
> 
> I can't understand it. I have tried fresh installs of Win 7/8.1.
> 
> I had thought about flashing the Asus card with the Sapphire ROM, but I don't know if that works, and I might brick something, or the issue could get worse.
> 
> Maybe one of you has an idea...
> 
> I need the new drivers for Dragon Age: Inquisition; with 14.9 it's horrible. I have reported that to AMD but have had no answer for 4 days.
> 
> Ty
> 
> Best regards
> divo


Like he said: DDU and then reinstall the drivers.


----------



## divotion

Hi electro2u,

Thank you for your answer. But I don't think that will work, since I already fresh-installed Windows 7 and 8.1 and fresh-installed the Omega drivers. But I will try your idea with Display Driver Uninstaller.

I'll give feedback after the try...

About flashing: would it be good to reflash my cards with, say, the Sapphire ROM (same clock speeds)? In Device Manager my Asus doesn't show up as an R9 200 Series card; I see "Asus 295X2". Maybe there is a GPU BIOS recognition problem? What do you think about that?

And if I do flash the GPU BIOS, which BIOS would you recommend for these cards?

I'm happy to be here; it seems a good community...

I have written in the AMD forum too, no answers, and to AMD support directly, no answers. Happy to be here for now.

Greetings
divo


----------



## divotion

Won't work. Maybe it's because in the new driver my XFX has a 1000 MHz clock and the Asus 1018 MHz?


----------



## xer0h0ur

The bone stock 295X2 came with the GPU clocks at 1018MHz while the stock 290X has a 1000MHz clock. Both use 1250MHz clocks on the vRAM. You can easily just increase the clock on the 290X to 1018 to match the 295X2 without any problems or needing additional voltage.


----------



## divotion

Aiaiai...

Now I have flashed my Asus 295X2 with the original AMD 295X2 ROM and the XFX 290X with the original AMD ROM... now I only get BSODs, with every driver, even with a fresh install.

Then I tried the newest XFX ROM: same. Then the Sapphire ROM on the 295X2: same. Then all the Sapphire ROMs: same.

Lol...

Now, if I go to Device Manager and install the driver manually on every GPU (all on the Sapphire ROM), the unrecognized GPUs will activate.

But that issue isn't normal.


----------



## xer0h0ur

You have all sorts of shenanigans going on


----------



## electro2u

@DIVO
Sorry, my last suggestion was not very thoughtful; you did say you tried fresh installs of Windows...
Makes me wonder about your motherboard BIOS.

Is your motherboard BIOS up to date?

Is your graphics set to PCIe instead of onboard?


----------



## Mega Man

Do you have an Intel CPU? (Yeah, RigBuilder would help us here.)

Is it possible that's the integrated graphics?


----------



## divotion

Hi all,

Nothing to do with Omega; I cannot use the Omega driver at the moment:

http://forums.amd.com/game/messageview.cfm?catid=454&threadid=181956&enterthread=y

I have an EVGA SR-2 board, A58 BIOS.

I reflashed the 295X2 and 290X. They work on 14.9 and Win 7 now; I reinstalled Windows 7 after the flash.

Nothing works with Omega.


----------



## Mega Man

Again, this is where RigBuilder would have helped (see my sig).


----------



## SLADEizGOD

I just got my XFX 295X2 back from RMA. I also got my Qnix 1440p monitor and got everything to work great. But I noticed that when I OC my CPU to 4.5 GHz, it gets stuck at the Windows loading screen. Has anyone experienced that? I have downloaded the Omega drivers & CRU. But I do have to say, coming from my two 2 GB GTX 670s to this card: just WOW.


----------



## MapRef41N93W

Quote:


> Originally Posted by *SLADEizGOD*
> 
> I just got my XFX 295x2 back from RMA. I also got my Qnix 1440p monitor. Got everything to work great. *But I noticed when I O.C my CPU to 4.5Ghz.* It gets stuck at the windows loading screen. Has anyone experience that? I have DL the Omega drivers & CRU. But I do have to say. Coming from my 2 GTX 670's 2GB versions to this card. just WOW.


Your OC is unstable. Probably voltage too high.


----------



## ColeriaX

So I'm pretty frustrated overall with some issues I've had since I got my UD590D. While 4K is absolutely gorgeous, for whatever reason I just cannot get my clocks to stick and run at the set rate (e.g. core 1100, mem 1600 using AB 4.0). It works perfectly fine when I switch back to 1080p: the clocks stick and run at their set rates 100% of the time. Temps are not high at all, barely in the 50s on both cards, but the core and mem clocks are ALL over the place when using 4K in games.

Not only that, but as soon as I attempt to run any extra voltage on the cards I start to get black screens. Crazy, because I was using VSR on the Omega drivers prior to having this monitor and running much higher clocks and voltage. I was able to get through FS Ultra runs with plenty of added voltage/overclock on my old ASUS 1080p monitor; now, when I run it using my new monitor, I get black screens every few seconds and eventually a crash. What in the world could have changed to cause this?

Just as a note, I checked all the power cables (nice and snug), and there's plenty of headroom for the PSU since I'm now using two, an EVGA 1300 G2 and a Corsair AX850. ULPS is also disabled. I thought maybe it could be a driver issue, so I DDU'd and reinstalled the Omega drivers: nope. So I rolled back to the 14.12 beta: same issues. Then I figured, what the heck, I'll give a fresh 8.1 install a go in case something in the registry got borked: nope, still the same problems with the clocks bouncing all over the place at 4K.

I know there are a few of you here with the 295X2 and the UD590D; could you please chime in and let me know if you are having similar problems with set clocks not sticking? Sorry for the wall of text here, fellas. Thanks in advance for any thoughts or fixes. --Cole


----------



## xer0h0ur

Quote:


> Originally Posted by *ColeriaX*
> 
> So I'm pretty frustrated overall with some issues I've had since I got my UD590D. While 4k is absolutely gorgeous for whatever reason I just cannot get my clocks to stick and run at the said rate (e.g. core 1100 mem 1600 using AB 4.0). Works perfectly fine when i switch back to 1080p, clocks stick and run at their said rates 100% of the time. Temps are not high at all barely in the 50s on both cards but the core and mem clocks are ALL over the place when using 4k res in games. Not only that but as soon as I attempt to run any extra voltage on the cards I start to get blackscreens...crazy because i was using VSR on the omega drivers prior to having my monitor and running that with much higher clocks and voltage. I was able to get through FS Ultra runs with plenty of added voltage/overclocks on my old ASUS 1080p monitor, now when I run it using my new monitor I get black screening every few seconds and eventually crashing. What in the world could have changed that is causing this? Just as a note, checked all power cables nice and snug plenty of headroom for the PSU since Im now using 2, an EVGA 1300 G2 and a Corsair AX850. ULPS is also disabled. I thought maybe it could be a driver issue so I DDU'd and reinstalled the Omega drivers..nope. So, I rolled back to 14.12 beta and same issues. So I figured WTH I'll give a fresh 8.1 install a go to see if maybe something in the registry got borked causing this. Nope, still have the same problems with the clocks bouncing all over the place @ 4k. I know there are a few of you here with the 295x2 and the UD590D, could you please chime in and let me know if you are having similar problems with the set clocks not sticking? Sorry for the wall of text here fellas thanks in advance for any thoughts or fixes. --Cole


The only game I have played that did that to me was AC Unity. The clocks were bouncing from the 700s to all over the place on all three cores. I got rid of that by manually creating an application profile and setting crossfire to AFR-friendly. However, since they recently updated the game, I now have a problem with flashing water textures while playing like this. The same issue doesn't occur in Tomb Raider, CS:GO or Skyrim. I have not, however, tried AC Unity again without the application profile since the update.


----------



## xer0h0ur

Yup, I just confirmed it by deleting the app profile and loading the game. Performance drops drastically, with framerates in the 30s, bad GPU usage, and clocks all over the place.


----------



## MapRef41N93W

Shadow of Mordor appears to have been mostly fixed. Not really sure how but I logged in yesterday and didn't change any setting yet the texture flickering and spastic menus are gone. Maybe it was fixed by Monolith? That being said, now at random times my camera will flick around really fast and throw me off. Only happening here and there and not that big of a deal.


----------



## fat4l

So the backplate for my 295X2 is finally here!

Aquacomputer Backplate für kryographics Vesuvius R9 295X2, aktiv XCS

An official pic:

Also waiting for this block; delivery today, soon.

Wondering what voltage I can run with these...


----------



## Orivaa

So The Evil Within just got a patch that's supposed to increase performance on higher end GPUs. Can anyone verify this?


----------



## ColeriaX

Sweet, man. Post some detailed pics when you're done with the install, if you don't mind. I'm really considering getting the Aquacomputer block because of the active backplate. What's your watercooling setup?


----------



## electro2u

Quote:


> Originally Posted by *fat4l*
> 
> So the backplate for my 295X2 is finally here !
> 
> 
> 
> 
> 
> 
> 
> 
> Aquacomputer Backplate für kryographics Vesuvius R9 295X2, aktiv XCS
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> An official pic:
> 
> 
> Also waiting for this block, delivery today, soon
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wondering what voltage I can run with these...


Holy crap. Must have. When did they finally release that???


----------



## xer0h0ur

Man. Not even gonna lie. I am going to have to think it over. I originally wanted that block and never got it because they had not released the active backplate. Now I feel like selling the EK block and getting this one.


----------



## electro2u

They seem to make them when they get enough orders. Aquatuning reports an unknown delivery date. I'll have 3 EK backplates for sale...


----------



## cennis

I have an Aquacomputer nickel/plexi block and active backplate, sealed, for sale...


----------



## fat4l

I took the AIO cooler off my 295X2 already.

Very easy to disassemble; it takes just a few minutes.
Now I have to clean it and put it all together. Might do it all today, as I'm a bit time limited.

I also bought quick disconnects so I can later add a Mo-Ra3 420 + 230 mm fans (push/pull) to my setup, as I currently only have 2x 240 mm EK rads with Corsair SP120s in push/pull.
I'm running dual EK DDCs with X-Tops, PWM.
All tubing is 13/19 mm PrimoChill PrimoFlex.
All fittings are by Bitspower.
The CPU block is an EK Supremacy EVO, full nickel ([email protected]_1.415v, delidded, direct-die cooling, CL Pro).
The motherboard mosfet block is by EK too (Asus Maximus VII Hero).
2x 4 GB DDR3 G.Skill TridentX 2933 MHz (need to OC these but no time; NOT watercooled).

I'll get some pics, guys.









Also, I bought this active backplate from Aquatuning.co.uk. That's the only place you can find it in the UK. When I bought it they had 2 in stock (about a week ago). Now they're out of stock, sadly.


----------



## cennis

Quote:


> Originally Posted by *fat4l*
> 
> I took the AIO cooler from my 295x2 already
> 
> 
> 
> 
> 
> 
> 
> very easy to disassemble, takes just a few minutes.
> Now I have to clean it and put it all together. Might do it all today as im a bit lime limited.
> 
> I also bought quick disconnects so I can later add Mo-Ra3 420 + 230mm fans(push pull) to my setup as I currently only own 2x240 EK rad with corsair sp120 push-pull.
> Im running EK dual DDC with x-tops, PWM.
> All tubes are 13/19mm Primochill Primoflex.
> All fittings are by Bitspower.
> CPU block is by EK, Supremacy EVO full nickel ([email protected]_1.415v,delided,direct-die cooling,CL Pro)
> MB mosfet block is by EK too(Asus Maximus VII Hero)
> 2*4GB DDR3 G.Skill TridentX 2933MHz (Need to OC these but no time, NOT cooled by WC)
> 
> I'll get some pics guys
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I bought this Active backplate of Aquatuning.co.uk. Thats the only place u can find it in the UK. When I bought it they had 2 in stock(~ a week ago). Now they are AOS, sadly


Sweet build. 5 GHz! Let's see some pics.


----------



## fat4l

OK, it's done!

It's still not plugged into my rig and I hope it works, but here are some pics.
The waterblock was flushed with distilled water before putting it on, so there's some water left inside.

This is how it should be according to the manual.

This is how I did it: I added some extra thermal pads on the other side of the VRMs and cores for improved cooling, the same way as with the EK block.


----------



## ColeriaX

Looks great man! Now let's see some benching with temp data!!


----------



## fat4l

http://www.aquatuning.co.uk/water-cooling/gpu-water-blocks/gpu-backplates/18423/aquacomputer-backplate-fuer-kryographics-vesuvius-r9-295x2-aktiv-xcs

They have the active backplate in stock again.
Quote:


> Originally Posted by *ColeriaX*
> 
> Looks great man! Now let's see some benching with temp data!!


Thx. Will plug it in tomorrow, maybe.


----------



## xarot

I have another 295X2 in the mail to be picked up. Not sure, though, if I should try CF on these things for the heck of it, as even my one 295X2 gets throttled over time, though it's been working very well so far. Also, I'd need to buy a 1500 W PSU, which is a bit of a hassle, and in our smallish three-room apartment putting 1000 W out of the graphics cards may heat the place too. LOL. Maybe it's best to stick with one 295X2 unless going custom watercooling? I'm not going back to full watercooling anytime soon. But still, I have the itch; for the past years I have had several SLI/3-way SLI setups, and cooling them with air can be a nightmare. Not sure what to do.


----------



## fat4l

Quote:


> Originally Posted by *xarot*
> 
> I have another 295X2 in the mail to be picked up, not sure though if I should try CF on these things for the heck of it, as even my one 295X2 gets throttled over time but it's been working very well so far. Also I'd need to buy a 1500W PSU, which is maybe a bit of a hassle to do, and in our smallish three-room apartment putting out 1000 W out of the graphics cards may heat our apartment too. LOL. Maybe it's best to stick with one 295X2 unless custom water cooled? I'm not going back to full water cooling anytime soon. But still, I have the itch, but for the past years I have had several SLI/3-way SLI setups and cooling them with air can be a nightmare. Not sure what to do.


Well, I believe you need a big case for two of them, as it's recommended to mount the 295X2's rad above the card for the best cooling performance. Also very good airflow + some good fans for those rads; Corsair SP120 Performance Edition should do the job. Push and pull for the best cooling.


----------



## xer0h0ur

Quote:


> Originally Posted by *cennis*
> 
> i have a aquacompuing nickel plexi and active backplate sealed for sale...


Well, at least it's worth asking: what do you want for both?


----------



## CloverHAL

Hey !

I'm planning on buying a new Radeon R9 295X2 for my rig. Here in France prices have dropped from 1600€ to 600€









I have one question:

My rig includes an Intel Core i7 4770K and a 750 W power supply (be quiet! Power Zone 80+ Bronze). Would it run correctly?
I've seen some reviews claiming 750 W is the absolute minimum for this card, but I'm not entirely sure the 80+ Bronze rating is OK...


----------



## joeh4384

Quote:


> Originally Posted by *CloverHAL*
> 
> Hey !
> 
> I'm planning on buying a new Radeon R9 295X2 for my rig. Here in France prices have dropped from 1600€ to 600€
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to submit here :
> 
> -My rig includes an Intel core i7 4770K and a 750W Power supply (beQuiet Power Zone 80+ Bronze). Would it run correctly ?
> I've seen some reviews claiming 750W is the absolute minimum for this card to run but I'm not entirely sure the 80+ Bronze specification is ok...


I wouldn't chance it. The card needs 56 amps on the 12 volt rail.


----------



## CloverHAL

Thanks for your quick reply









Well, a GTX 980 SLI setup seems to be the best solution out there!


----------



## xer0h0ur

LOL, that changed your tune in a split second.


----------



## MapRef41N93W

Quote:


> Originally Posted by *CloverHAL*
> 
> Hey !
> 
> I'm planning on buying a new Radeon R9 295X2 for my rig. Here in France prices have dropped from 1600€ to 600€
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to submit here :
> 
> -My rig includes an Intel core i7 4770K and a 750W Power supply (beQuiet Power Zone 80+ Bronze). Would it run correctly ?
> I've seen some reviews claiming 750W is the absolute minimum for this card to run but I'm not entirely sure the 80+ Bronze specification is ok...


750 W is not enough. I have the same 4770K at 1.27 V and a 295X2, and on a Corsair AX760 I got shutdowns.


----------



## ColeriaX

I've been happy overall with the Omega drivers and quadfire, but there's one thing I cannot sort out: Fire Strike runs like absolute shiz now. I managed to figure out how to work around PowerTune to keep my clocks from wavering during gaming, but running Fire Strike is a mess now, and I just can't figure out what the issue is: horrible stuttering, crazy core clocks from 500 MHz all the way to 1.1 GHz.

Anyone running quadfire at 4K who can share their experience, or let me know if you are seeing something similar? I know it's not a big deal really, since gaming is fine, but I sometimes enjoy the enthusiast benchmarking more than the gaming. It's weird: PowerPlay being disabled with unofficial overclocking in AB locks the clocks in games, but for whatever reason not in FS. Tried DDU'ing and reinstalling the Omega drivers, rolling back to the 14.12 beta (which worked for me before), a fresh Windows install, using RadeonPro to lock clocks, etc., etc. Thanks, gents.


----------



## fat4l

Quote:


> Originally Posted by *CloverHAL*
> 
> Hey !
> 
> I'm planning on buying a new Radeon R9 295X2 for my rig. Here in France prices have dropped from 1600€ to 600€
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to submit here :
> 
> -My rig includes an Intel core i7 4770K and a 750W Power supply (beQuiet Power Zone 80+ Bronze). Would it run correctly ?
> I've seen some reviews claiming 750W is the absolute minimum for this card to run but I'm not entirely sure the 80+ Bronze specification is ok...


750 W with other high-end (overclocked) components is not enough.

I'd suggest going with 1000 W Gold as a minimum. You don't want your PSU to run hot or at 90% of its rated wattage.

I also plugged my card in to try it out without the watercooling loop, just the card with a block on it. The card is working nicely (I was worried, as always, about damage or something); temps started at 30C and went up slowly. I shut it down at 40C. Hopefully I'll add it to my loop today.


----------



## Mega Man

Quote:


> Originally Posted by *fat4l*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *CloverHAL*
> 
> Hey !
> 
> I'm planning on buying a new Radeon R9 295X2 for my rig. Here in France prices have dropped from 1600€ to 600€
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to submit here :
> 
> -My rig includes an Intel core i7 4770K and a 750W Power supply (beQuiet Power Zone 80+ Bronze). Would it run correctly ?
> I've seen some reviews claiming 750W is the absolute minimum for this card to run but I'm not entirely sure the 80+ Bronze specification is ok...
> 
> 
> 
> 750w with other high end components(OCed) is not nuff.
> 
> 
> I'd suggest to go with 1000W gold as a minimum. U dont want ur psu to run hot or at 90% of its wattage.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I also plugged my card to try it out w/o watercooling, just the card with a block on it. The card is working nicely(I was worried as always of any damages or something
> 
> 
> 
> 
> 
> 
> 
> ), temps started on 30C and were going up slowly. I shut it down at 40C. Hopefully I;ll add it to my wc today.
Click to expand...









Quote:


> Originally Posted by *Phaedrus2129*
> 
> These enthusiast power supplies we all have (high-end Corsair, Antec, SeaSonic, Enermax, Silverstone, etc.) can take a lot more abuse than you think. I have tell people this a lot. These are heavy duty, precision engineered electron pushers, not tinker toys.
> 
> While you can't trust your average cheap or OEM power supply so much, and you can't trust a generic as far as you can throw it... A high-end enthusiast grade power supply is engineered with massive safety margins. Take the Corsair VX550. That's a CWT PSH, and Rocketfish/Best Buy took that same PSU (w/ minor modifications, nothing important) and rated it as a 700W _and it can hold that rating within ATX specs_. _*
> 
> When you buy a high-end PSU you aren't just buying reliability and performance, you're buying your headroom right there. Some people say, "Well the TX750 can push 900W, so when I buy it I'm getting a 900W PSU". That defeats the purpose. The purpose is that you can treat it as a 750W PSU and draw that amount from it long-term, for extended periods, while a cheap 750W might eventually break. That's part of the reason for buying a high-end PSU, instead of something just adequate, like an OCZ ModXStream or a Rosewill RV2 or a CM Silent Pro.*_
> 
> When you buy a high end enthusiast power supply, especially one that I can vouch for, you should know that you're buying into more than just a name. You're buying a machine, and one that's a lot tougher than the typical dreck you might have used before. So don't be afraid to use it for what it's intended for. Forget about "extra headroom", _*forget about babying your PSU or keeping some massive unnecessary safety margin.*_ Go ahead, get that second Fermi and go wild. Most of you have already paid for that ability, so make the most of it.


Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CloverHAL*
> 
> Hey !
> 
> I'm planning on buying a new Radeon R9 295X2 for my rig. Here in France prices have dropped from 1600€ to 600€
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have one question to submit here :
> 
> -My rig includes an Intel core i7 4770K and a 750W Power supply (beQuiet Power Zone 80+ Bronze). Would it run correctly ?
> I've seen some reviews claiming 750W is the absolute minimum for this card to run but I'm not entirely sure the 80+ Bronze specification is ok...
> 
> 
> 
> 750W is not enough. I have the same 4770k at 1.27v and 295x2 and on a Corsair AX760 I got shutdowns.
Click to expand...

80+ Bronze is the power efficiency from the wall; it has nothing to do with the wattage delivered. You may be able to pull it off at stock.
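To make that distinction concrete, here is a minimal sketch. The 0.85 efficiency figure is an illustrative assumption for an 80+ Bronze unit around half load, not a be quiet! spec:

```python
# 80 Plus ratings describe AC-to-DC conversion efficiency, not capacity.
# A PSU's rated wattage is the DC power it can deliver; efficiency only
# determines how much extra is pulled from the wall and lost as heat.

def wall_draw(dc_load_w, efficiency=0.85):
    """AC power drawn from the outlet for a given DC load (assumed efficiency)."""
    return dc_load_w / efficiency

dc_load = 610  # e.g. ~500 W card + ~110 W CPU at stock
print(round(wall_draw(dc_load)))  # 718 (watts pulled from the wall)
# A 750 W unit still only has to *deliver* 610 W here; the Bronze
# rating doesn't reduce that 750 W capacity.
```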


----------



## sporti

Please, I need your help!!

Can I attach 2 Akasa Apache fans (with a Y-adapter) to the original fan port on the 295X2?

The Akasa Apache's power consumption is max 0.33 A, so the fan connector would be loaded with 0.66 A, roughly 8 W... Is that okay?

Anybody here who has two fans (with similar consumption and a Y-cable) connected to the 295X2 card?
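For what it's worth, the arithmetic in the question checks out. A minimal sketch, assuming a 12 V fan header; the 1.0 A header limit below is a guessed typical rating, not an AMD spec, so verify it for the 295X2 before relying on it:

```python
# Will two fans on a Y-adapter overload the card's fan header?
FAN_MAX_A = 0.33        # Akasa Apache max draw, per fan
HEADER_LIMIT_A = 1.0    # ASSUMPTION: typical header rating, not an AMD spec

total_a = 2 * FAN_MAX_A    # combined draw on the single header
total_w = total_a * 12.0   # at 12 V, roughly the 8 W stated above
within_limit = total_a <= HEADER_LIMIT_A

print(round(total_a, 2), round(total_w, 2), within_limit)  # 0.66 7.92 True
```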


----------



## CloverHAL

Well, I guess since I have the money to purchase this new card, I should be able to upgrade my PSU too, if I discover that the R9 295X2 is too power-hungry to run with my current 750 W PSU.

What do you think about it?


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 80+bronze is the power eff from the wall has nothing to do with wattage delivered. you may be able to pull it off stock .


Quote:


> Originally Posted by *MapRef41N93W*
> 
> 750W is not enough. I have the same 4770k at 1.27v and 295x2 and on a Corsair AX760 I got shutdowns.


This card is hungry. 850 W is the minimum requirement (branded PSU). If you run an OC'd system + OC'd card + possibly watercooling, then I'd recommend 1000 W+ to have a bit more headroom. Simple as that.


----------



## houssem89

Hello, sorry for my bad English, but I have a big problem. In the Fire Strike benchmark I get a 1498 total score and a 22653 GPU score, but in-game I get 62 FPS in Crysis 3 at full HD, 35 FPS in AC Unity at full HD, and 55 in Dragon Age: Inquisition at full HD. It's like my old R9 290X Tri-X, so I think it's working on only one GPU, but in GPU-Z the first and second GPU both load at 100%.
I have an ASRock Extreme4, an i7 3770K, 8 GB of 1866 MHz RAM, and an Antec 850 W, so does someone have an idea and can help me?

(((((((((((((


----------



## CloverHAL

Quote:


> Originally Posted by *joeh4384*
> 
> I wouldn't chance it. The card needs 56 amps on the 12 volt rail.


My power supply can output 62 A on the single 12 V rail, according to be quiet!'s website: http://www.bequiet.com/fr/powersupply/386
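As a quick sanity check on those rail figures (a rough sketch of the numbers in this thread; transient load spikes are part of why people still advise extra headroom):

```python
# Convert 12 V rail current ratings into wattage and compare them.
def rail_watts(amps, volts=12.0):
    return amps * volts

psu_capacity = rail_watts(62)   # this PSU's single 12 V rail
card_minimum = rail_watts(56)   # the commonly cited 56 A requirement

print(psu_capacity, card_minimum, psu_capacity - card_minimum)
# 744.0 672.0 72.0 -> only ~72 W of 12 V margin left for everything else
```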


----------



## Syceo

Quote:


> Originally Posted by *CloverHAL*
> 
> Thanks for your quick reply
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, a GTX 980 SLI seems to be the best solution out there !


Or even 970 SLI, and save yourself even more money.


----------



## fat4l

Quote:


> Originally Posted by *CloverHAL*
> 
> My power supply can output 62A on the single 12V rail according to BeQuiet's website ; http://www.bequiet.com/fr/powersupply/386


Don't forget you need to power the other components, mate.

Like the other user said, he has an AX760 with 63 A on the 12 V rail... and still got shutdowns.

Check OcUK; they have some great promotions on power supplies: a 1000 W Super Flower Platinum for £125, a 1000 W Gold EVGA SuperNOVA G1 for £110, and a G2 for £120.


----------



## CloverHAL

Yes, I think you guys are right, but he said his 4770K has a pretty heavy overclock... I can't rule out that affecting the peak power consumption of the whole system. Maybe I'm wrong, but my stock i7 4770K peaks around 110 W, and once at 4 GHz it hits 140/150 W.

What a hell of a choice here...


----------



## fat4l

Quote:


> Originally Posted by *CloverHAL*
> 
> Yes I think you guys are right but he said his 4770K have a pretty heavy overclocking...I can't tell myself it don't has any flaws on the peak power consumption of the whole system. Maybe I'm wrong but my stock i7 4770K runs at a peak 110W and once at 4GHz it hits the 140/150W
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What a hell of a choice here...


What about the CPU bottlenecking the GPU? I believe overclocking the CPU is mandatory with this card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CloverHAL*
> 
> Yes I think you guys are right but he said his 4770K have a pretty heavy overclocking...I can't tell myself it don't has any flaws on the peak power consumption of the whole system. Maybe I'm wrong but my stock i7 4770K runs at a peak 110W and once at 4GHz it hits the 140/150W
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What a hell of a choice here...
> 
> 
> 
> what about cpu bottlenecking the gpu ? I believe that overclocking the cpu is mandatory with this card.
Click to expand...

Ummm... no.

I mean, if you plan on running 1080p then sure, but at 1440p and above you'll be fine.


----------



## houssem89

To anyone who has an FPS problem in games but a good Fire Strike score: change the PSU. I had an 850 W Antec and got a good Fire Strike score, but in games it was like one GPU. After I bought a 1000 W Antec, everything works perfectly, games and benchmarks alike, same as results on the internet: Crysis 3, Dragon Age: Inquisition, Tomb Raider; only AC Unity is bad. So if you want to buy an R9 295X2, you should buy a minimum 1000 W PSU.


----------



## Mega Man

Quote:


> Originally Posted by *CloverHAL*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> I wouldn't chance it. The card needs 56 amps on the 12 volt rail.
> 
> 
> 
> My power supply can output 62A on the single 12V rail, according to BeQuiet's website: http://www.bequiet.com/fr/powersupply/386
Click to expand...

Quote:


> Originally Posted by *CloverHAL*
> 
> Yes, I think you guys are right, but he said his 4770K has a pretty heavy overclock... I can't convince myself it won't cause problems with the peak power consumption of the whole system. Maybe I'm wrong, but my stock i7 4770K peaks at 110W, and once at 4GHz it hits 140/150W
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What a hell of a choice here...


Personally, I'd say just try it. If you are willing to wait, I'll put my 750W in and test most of the system for you; I'll try this weekend. But yeah, you should be fine.

110W + 500W (max at stock for the 295x2) = 610W, which leaves you 140W for everything else. At stock you should be fine!
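The headroom arithmetic above can be sketched quickly; the wattages are this thread's rough estimates (~110W for a stock 4770K, ~500W max for a stock 295x2), not measured values:

```python
# Rough PSU headroom check for a 295x2 build; figures are this thread's
# estimates, not measurements: ~110W stock 4770K, ~500W max stock 295x2.
def psu_headroom(psu_watts, cpu_watts=110, gpu_watts=500):
    """Watts left over for the board, drives, fans, etc."""
    return psu_watts - (cpu_watts + gpu_watts)

print(psu_headroom(750))  # 140 -> about 140W spare at stock
```

Add your own overclocking margin on top of the CPU and GPU figures before trusting the result.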
Quote:


> Originally Posted by *houssem89*
> 
> To anyone who has an FPS problem in games despite a good Fire Strike score: change the PSU. I had an 850W Antec and got a good Fire Strike score, but in games it ran like one GPU. After I bought a 1000W Antec, everything worked perfectly, matching the benchmarks online, in Crysis 3, Dragon Age: Inquisition and Tomb Raider; the only bad one was AC Unity. So if you want to buy an R9 295x2, you must buy at least a 1000W PSU.


" i say so so it must be true "

That falls under the "trust me, I'm an engineer" category.


----------



## CloverHAL

Well, thanks a lot. I'll wait until you tell me if it works.


----------



## sporti

Quote:


> Originally Posted by *sporti*
> 
> Please I need your help !!
> 
> Can I attach 2 Akasa Apache fans (with a y-adapter) to the original fan port on the 295x2?
> 
> The Akasa Apache's power consumption is max. 0.33A, so the fan connection would be loaded with
> 0.66A, roughly 8 watts... Is that okay?
> 
> Is anybody here running two fans (with similar consumption and a y-cable) connected to
> the 295x2 card?


Nobody ....??


----------



## Mega Man

no one here knows


----------



## sporti

Is there nobody here who has connected TWO radiator fans (with a y-cable) to the 295x2 card???


----------



## electro2u

I used a y-splitter to connect two 3-pin Noctua fans to mine when I first got it


----------



## sporti

Quote:


> Originally Posted by *electro2u*
> 
> I used a y-splitter to connect two 3-pin Noctua fans to mine when I first got it


What type of Noctua fans? Any problems with two fans connected?
I'm worried about the power consumption of two fans on the card's fan port...

The two pumps are connected to the same port. So the pumps, rated at about 3.1W (6.2W for both), plus two fans (8W),
give you roughly 15W (1.3A) on the card's connector...?
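Working the quoted figures through (pump and fan ratings as stated above; a 12V feed is assumed), the total comes out just under the 15W estimate:

```python
# Estimated load on the card's fan header, using the ratings quoted above:
# two pumps at ~3.1W each, two fans at ~0.33A each on an assumed 12V feed.
VOLTS = 12.0

pumps_w = 2 * 3.1            # 6.2W for both pumps
fans_w = 2 * 0.33 * VOLTS    # ~7.9W for both fans
total_w = pumps_w + fans_w   # ~14.1W total
total_a = total_w / VOLTS    # ~1.18A total

print(round(total_w, 1), round(total_a, 2))
```

Whether the header can actually supply that much is a separate question; the fan ratings are maximums, so real draw will usually be lower.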


----------



## electro2u

NF-P12s. They don't consume much power. They worked fine on a y-splitter


----------



## fat4l

So I plugged it in. It all looks good.
However, I feel my system lacks radiator capacity, which is why I have everything ready for the Mo-Ra3 420. See the back of the case; the quick disconnects are ready.

When I run FurMark the GPU temps reach 51C max, but the water temp is really high at 35C...

Also, the backplate is really hot, not sure why. I'll measure the temp now...







EDIT://

So I measured the backplate temp above one of the cores; it's about 45-50C...

But it's really hot when you touch it. Is 50C really that hot to the touch? Or is my temperature meter playing tricks on me?


----------



## xer0h0ur

Aren't the VRMs rated to go above 100C? I doubt you're beyond acceptable temperature levels.


----------



## Gilles3000

Quote:


> Originally Posted by *fat4l*
> 
> Also the backplate is reeallllly hot, not sure why.
> 
> But its really hot when u touch it. Is 50C rlyu hot to touch ? or is my temperature meter playing me ?


Don't worry, it's supposed to get hot; that's what it's for. I can fry eggs on my 290s when overclocked under load.

To get your VRM temps, just download GPU-Z and look at the Sensors tab. You should expect anywhere from 50°C to 80°C under heavy load when watercooled. They are rated to get much hotter though, so don't worry unless they're over 100°C.


----------



## fat4l

Quote:


> Originally Posted by *Gilles3000*
> 
> Don't worry, it's supposed to get hot; that's what it's for. I can fry eggs on my 290s when overclocked under load.
> 
> To get your VRM temps, just download GPU-Z and look at the Sensors tab. You should expect anywhere from 50°C to 80°C under heavy load when watercooled. They are rated to get much hotter though, so don't worry unless they're over 100°C.


okay then







I thought the temp would be lower, but I actually never touched the original backplate while it was installed, so I'm not sure how hot that one got.

I tried GPU-Z and it doesn't show VRM temps for my 295x2.
Is it really showing them for anyone with the same card? Mine is made by MSI, BIOS flashed to Sapphire OC.

owww


----------



## electro2u

Quote:


> Originally Posted by *fat4l*
> 
> okay then
> 
> 
> 
> 
> 
> 
> 
> I thought the temp would be lower, but I actually never touched the original backplate while it was installed, so I'm not sure how hot that one got.
> 
> I tried GPU-Z and it doesn't show VRM temps for my 295x2.
> Is it really showing them for anyone with the same card? Mine is made by MSI, BIOS flashed to Sapphire OC.
> 
> owww


There is no way to monitor VRM temps on the 295... seems by design if you ask me lol

The testing that was done made it clear on the 290x that the active backplate helps a lot. However:
Additional cooling (motherboard VRMs, PCH tend to have very small surface area to work with) on loops with low to moderate radiator capacity is the first thing to suffer, apparently. If you don't have enough cooling capacity to keep the coolant at a sufficient Delta T, then the coolant itself won't be at a low enough temperature to have a major impact on those components. On the one hand, the coolant might not get heated much by a motherboard VRM, but on the other, the VRM may get hotter than it would on air cooling.
There's no doubt the backplate of your 295x2 would get even hotter by some amount if you didn't have the active backplate on it, because the alternative is simply a backplate with no heatpipe.

To put it in perspective--I had a loop design made up when I first got my 295x2 set up for trifire where the return tube that left my GPUs and went to my first pump was actually *between* the 2 graphics cards:

The tube above the 295x2 in this photo actually started to melt while Folding one day. It had been folding for several hours, and the backplate was so hot it was melting the tubing. It didn't melt through, but it would have eventually if I hadn't been looking for it.
These tubes are rated for coolant temps up to 60C, so the melting point is obviously higher than that; probably closer to 70C/158F-80C/176F, and that is hot enough to burn you if you leave your finger on it for long.
Some testing results from XtremeRigs.com:
















http://www.xtremerigs.net/2014/01/01/r9-290x-gpu-block-performance-summary/


----------



## fat4l

thx for the input!









I've been testing the rig several times now,
looping Unigine Valley for half an hour.
The backplate temp was about ~40C (using a type-K thermometer).
The coolant temp was about 33-34C and core temps were about ~50C.
I think that's okay considering the high coolant temp. (I'm using 2x EK-CoolStream PE 240mm rads, push-pull, Corsair SP120s on them, CPU + MB in the loop.)

Soon I will add the Mo-Ra3 420 with push-pull 230mm Bitfenix Spectre Pro fans. I believe this will destroy the temps!


----------



## eqc6

Ok, guys I need your expertise and advice!
Right now I have a trifire setup using a 295x2 and a reference 290x. My motherboard is an Asrock Extreme 4 gen 3 with a 2600k cpu.
Now, my motherboard has TWO PCI Express 3.0 x16 slots, BUT since I have a Sandy Bridge CPU, they're only running at PCIe 2.0.
When I do a render test, my cards are running at PCIe 2.0 at 8x/8x/8x.

Question is, am I being bottlenecked and should I upgrade my cpu to Ivybridge so I can utilize my PCI 3.0 slots? Will I see a huge improvement?

I ask because if I NEED to be at PCI 3.0, I don't want to spend money just for the ivybridge cpu upgrade and would rather get a new mobo/cpu combo.

Any advice would be appreciated!

edit: and to add, I game at 4k as well

Thanks!


----------



## electro2u

PCIe 2.0 x8 shouldn't be much different from PCIe 3.0 x8. Maybe 10-15fps over 3 GPUs.
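For rough context, the theoretical per-direction link bandwidth backs this up (standard PCIe per-lane rates; real-world throughput is somewhat lower):

```python
# Approximate per-direction PCIe bandwidth in GB/s.
# Gen2: 5 GT/s with 8b/10b encoding   -> 0.5 GB/s per lane.
# Gen3: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane.
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 8 * 128 / 130 / 8}

def link_bandwidth(gen, lanes):
    """Theoretical one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("2.0", 8))  # 4.0 GB/s
print(link_bandwidth("3.0", 8))  # ~7.9 GB/s
```

So 3.0 x8 carries roughly twice what 2.0 x8 does, but games rarely saturate either per frame, which is why the FPS difference stays small.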


----------



## houssem89

Quote:


> Originally Posted by *eqc6*
> 
> Ok, guys I need your expertise and advice!
> Right now I have a trifire setup using a 295x2 and a reference 290x. My motherboard is an Asrock Extreme 4 gen 3 with a 2600k cpu.
> Now, my motherboard has TWO PCI Express 3.0 x16 slots, BUT since I have a Sandy Bridge CPU, they're only running at PCIe 2.0.
> When I do a render test, my cards are running at PCIe 2.0 at 8x/8x/8x.
> 
> Question is, am I being bottlenecked and should I upgrade my cpu to Ivybridge so I can utilize my PCI 3.0 slots? Will I see a huge improvement?
> 
> I ask because if I NEED to be at PCI 3.0, I don't want to spend money just for the ivybridge cpu upgrade and would rather get a new mobo/cpu combo.
> 
> Any advice would be appreciated!
> 
> edit: and to add, I game at 4k as well
> 
> Thanks!


My friend, I had the same motherboard and swapped it for an ASRock P67 Pro3 SE, which is PCIe 2.0 only, and I swear there is no difference between the two motherboards. I have an i7 3770K.


----------



## ramos29

I changed my CPU and bought an i7 4790K; I am using a ''simple'' air cooler.
I am getting 75°C, sometimes 80°C. Is this temp normal?
I kept the CPU at its original 4GHz speed.


----------



## ramos29

Quote:


> Originally Posted by *eqc6*
> 
> Ok, guys I need your expertise and advice!
> Right now I have a trifire setup using a 295x2 and a reference 290x. My motherboard is an Asrock Extreme 4 gen 3 with a 2600k cpu.
> Now, my motherboard has TWO PCI Express 3.0 x16 slots, BUT since I have a Sandy Bridge CPU, they're only running at PCIe 2.0.
> When I do a render test, my cards are running at PCIe 2.0 at 8x/8x/8x.
> 
> Question is, am I being bottlenecked and should I upgrade my cpu to Ivybridge so I can utilize my PCI 3.0 slots? Will I see a huge improvement?
> 
> I ask because if I NEED to be at PCI 3.0, I don't want to spend money just for the ivybridge cpu upgrade and would rather get a new mobo/cpu combo.
> 
> Any advice would be appreciated!
> 
> edit: and to add, I game at 4k as well
> 
> Thanks!


I had a 2600 and a PCIe 2.0 motherboard. I thought I was OK until yesterday, when I changed my CPU and found that I was bottlenecked, but I am not sure whether it was my CPU or my motherboard holding my GPU back.


----------



## fat4l

Quote:


> Originally Posted by *ramos29*
> 
> I changed my CPU and bought an i7 4790K; I am using a ''simple'' air cooler.
> I am getting 75°C, sometimes 80°C. Is this temp normal?
> I kept the CPU at its original 4GHz speed.


Check the voltage under load. I know that with older BIOSes (as supplied on new motherboards), boards were overvolting these new CPUs, even up to 1.5V.

Check the voltage and update your motherboard BIOS to the newest one.


----------



## ramos29

Thx, I launched Live Update and it tells me I have the latest version of the BIOS :/


----------



## eqc6

Quote:


> Originally Posted by *ramos29*
> 
> I had a 2600 and a PCIe 2.0 motherboard. I thought I was OK until yesterday, when I changed my CPU and found that I was bottlenecked, but I am not sure whether it was my CPU or my motherboard holding my GPU back.


I'm assuming you upgraded to i7-3770k? How much improvement did you get? Was it worth the $300 upgrade?


----------



## ramos29

Quote:


> Originally Posted by *eqc6*
> 
> I'm assuming you upgraded to i7-3770k? How much improvement did you get? Was it worth the $300 upgrade?


No, I switched to the 4790K. In-game I noticed stable frame rates in Battlefield 4 at 4K (a thing I could not get with the 2600). In Far Cry I had variable FPS with the 2600, but now it's stable (60 most of the time).
In other tasks (WinZip...) I did not see a big difference.


----------



## steezebe

Feel free to add me to the owners list! Currently just testing at the moment...

Switched to this from a GTX 750Ti... the difference is immense.



I'll be back here later to look at overclocking stuff!


----------



## HunterKiller-x-

Ordered one on Saturday through Overclockers, along with an EVGA G2 1300. I must have clicked refresh on the 'your orders' section 100 times today...

It only went from 'in queue at warehouse' to 'printed at warehouse'.

Won't be getting it tomorrow then... Sigh...

I'm upgrading from a GTX 680 @ 1080p @ 60Hz to 4K.

Also, I have an i7 3770K; I assume the generational jump was not big enough to justify upgrading to socket 1150?


----------



## kalijaga

Hi HunterKiller,

Good luck with the new card.
The 3770K should be good enough, and if not, overclock it a bit.


----------



## fat4l

Here she comes...The Beauty herself








(MORA3 420 + 8x Bitfenix Spectre Pro 230mm)



Now I need to wait for hoses and fans...









I'm looking forward to seeing the cooling performance of this Beast!
................295X2 needs it!


----------



## electro2u

Quote:


> Originally Posted by *fat4l*
> 
> Here she comes...The Beauty herself
> 
> 
> 
> 
> 
> 
> 
> 
> (MORA3 420 + 8x Bitfenix Spectre Pro 230mm)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now I need to wait for hoses and fans...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm looking forward to seeing the cooling performance of this Beast!
> ................295X2 needs it!


What pump(s) are you using? Is that going to be the only rad? Just curious, not critiquing.


----------



## magicase

I'm having an issue where I can only see 1 GPU in GPU-Z. I have reinstalled the driver and it still won't show up.

Does anyone know what is happening?


----------



## fat4l

Quote:


> Originally Posted by *electro2u*
> 
> What pump(s) are you using? Is that going to be the only rad? Just curious, not critiquing.


I'm using dual EK DDCs in series, each with an X-Top + EK nickel housing.

I have 2 additional rads, 2x240mm EK Coolstream PE, push-pull.












Quote:


> Originally Posted by *magicase*
> 
> I'm having an issue where I can only see 1 GPU in GPUz. I have reinstalled the driver and it still won't show up on.
> 
> Does anyone know what is happening?


You have to disable ULPS to see both.
(With ULPS enabled, the second GPU is turned off when not needed, to save energy.)

You can use MSI Afterburner to do that; there are other programs that can do it as well.
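For anyone curious what those tools actually toggle: ULPS is an `EnableUlps` DWORD under the display-adapter class keys in the Windows registry. A minimal sketch of the idea (Windows-only, needs admin rights, registry edits at your own risk; the GUID is the standard display-adapter device class):

```python
# Sketch: disable ULPS by writing EnableUlps=0 to each display-adapter
# subkey, which is what tools like MSI Afterburner do behind the scenes.
# Windows-only and requires admin rights; back up your registry first.
import sys

DISPLAY_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_value(disable=True):
    """Value EnableUlps should hold: 0 disables ULPS, 1 enables it."""
    return 0 if disable else 1

def disable_ulps():
    import winreg  # stdlib, Windows-only
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as cls:
        i = 0
        while True:
            try:
                name = winreg.EnumKey(cls, i)  # "0000", "0001", ...
            except OSError:
                break  # no more subkeys
            i += 1
            try:
                with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                                    DISPLAY_CLASS + "\\" + name, 0,
                                    winreg.KEY_SET_VALUE) as key:
                    winreg.SetValueEx(key, "EnableUlps", 0,
                                      winreg.REG_DWORD, ulps_value(True))
            except OSError:
                pass  # skip subkeys that aren't writable adapter entries

if __name__ == "__main__" and sys.platform == "win32":
    disable_ulps()
```

A reboot (or driver restart) is needed before GPU-Z will show the second GPU.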


----------



## schujd3

I've had my XFX R9 295x2 for about 3 weeks now and I love the card. I'm running an Asus Sabertooth 990FX AMD motherboard with an 8320 CPU overclocked to 4.4GHz. I have a 360mm radiator, an Alphacool D5 pump and a Swiftech Apogee XL waterblock.


----------



## Ranma13

I'm interested in picking up a 295x2, and I found this VisionTek card on TigerDirect for $900:

http://www.tigerdirect.com/applications/SearchTools/search.asp?keywords=VisionTek+Radeon+R9+295X2

Does anybody know why it's selling for $900 instead of $1000 for the XFX version? As far as I can tell, all the 295x2's are just rebranded reference cards, but the VisionTek one adds some kind of grill over the center fan. Is it worth getting the VisionTek over the XFX, or should I shell out the extra $100 for the XFX version?


----------



## electro2u

The only real difference besides the fan treatment is the warranty period.


----------



## schujd3

Check Newegg first!!!! They're cheaper, or at least they were; I paid less than $700 for mine on Black Friday!


----------



## schujd3

Just checked; unfortunately they aren't on sale anymore. I would wait, to be honest: they should be on sale again soon. I definitely had to get one at that price, and I have the XFX version. As someone said before, these vendors are supplied the hardware by AMD; they put their brand on it, write their own BIOS (I believe), and some have slightly different clock speeds. I noticed a big difference in temperatures once I got the 990FX motherboard and watercooled the CPU. At stock CPU/GPU speeds at 4K I've seen the card hit 71C; that was with ultra settings and V-sync enabled in Shadow of Mordor. Now with this overclock the highest I've seen is 63C, with an ambient temp in my house of 74F.


----------



## fat4l

Since I wanted my 295x2 to be "single-slot" all over, I bought the EK-VGA I/O bracket HD7990 SE to see if it fits the 295X2.

I can confirm that it fits!









I have just swapped the stock one for the EK 7990 one. It looks awesome








(If anyone wonders: it's black nickel-plated steel and it shines nicely)


----------



## xer0h0ur

Yeah I need to order a replacement for mine. I managed to stain mine with a coolant leak.


----------



## electro2u

xer0 did you get my PM?


----------



## xer0h0ur

Yeah I got your PM, sorry I didn't respond. Not sure what I want to do at the moment.


----------



## electro2u

No worries. I wouldn't recommend the heat-pipe backplates for trifire. The replacement terminals are much larger, which makes connecting them very difficult because the ports don't line up. I sent mine back.


----------



## xer0h0ur

Heard Dat


----------



## electro2u

The only reason I would like to get rid of the copper blocks I have is that everything else I'm working with is nickel-plated... so now I'd like to switch to the AC nickel blocks. But it's not that big a deal.


----------



## xer0h0ur

Yeah I like the nickel plated blocks more than the copper blocks myself.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I like the nickel plated blocks more than the copper blocks myself.




I still think it looks OK. I got the VGA blocks before any of the rest of the stuff and would have stuck with copper, but they don't make the Monoblock for the motherboard in anything but nickel.


----------



## xer0h0ur

Yeah, I am just saying the nickel block looks better in my case along with the rest of my chrome/nickel components. The only reason I am not a huge fan of the copper blocks is the need to clean them, or else they develop a patina. I know some people like patinas, but it's not my thing. They still look great though. I do like the look of copper when it's nice and shiny.


----------



## cennis

Quote:


> Originally Posted by *electro2u*
> 
> 
> 
> I still think it looks OK. I got the VGA blocks before any of the rest of the stuff and would have stuck with copper, but they don't make the Monoblock for the motherboard in anything but nickel.


build looks great man.


----------



## electro2u

Quote:


> Originally Posted by *cennis*
> 
> build looks great man.


Thanks!








Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah, I am just saying the nickel block looks better in my case along with the rest of my chrome/nickel components. The only reason I am not a huge fan of the copper blocks is the need to clean them, or else they develop a patina. I know some people like patinas, but it's not my thing. They still look great though. I do like the look of copper when it's nice and shiny.


Well I knew what to expect with regard to patina, which is why I got the black acrylic. It's easy enough to clean the outside of the block but cracking it open is a pain on these. I use some metal polish on them if I tear down the loop and only handle them with gloves.
Nickel dulls and tarnishes over time too... But it's a lot more resistant to being touched.


----------



## Sgt Bilko

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Yeah I like the nickel plated blocks more than the copper blocks myself.
> 
> 
> 
> 
> 
> I still think it looks OK. I got the VGA blocks before any of the rest of the stuff and would have stuck with copper, but they don't make the Monoblock for the motherboard in anything but nickel.
Click to expand...

That's a sexy-looking rig, mate


----------



## Cool Mike

Managed to pick up an XFX 295x2 for $749.99 one day before the price increase. Will receive it Saturday.


----------



## silencespr

Quote:


> Originally Posted by *Cool Mike*
> 
> Managed to pick up an XFX 295x2 for $749.99 one day before the price increase. Will receive it Saturday.


Got mine for $709 with a $30 rebate. I was thinking too long; it was $650 the day before, also with a $30 rebate.


----------



## Ranma13

Quote:


> Originally Posted by *silencespr*
> 
> Got mine for $709 with a $30 rebate. I was thinking too long; it was $650 the day before, also with a $30 rebate.


You lucky bastards







. I had to drop $900 for one...


----------



## electro2u

$1500...


----------



## jsheradin

I just got my XFX 295x2 but have been having some issues with the radiator fan: it just doesn't kick on. Under load, the card goes straight up to 75C, throttles down to 300MHz and stays there. The fan makes no attempt to move. I swapped it out for a known-working Noctua NF-F12 Industrial with no effect. My multimeter reads 0.00V on the fan header. The drivers are the latest Catalyst Omega ones, downloaded via the AMD auto-detect app on their site, and the BIOS on the card is whatever it came with (I can't find how to update it anywhere). Is there anything I can do besides RMA? Please help!


----------



## joeh4384

Quote:


> Originally Posted by *jsheradin*
> 
> I just got my XFX 295x2 but have been having some issues with the radiator fan: it just doesn't kick on. Under load, the card goes straight up to 75C, throttles down to 300MHz and stays there. The fan makes no attempt to move. I swapped it out for a known-working Noctua NF-F12 Industrial with no effect. My multimeter reads 0.00V on the fan header. The drivers are the latest Catalyst Omega ones, downloaded via the AMD auto-detect app on their site, and the BIOS on the card is whatever it came with (I can't find how to update it anywhere). Is there anything I can do besides RMA? Please help!


Maybe you could take the shroud off and check if the header is plugged into the card or run the noctua fan from a mobo header.


----------



## doctakedooty

Hey, where can I find a BIOS for my 295x2? It's an XFX. I know there were some custom ones floating around.


----------



## ColeriaX

TechPowerUp has the BIOS. However, most here have flashed the Sapphire OC BIOS found earlier in this thread.


----------



## silencespr

Quote:


> Originally Posted by *ColeriaX*
> 
> TechPowerUp has the BIOS. However, most here have flashed the Sapphire OC BIOS found earlier in this thread.


Any chance you can find a link? And what's the advantage? My XFX can't overclock for **** right now.


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> Any chance you can find a link? And what's the advantage? My XFX can't overclock for **** right now.


here
http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/760
Quote:


> Originally Posted by *joeh4384*
> 
> 295x2OCBioses.zip 72k .zip file


You will have these parameters as defaults:
http://www.sapphiretech.com/presentation/product/product_index.aspx?cid=1&gid=3&sgid=1227&pid=2293&psn=000101&lid=1
1030 MHz core clock
5200 MHz effective memory clock


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> here
> http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/760
> You will have these parameters as defaults:
> http://www.sapphiretech.com/presentation/product/product_index.aspx?cid=1&gid=3&sgid=1227&pid=2293&psn=000101&lid=1
> 1030 MHz core clock
> 5200 MHz effective memory clock


Thx, I already found it =D Now I need to find the easiest way to flash the BIOS, as I have never done it...


----------



## ColeriaX

My XFX is a poor clocker as well. My PowerColor is a beast. Just Google "techpowerup bios"; I'd link it, but doing stuff on my phone is annoying. The Sapphire BIOS gives the card a factory OC; while not significant, it seemed to help with clocking a bit higher. Maybe it modifies the stock volts, but I'm not certain.


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> Thx, I already found it =D Now I need to find the easiest way to flash the BIOS, as I have never done it...


Use this:
http://forums.overclockers.co.uk/showthread.php?t=18558655
or
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
(Don't use all the steps, like "download 7990 bios" etc.; that makes sense, right?)
(Download the latest versions of those programs if possible.)
(When you copy atiflash to your USB stick, make sure it's called atiflash.exe, not atiflash_417.exe.)
(Don't forget to set the BIOS switch on the GPU to position "1".)
(You only need to flash one BIOS position, but you need to flash two BIOSes: master and slave.)
(Use something like:
atiflash -f -p 0 OC.rom (this flashes the master GPU)
atiflash -f -p 1 OC2.rom (this flashes the slave GPU)
if you name your BIOSes OC.rom and OC2.rom.)
(Don't mix up the BIOSes; go slowly, step by step.)


----------



## steezebe

Quote:


> Originally Posted by *fat4l*
> 
> Use this:
> http://forums.overclockers.co.uk/showthread.php?t=18558655
> or
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> (Don't use all the steps, like "download 7990 bios" etc.; that makes sense, right?)
> (Download the latest versions of those programs if possible.)
> (When you copy atiflash to your USB stick, make sure it's called atiflash.exe, not atiflash_417.exe.)
> (Don't forget to set the BIOS switch on the GPU to position "1".)
> (You only need to flash one BIOS position, but you need to flash two BIOSes: master and slave.)
> (Use something like:
> atiflash -f -p 0 OC.rom (this flashes the master GPU)
> atiflash -f -p 1 OC2.rom (this flashes the slave GPU)
> if you name your BIOSes OC.rom and OC2.rom.)
> (Don't mix up the BIOSes; go slowly, step by step.)


Thanks for that!

The first link you provided states the 7990 runs stable at 1.3V; for safe overclocking and avoiding damage to the card, is there a typical OC voltage for the 295x2?


----------



## fat4l

Quote:


> Originally Posted by *steezebe*
> 
> Thanks for that!
> 
> The first link you provided states the 7990 runs stable at 1.3V; for safe overclocking and avoiding damage to the card, is there a typical OC voltage for the 295x2?


That's what I'm trying to find out








I'm reading this thread from the beginning; currently at page 221








On stock cooling I wouldn't add much voltage, or maybe none at all. With watercooling and good temps I would go up to +200mV, but I'm not sure yet


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> That's what I'm trying to find out
> 
> 
> 
> 
> 
> 
> 
> 
> I'm reading this thread from the beginning; currently at page 221
> 
> 
> 
> 
> 
> 
> 
> 
> On stock cooling I wouldn't add much voltage, or maybe none at all. With watercooling and good temps I would go up to +200mV, but I'm not sure yet


I'm on the stock cooler, but my rig is next to an open window; it's so cold my GPU won't even hit 60C under full load.


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> I'm on the stock cooler, but my rig is next to an open window; it's so cold my GPU won't even hit 60C under full load.


Then +100mV is the minimum I would go. I can't really say what it will do with temps, as I'm still waiting for fans/hoses to plug my Mo-Ra3 into the loop. I can't do any testing now with my current rad setup









edit://
Forgot to say: VRM temps are very important on this card. If you have a K-type probe it would be great to keep an eye on them; keep them as cool as possible and try to stay below 100C


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> Then +100mV is the minimum I would go. I can't really say what it will do with temps, as I'm still waiting for fans/hoses to plug my Mo-Ra3 into the loop. I can't do any testing now with my current rad setup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit://
> Forgot to say: VRM temps are very important on this card. If you have a K-type probe it would be great to keep an eye on them; keep them as cool as possible and try to stay below 100C


This is my setup; I have a fan blowing from the top onto the VRM area... Should I put some heatsinks on them?


----------



## silencespr

My R9 295x2 is a secondary card in my PC until I get the mini-DisplayPort adapter in... It seems the GPU core clock while gaming does not go past 710MHz...


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> Use this:
> http://forums.overclockers.co.uk/showthread.php?t=18558655
> or
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> (Don't use all the steps, like "download 7990 bios" etc.; that makes sense, right?)
> (Download the latest versions of those programs if possible.)
> (When you copy atiflash to your USB stick, make sure it's called atiflash.exe, not atiflash_417.exe.)
> (Don't forget to set the BIOS switch on the GPU to position "1".)
> (You only need to flash one BIOS position, but you need to flash two BIOSes: master and slave.)
> (Use something like:
> atiflash -f -p 0 OC.rom (this flashes the master GPU)
> atiflash -f -p 1 OC2.rom (this flashes the slave GPU)
> if you name your BIOSes OC.rom and OC2.rom.)
> (Don't mix up the BIOSes; go slowly, step by step.)


Thx for the info, but none of my damn USB sticks want to format with the Windows files...


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> this is my set up i have a fan blowing from top on the VRM... should i put some heat sinks on them ?


In reality, the VRMs are located on the front side of the card, where the red fan is, but the back of the PCB/backplate gets very hot because the front heatsink + fan has trouble cooling them. Also, the stock backplate doesn't even touch the part of the PCB where the VRMs are mounted (from the other side, I hope you get me). You can see it in the vid I posted: the VRMs are exactly on the other side of the hottest areas.
That's why I don't think any fan blowing at the back of the PCB will really help with temps, unless you use a custom (EK, AQ etc.) backplate.
Quote:


> Originally Posted by *silencespr*
> 
> thc for the info but none of my damn USB sticks want to format with the windows files...


Did you follow the steps?

Code:


1. Download and install the USB disk format tool here.

2. Download the Windows98 system files here.

3. Create a folder called Win98boot on your desktop, extract the files from step 2 into the folder.

4. Plug in your USB stick. Launch the USB disk format tool. Copy these settings, then click Format. You need to select Quick Format, tick DOS startup and select the Win98 folder, like I've done below.

If so, try to format your USB the normal way first, or try it on another PC.


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> In reality, the VRMs are located on the front side of the card, where the red fan is, but the back of the PCB/backplate gets very hot because the front heatsink + fan has trouble cooling them. Also, the stock backplate doesn't even touch the part of the PCB where the VRMs are mounted (from the other side, I hope you get me). You can see it in the vid I posted: the VRMs are exactly on the other side of the hottest areas.
> That's why I don't think any fan blowing at the back of the PCB will really help with temps, unless you use a custom (EK, AQ etc.) backplate.
> Did you follow the steps?
> 
> Code:
> 
> 
> 1. Download and install the USB disk format tool here.
> 
> 2. Download the Windows98 system files here.
> 
> 3. Create a folder called Win98boot on your desktop, extract the files from step 2 into the folder.
> 
> 4. Plug in your USB stick. Launch the USB disk format tool. Copy these settings, then click Format. You need to select Quick Format, tick DOS startup and select the Win98 folder, like I've done below.
> 
> If so try to format ur usb the normal way first, or try to do it on another PC.


did it with like 5 diff usb sticks, all give the same error


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> did it with like 5 diff usb stick all give same error


hm, what kind of error are you getting?


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> hm. what kind of error ur getting ?


the one I attached in the earlier post... I tried on my laptop with Windows 8.1, the same window pops up...


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> the one i attached in the earlier post... i tried on my laptop with windows 8.1 same window pops up....


I'm not really sure that's an error, or maybe I'm just blind

I think that's a confirmation message


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> im not rly sure thats an error or im just blind
> 
> 
> 
> 
> 
> 
> 
> 
> i think thats a confirmation message


says it can't find the card


----------



## Mega Man

Although I can't remember who asked, I have not forgotten about running the 750w test, I just have not had time :/ sorry

I want to find another great game like Tomb Raider; it seems all the games I have been playing lately are frame capped to 30/60 and it is pissing me off!


----------



## jsheradin

I have the Noctua running on a motherboard header right now, fixed at 80%. The card gets into the high 60s during Metro: LL, so it's decently cool. Is the internal header soldered on, or is it an actual header? Going from this image, it just disappears into one of the pumps. Is it actually user-serviceable, or should I just return it?


----------



## steezebe

Quote:

Originally Posted by *fat4l* 

Use this:
http://forums.overclockers.co.uk/showthread.php?t=18558655
or
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
(don't use all the steps, like "download 7990 bios" etc., it makes sense right?







)
(download the latest versions of those programs if possible)
(when you copy atiflash to your USB, make sure it's called atiflash.exe, not atiflash_417.exe)
(don't forget to set the bios switch located on the gpu to position "1")
(you will need to flash only 1 bios position, but you need to flash 2 bioses, master and slave)
(use something like:
atiflash -f -p 0 OC.rom (this flashes the master card)
atiflash -f -p 1 OC2.rom (this flashes the slave card)
if you name your bioses OC.rom and OC2.rom)
(don't mix the bioses; go slowly, step by step)










Quote:
Originally Posted by *silencespr* 

Quote:


> Originally Posted by *fat4l*
> 
> im not rly sure thats an error or im just blind
> 
> 
> 
> 
> 
> 
> 
> 
> i think thats a confirmation message
> says cant find card


Are you running the USB drive program as administrator? If you are, I think it'll work. I just did the steps fat4l provided in the first link with the bios, and it worked remarkably well. What I had to do was hit F11 on boot-up, not F8, because I have an ASRock mobo. I got my memory and clock up to the Sapphire OC now, and it seems to be accepting clock adjustments in Afterburner much better!


----------



## silencespr

Quote:


> Originally Posted by *steezebe*
> 
> Are you running the USB drive program as administrator? If you are, I think it'll work. I just did the steps fat4l provided in the first link with the bios, and it worked remarkably well. What I had to do was hit F11 on boot-up, not F8, because I have an ASRock mobo. I got my memory and clock up to the Sapphire OC now, and it seems to be accepting clock adjustments in Afterburner much better!


yeah, but when I'm in MS-DOS, after executing the command it tells me no GPU found


----------



## electro2u

Quote:


> Originally Posted by *jsheradin*
> 
> I have the Noctua running on a motherboard header right now fixed at 80%. The card get into the high 60s during Metro:LL, so it's decently cool. Is the internal header soldered on or is it an actual header? Going from this image, it just disappears into one of the pumps. Is it actually user serviceable or should I just return it?


It's an actual header. I'm really surprised it's not functional out of the box, and it makes me wonder if it was used and returned. Where did you get it? I would RMA it. I know RMA is a hassle, but so is taking the shroud off, and this isn't your problem.


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> yeah, but when I'm in MS-DOS, after executing the command it tells me no GPU found


when do you exit MS-DOS? after updating the bios?

when in MS-DOS, try this

Code:


"12. You should be at dos prompt. Type atiflash -i to get the adapter number for both your gpu's. Typically it will be 0 and 1, unless you have a gpu in a third pci-e slot. You need the adapter number to tell it which gpu to flash."
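Putting those steps together, the whole DOS session would look roughly like this. This is only a sketch: the OC.rom/OC2.rom filenames follow the earlier naming in this thread, the 0/1 adapter numbers are the typical case only, and your own numbers may differ, so always check the `atiflash -i` output first.

```
C:\> atiflash -i                  (list adapters; the 295x2 normally shows as 0 and 1)
C:\> atiflash -f -p 0 OC.rom      (force-flash the master GPU's bios)
C:\> atiflash -f -p 1 OC2.rom     (force-flash the slave GPU's bios)
```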


----------



## electro2u

i have a bricked bios on one side of the switch but not the other, because atiflash says it can't find the adapter in dos on that particular bios. too lazy to solve it


----------



## jsheradin

Quote:


> Originally Posted by *electro2u*
> 
> It's an actual header. I'm really surprised it's not functional out of the box and it makes me wonder if it was used and returned. Where did you get it? I would RMA. I know it's a hassle but it's a hassle taking the shroud off and it's not your problem.


I got it new via Amazon. I will take the shroud off later today and see if I can get power from the header. Roughly where will the header be? From the pictures it looks like it is inside the pump itself, but I don't know. Thanks a ton for all the help!


----------



## joeh4384

Quote:


> Originally Posted by *jsheradin*
> 
> I got it new via Amazon. I will take the shroud off later today and see if I can get power from the header. Roughly where will the header be? From the pictures it looks like it is inside the pump itself, but I don't know. Thanks a ton for all the help!


Amazon is great with returns. I would just exchange it and they probably would expedite a replacement.


----------



## silencespr

dunno what's going on, but both of my cards are now underclocking to a 300MHz clock, and
Quote:


> Originally Posted by *fat4l*
> 
> when do u exit ms dos ? after updating bios ?
> 
> when in msdos try this
> 
> Code:
> 
> 
> "12. You should be at dos prompt. Type atiflash -i to get the adapter number for both your gpu's. Typically it will be 0 and 1, unless you have a gpu in a third pci-e slot. You need the adapter number to tell it which gpu to flash."


that's what I did, and I got the no GPU found error.


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> dunno whats going on but both of my cards are now under clocking to 300 clock and
> thats what i did and got the no GPU found error.


are you sure you have it in the right pcie slot? the main one.
hint: the first one is not always the main one


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> are u sure u have it in the right pcie slot ? the main one.
> hint: the first one is not always the main one


all of my slots are filled with a card =D


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> all of my slots are filled with a card =D


haha
then plug in just the 295x2 and do it!


----------



## silencespr

Quote:


> Originally Posted by *fat4l*
> 
> haha
> then plug just 295x2 and do it !


did that also


----------



## joeh4384

Hey guys, my 295x2 is doing something strange that I noticed recently. If I cold boot the PC from a powered off state, the 2nd GPU will boot at its 3d core and memory clocks. However, I then can restart the computer, and the clocks behave normally. I can also run a game or something that uses crossfire and exit and then the clocks return to normal as well. Has anyone seen something similar?


----------



## fat4l

Quote:


> Originally Posted by *silencespr*
> 
> did that also










then idk man...im sorry. maybe try it all over again


----------



## houssem89

hello, can someone help me? i have a big problem: my xfx r9 295x2, when it reaches 69 celsius the pc shuts down, every time i play a demanding game or benchmark

2.txt 146k .txt file


----------



## joeh4384

Quote:


> Originally Posted by *houssem89*
> 
> hello, can someone help me? i have a big problem: my xfx r9 295x2, when it reaches 69 celsius the pc shuts down, every time i play a demanding game or benchmark
> 
> 2.txt 146k .txt file


What kind of power supply do you have? I am willing to bet that the PSU isn't providing enough amps.


----------



## ozlay

has anyone done led replacement on the card to make it green or blue?


----------



## electro2u

Quote:


> Originally Posted by *ozlay*
> 
> has anyone done led replacement on the card to make it green or blue?


The LED is from the fan. Would need to get a different one probably.


----------



## steezebe

Quote:


> Originally Posted by *ozlay*
> 
> has anyone done led replacement on the card to make it green or blue?


I'm about ready to turn them off. Nvidia cards can do this via software; is a similar feature available with AMD?

I remember playing DOOM as a kid in a dark room and getting scared out of my mind, so novelty aside, I'm not one for distracting lights while I'm playing a game... for immersion purposes.


----------



## electro2u

Quote:


> Originally Posted by *steezebe*
> 
> I'm about ready to turn them off. Nvidia cards can do this via software; is a similar feature available with amd?


Nope. Would have to do it permanently by modifying the fan.


----------



## axiumone

Quote:


> Originally Posted by *electro2u*
> 
> Nope. Would have to do it permanently by modifying the fan.


It's not permanent. Well, it's not irreversible anyway.

If you take a look back a few pages, I posted a video on how to access full control over the VRM fan; in that video I pointed out that there are two connectors under the shroud, one to control the VRM fan and the other for the Radeon LED. All you have to do is take the shroud off and unplug the LED.


----------



## houssem89

Quote:


> Originally Posted by *joeh4384*
> 
> What kind of power supply do you have? I am willing to bet that the PSU isn't providing enough amps.


i have an Antec TruePower Quattro 1200W, and I'm sure it's not the PSU


----------



## houssem89

can i change the liquid in my xfx r9 295x2???


----------



## Mega Man

Depends on how handy you are; you would have to mod your card and void your warranty.


----------



## joeh4384

Quote:


> Originally Posted by *houssem89*
> 
> i have Antec Truepower Quattro 1200W and it's not psu i'm sure


Is that single or multi-rail? It could be how you have your rails balanced.


----------



## Roxycon

I'm so late to the game but do any of you have any recommendations for radiator size for a single 295x2 paired with an i7 4790k?

Hope to fit a custom loop and a mATX board in a fairly small case


----------



## axiumone

Quote:


> Originally Posted by *Roxycon*
> 
> I'm so late to the game but do any of you have any recommendations for radiator size for a single 295x2 paired with an i7 4790k?
> 
> Hope to fit a custom loop and a mATX board in a fairly small case


I'd say you need at least a 240 dedicated just for the 295 if you'd like to have some half decent temps and another 240 for the 4790k.


----------



## ozlay

Quote:


> Originally Posted by *joeh4384*
> 
> Is that single or multi-rail? It could be how you have your rails balanced.


it has six 38A rails, it should be fine. i have the same model in the SR2 build in my sig with overvolted 680s


----------



## doctakedooty

Quote:


> Originally Posted by *Roxycon*
> 
> I'm so late to the game but do any of you have any recommendations for radiator size for a single 295x2 paired with an i7 4790k?
> 
> Hope to fit a custom loop and a mATX board in a fairly small case


In my 250D mITX case I've got a custom loop on my 295x2 and 4790k, with the CPU clocked at 4.8GHz and 1.25v and the 295x2 at stock clocks. I have a 240mm Alphacool 30mm slim radiator and an Alphacool 60mm thick 120mm rad cooling them. My water temps peak around 39C during gaming on a 4K monitor; the GPUs peak at 59C and the CPU at 65C. I would say that's the minimum rad space to keep things cool. As much rad space as you can squeeze in will help lower those temps, though. Of course, with my setup I see water temps rise by almost 18C over ambient under load.
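For comparing setups like these, a common community rule of thumb is on the order of 100W of heat dissipated per 120mm of radiator at moderate fan speeds. Everything in the sketch below is an assumption: the rule itself varies a lot with rad thickness and fan speed, and both wattage figures are round-number guesses rather than measurements. It's only a quick sanity check:

```shell
#!/bin/sh
# Rule-of-thumb radiator sizing: ~100W per 120mm segment at moderate fan speeds.
# Both wattage figures are assumed round numbers, not measurements.
GPU_WATTS=500     # 295x2 under load (assumed)
CPU_WATTS=150     # overclocked 4790k (assumed)
PER_SEGMENT=100   # W per 120mm of radiator (rule of thumb)
TOTAL=$((GPU_WATTS + CPU_WATTS))
SEGMENTS=$(( (TOTAL + PER_SEGMENT - 1) / PER_SEGMENT ))   # ceiling divide
echo "suggested radiator area: ~${SEGMENTS} x 120mm"
```

That suggests noticeably more area than a minimal 240 + 120 setup, which is consistent with the large water delta reported on the smaller loop; extra radiator mainly buys a smaller delta and quieter fans.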


----------



## doctakedooty

Quote:


> Originally Posted by *ozlay*
> 
> it has six 38A rails it should be fine have same model in the sr2 build in my sig with overvolted 680s


Make sure you have one 8-pin from each rail plugged into the 295x2. You need a minimum of 28 amps per 8-pin plug.

The shutdowns are due to drawing too much amperage on a single rail. So make sure each of your PCIe plugs has its own dedicated rail, with nothing else drawing from those two rails besides the 295x2. I had the same issue with mine: even though it was advertised and stamped as single-rail, after digging I found it was multiple rails. So I had to use one 8-pin harness plugged into one PSU rail, and the other 8-pin plugged into the card and into another rail, to give it the required total of 56 amps the card needs to avoid tripping the OCP on the PSU and causing reboots during gaming or benchmarks.
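To put rough numbers on the rail math: the ~28A-per-plug and 56A-total figures come from the post above, while the 600W board-power figure below is just an assumed round number for a 295x2 under load, not a measurement.

```shell
#!/bin/sh
# Rough +12V amperage check for a 295x2 split across two 8-pin plugs/rails.
# BOARD_WATTS is an assumed round number, not a measured figure.
BOARD_WATTS=600
VOLTS=12
TOTAL_AMPS=$((BOARD_WATTS / VOLTS))   # ~50A total on +12V
PER_PLUG=$((TOTAL_AMPS / 2))          # per 8-pin plug, if the load splits evenly
echo "total 12V draw: ~${TOTAL_AMPS}A"
echo "per 8-pin plug: ~${PER_PLUG}A"
```

With a 28-30A rail behind each plug, ~25A per plug leaves very little headroom, which is why sharing a rail with anything else can trip OCP and shut the machine down.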


----------



## houssem89

Quote:


> Originally Posted by *doctakedooty*
> 
> Make sure you have one 8 pin from each rail plugged into the 295x2. You need a min of 28 amps per 8 pin plug.
> 
> The shut downs are due to drawing to much amperage on each rail. So make sure each of your pcie plugs have there own dedicated rail with nothing each drawing from those two rails besides the 295x2. I had the same issue with mine even though it was advertised and stamped as a single rail after digging I found it was multiple rails so I had to use one 8 pin harness plugged into the psu rail and then my other 8 pin plugged into my card and into another rail to give it the required total of 56 amps needed for the card to not trip the ocp on the psu and cause reboots during gaming or benchmarks


i tried many configurations and it's the same. but it's only after 3 or 4 Heaven benchmark runs, when my graphics card reaches 69 degrees, that my pc shuts down. so i think, if it were the psu, it would shut down during the first benchmark run, no????


----------



## doctakedooty

Quote:


> Originally Posted by *houssem89*
> 
> i tried many configuration and it's same but after 3 benchmark test or 4 heaven benchmark when my graphic card attain 69 degree my pc shut down so i think if it's from the psu the will close in the first benchmark test ????


Not necessarily. I made it through a benchmark or two and then had a shutdown myself, which is why the PSU was the last thing I looked at. It could be the card, but I would lean towards the PSU.


----------



## houssem89

so the problem comes from my card??


----------



## electro2u

Quote:


> Originally Posted by *houssem89*
> 
> so the problem come from my card ??


If the computer is shutting down, it's most likely a problem with the power supply. You may have enough power in total, but you may be placing too great an amperage load on the single 12v rail you are using for the 295x2. You need to run separate PCIe cables to each 8-pin connector on the card.

The Antec 1200W Pro PSU has six 12V rails, all rated at 30 amps each. The 295x2 needs 50 amps. So use 2 PCIe cables.


----------



## ozlay

Quote:


> Originally Posted by *electro2u*
> 
> If the computer is shutting down it's most likely a problem with the power supply. You may have enough power total, but you may be placing to great an amperage load on the 12v rail you are using for the 295x2. You need to run separate PCIE cables to each 8pin connector on the card.
> 
> The Antec 1200W Pro PSU has 6 12V rails all rated at 30amps each. The 295x2 needs 50amps. So use 2 PCIE cables.


this ^ with that psu, the pcie cables are spread across 4 rails, each with one 8-pin and one 6-pin per rail; half are modular, half are not. so i would just use the 2 non-modular 8-pins, which should be enough, unless you are also using the 6-pin pcie connectors on those rails

rails 3 and 4 are the non-modular cables, and rails 5 and 6 are the modular rails for the pcie connectors


----------



## melkor1

Hi. I have a problem with my new Pc:

i7 5820K
Corsair H100i
Asus X99 Deluxe
16 Gb DDR4 2400 Gskill
Corsair AX 860i
XFX Radeon 295X2 8gb (driver 14.501.1003-141120a-178000C)
Samsung 256gb 850PRO
Case Corsair D650
Win 8.1 Pro 64bit

Call of Duty: Advanced Warfare, for example, doesn't work... it reads only 2.8gb of video ram

And the FAN is always in high mode

I used 2 PCI-E power cables from my AX860i to the graphics card

Do you have idea?

Thanks

Diego


----------



## electro2u

Quote:


> Originally Posted by *melkor1*
> 
> Hi. I have a problem with my new Pc:
> 
> i7 5820K
> Corsair H100i
> Asus X99 Deluxe
> 16 Gb DDR4 2400 Gskill
> Corsair AX 860i
> XFX Radeon 295X2 8gb (driver 14.501.1003-141120a-178000C)
> Samsung 256gb 850PRO
> Case Corsair D650
> Winz 8.1 Pro 64bit
> 
> Call of Duty Advanced Warfare for example don't work... Read only 2.8gb video ram
> 
> And the FAN is always in high mode
> 
> I used 2 PCI-E power cable from my AX860i to Graphic card
> 
> Do you have idea?
> 
> Thanks
> 
> Diego


The fan always being in high mode indicates the drivers are not properly installed.


----------



## houssem89

i used the 2 non-modular 8-pin cables; they deliver 76 amps i think.......


----------



## houssem89

i found the problem: it was my psu. i tried an xfx 850 watt now and the pc doesn't shut down, so my psu has gotten very old


----------



## Roxycon

Quote:


> Originally Posted by *axiumone*
> 
> I'd say you need at least a 240 dedicated just for the 295 if you'd like to have some half decent temps and another 240 for the 4790k.


Quote:


> Originally Posted by *doctakedooty*
> 
> In my 250D mitx case I got a custom loop on my 25th and 4790k. With my cpu clocked at 4.8 get and 1.25v and 295x2 stock clocked. I have a 240 mm alpha cool 30 mm slim radiator and a alphacool 60 mm thick 120 mm rad cooling them my water Temps peak out around 39 C during gaming on a 4K monitor. The gpu will peak at 59 C and cup at 65 C. I would say that's the minimum rad space to keep things cool. As much rad space as you can squeeze though will help lower those temps. Of course with my set up I see water temps of around almost 18 C increase during load and ambient.


i do not think i can dedicate such big radiators, but i'm planning two 30 or 45mm 120s and a 30mm 280 with some proper fans for this project







looks like i'm at the limits of cooling, but i'd just go a little more conservative on my cpu if sound levels get annoying









+1 guys


----------



## houssem89

please, what does this button do??

20140410113723_R9-295X-8QF_1.JPG 73k .JPG file


----------



## joeh4384

Quote:


> Originally Posted by *houssem89*
> 
> please, what does this button do??
> 
> 20140410113723_R9-295X-8QF_1.JPG 73k .JPG file


It is the bios switch. The 295x2 comes with 2 bioses.


----------



## houssem89

Quote:


> Originally Posted by *joeh4384*
> 
> It is the bios switch. The 295x2 comes with 2 bioses.


ah ok, thanks. so one of the two bioses can be more stable?


----------



## joeh4384

I think it is more a fail-safe if you flash a bad bios. I personally flashed the sapphire OC bios on mine.


----------



## grok1982

Hey guys. Just built my first multi-card tri-fire rig, using a 295x2 and 290x. Had a question about pump orientation: Should I place the pump so it's above or below the fan?

Tried it both ways, and I don't know if there's an optimal or safer solution:




Happy to be a part of the club, and also to tinker and oc with my first gaming build.


----------



## joeh4384

Quote:


> Originally Posted by *grok1982*
> 
> Hey guys. Just built my first multi-card tri-fire rig, using a 295x2 and 290x. Had a question about pump orientation: Should I place the pump so it's above or below the fan?
> 
> Tried it both ways, and I don't know if there's an optimal or safer solution:
> 
> 
> 
> 
> Happy to be a part of the club, and also to tinker and oc with my first gaming build.


Welcome to the club. I have mine below the fan.


----------



## steezebe

Quote:


> Originally Posted by *grok1982*
> 
> Hey guys. Just built my first multi-card tri-fire rig, using a 295x2 and 290x. Had a question about pump orientation: Should I place the pump so it's above or below the fan?
> 
> Tried it both ways, and I don't know if there's an optimal or safer solution:
> 
> 
> 
> 
> Happy to be a part of the club, and also to tinker and oc with my first gaming build.


http://www.pcgameware.co.uk/reviews/graphics-cards/msi-radeon-r9-295-x2-graphics-card-review/

Near the bottom of the review they discuss the orientation of the rad. And I QUOTE:

For optimum water flow (and to avoid picking up any air) the radiator should be mounted with the inlet and outlet at the base. This allows any air to be collected in the upper end-tank, avoiding any air in the system loop. This is strangely not mentioned in the install instructions! BUT IT IS VERY IMPORTANT!









Hope that helps


----------



## Ranma13

Quote:


> Originally Posted by *steezebe*
> 
> For optimum water flow (and to avoid picking up any air) the radiator should be mounted with the inlet and outlet at the base. This allows any air to be collected in the upper end-tank, avoiding any air in the system loop.


It's not that important. A lot of air in your loop? Yeah, that'll affect your temps. A little bit of air? You won't notice any difference. My recommendation is to orient the rad with the ports facing down, if only just to prevent the hoses from blocking the fan.


----------



## silencespr

bottom here


----------



## houssem89

Quote:


> Originally Posted by *joeh4384*
> 
> I think it is more a fail-safe if you flash a bad bios. I personally flashed the sapphire OC bios on mine.


is the Sapphire bios better?


----------



## eqc6

Just FYI, Amazon has XFX 295x2's price down to $689 right now.


----------



## joeh4384

Quote:


> Originally Posted by *eqc6*
> 
> Just FYI, Amazon has XFX 295x2's price down to $689 right now.


Newegg has the same price plus a $30 rebate.


----------



## F4ze0ne

Quote:


> Originally Posted by *joeh4384*
> 
> Newegg has the same price plus a $30 dollar rebate.


Newegg has the rebate, but they also won't allow you to return the card should you change your mind.


----------



## eqc6

Quote:


> Originally Posted by *joeh4384*
> 
> Newegg has the same price plus a $30 dollar rebate.


ah, figured that's why amazon price matched. i still haven't received my $30 rebate from xfx and I bought it in November.
Quote:


> Originally Posted by *F4ze0ne*
> 
> Newegg has the rebate, but they also won't allow you to return the card should you change your mind.


Did not know this! Thank god for Amazon! I'm still debating if I should go for a 2nd 295x2. I currently run trifire with an xfx 295x2 + a sapphire 290x. I don't have any more space for fans, so if I buy a 2nd 295x2 I'll also have to buy a new case


----------



## F4ze0ne

Quote:


> Originally Posted by *eqc6*
> 
> Did not know this! Thank god for Amazon! I'm still debating if I should go for a 2nd 295x2. I currently run trifire with xfx 295x2 + sapphire 290x. I don't have any more space for fans, so if I buy a 2nd 295x2 I also have to buy a new case


When purchasing from Newegg, I'd recommend ALWAYS reading their return policy for the item.

I usually don't order from them because of this practice. I'd rather miss out on a $30 rebate to have Amazon's return policy.


----------



## joeh4384

Quote:


> Originally Posted by *F4ze0ne*
> 
> When purchasing from Newegg, I'd recommend ALWAYS read their return policy for the item.
> 
> I usually don't order from them because of this practice. I'd rather miss out on a $30 rebate to have Amazons return policy.


Yeah me too.


----------



## HoneyBadger84

I'm baaaaaaaaaaaack









And I'm bringing friends!



They arrive sometime between Monday & Wednesday of next week!


----------



## DividebyZERO

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm baaaaaaaaaaaack
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I'm bringing friends!
> 
> 
> 
> They arrive sometime between Monday & Wednesday of next week!


did you try any nvidia during your break?


----------



## HoneyBadger84

Quote:


> Originally Posted by *DividebyZERO*
> 
> did you try any nvidia during your break?


Ummm, my immediate response to that would get me another infraction.

So I'll just do this:



lol

I sold most of my 290Xs when they irritated me one day, I was running 280Xs for a while, then I went down to a single 290 which is what I'm running now. Got these as a sort of post birthday/Christmas present to myself, figure my tax return will pseudo pay for most of it anyway, got a great deal on'em, just gotta figure out how the heck I'm gonna mount the second radiator... where more accurately.


----------



## cliptags

Hello all!

Can I be added to the club? Running a XFX Hydra edition which I managed to get for 500 pounds which is awesome!





Got a question about temps though! I'm running my 295X2 in 'trifire' with my old 290x, running 4k content. I booted up Tomb Raider and was getting about 80-100fps, which is awesome! After about 5 mins GPU1 hit 74 degrees and downclocked. GPU2 hovers around 72, while GPU3 (the 290x) only hits about 64 degrees. Seems high to me and I don't really want it to downclock! Am I doing something wrong with my setup, or could it be a problem with the card?

I think I'm getting much higher temps than what I'm seeing online in benchmarks. Got the XFX cooler on with a Noctua fan in push config exhausting out of the back of my case. Should I get rid of the 290x, as that may be putting heat onto the 295x2 GPUs? First time I've got a card/cards of this power/heat, so any help would be much appreciated!!

Cliptags


----------



## joeh4384

Quote:


> Originally Posted by *cliptags*
> 
> Hello all!
> 
> Can I be added to the club? Running a XFX Hydra edition which I managed to get for 500 pounds which is awesome!
> 
> 
> 
> 
> 
> Got a question about temps though! I'm running my 295X2 in 'trifire' with my old 290x running 4k content. I booted up Tomb Raider and was getting about 80-100fps which is awesome! After about 5 mins GPU1 hit 74 degrees and downclocks. GPU2 hovers around 72 while GPU3 (the 290x) only hits about 64 degrees. Seems high for me and I don't really want it to downclock! Am I doing something wrong with my set up or could it be a problem with the card?
> 
> I think I'm getting much higher temps than what I'm seeing online on benchmarks. Got the XFX cooling on with a noctura fan in push config exhausting out of the back of my case. Should I get rid of the 290x as that may be putting heat onto the 295x2 GPU's? First time I've got a card/cards of this power/heat so any help would be much appreciated!!
> 
> Cliptags


Are you running push-pull on your fans? You could be running into issues with the heat the 290x is dumping in your case.


----------



## cliptags

Thanks for the reply. I've tried just push, and then tried using another fan for push/pull; same problem. Was worried it may be the 290x putting heat into the case, but wasn't sure because it has such low temps compared to the 295x2. I may try removing the 290x and see if that makes a difference.


----------



## Menno

Guys, I can get a 295x2 pretty cheap and already have 2x 290x Sapphire Tri-X running 2-way. 4-way crossfire with an extra 295x2 is possible now (just for the enthusiast fun of it lol). (Rampage IV Extreme + 3930K + Corsair 750D, well ventilated.) But what PSU should I get? EVGA SuperNOVA G2/P2 1200W, 1300W or 1600W? The 2 Sapphires are running off an old Corsair HX850 now.


----------



## joeh4384

Quote:


> Originally Posted by *Menno*
> 
> Guys, I can get a 295x2 pretty cheap and already have 2x 290x Sapphire Tri-X 2-way running. 4-way crossfire with an extra 295x2 is possible now (just for the enthusiast fun of it lol). (Rampage IV Extreme + 3930K +corsair 750d well ventilated), but what PSU should I get? evga supernova G2/P2/ 1200W, 1300W or 1600W? The 2 sapphires are running of a old corsair HX850 now.


Most people with 2 here use 1500 or 1600 watts.


----------



## eqc6

How does Far Cry 4 run in 295x2 + 290x trifire? Has anyone here tried? I want to get the game but afraid I won't be able to run it well since last time I read it still doesn't support crossfire. Anyone?


----------



## F4ze0ne

Quote:


> Originally Posted by *eqc6*
> 
> How does Far Cry 4 run in 295x2 + 290x trifire? Has anyone here tried? I want to get the game but afraid I won't be able to run it well since last time I read it still doesn't support crossfire. Anyone?


I would not buy the game right now. FC4 has lots of issues with AMD that needs to be fixed.

http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review


----------



## joeh4384

Far cry plays ok on 1 gpu. Crossfire is still broken.


----------



## fat4l

Quote:


> Originally Posted by *axiumone*
> 
> I'd say you need at least a 240 dedicated just for the 295 if you'd like to have some half decent temps and another 240 for the 4790k.


i would recommend more than that, unless you're fine with water temps above 30C.
I had 2x240mm + push/pull and it wasn't enough; while looping Valley, water temps got up to 31/32C. And that's not even FurMark lol


----------



## Mega Man

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *axiumone*
> 
> I'd say you need at least a 240 dedicated just for the 295 if you'd like to have some half decent temps and another 240 for the 4790k.
> 
> 
> 
> i would recommend it unless ur fine with water temps above 30C.
> I had 2x240mm + push/pull and it wasnt nuff. while looping valley water temps got up to 31/32C. Not saying furmark lol

what about mine? think 5 480s is enough? 3 Monstas, 1 UT60, 1 XT45


----------



## electro2u




----------



## eqc6

Quote:


> Originally Posted by *F4ze0ne*
> 
> I would not buy the game right now. FC4 has lots of issues with AMD that needs to be fixed.
> 
> http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review


That sucks, pretty frustrating


----------



## HoneyBadger84

Quote:


> Originally Posted by *Menno*
> 
> Guys, I can get a 295x2 pretty cheap and already have 2x 290x Sapphire Tri-X 2-way running. 4-way crossfire with an extra 295x2 is possible now (just for the enthusiast fun of it lol). (Rampage IV Extreme + 3930K +corsair 750d well ventilated), but what PSU should I get? evga supernova G2/P2/ 1200W, 1300W or 1600W? The 2 sapphires are running of a old corsair HX850 now.


I can tell you 4 GPUs of 290X stature & a 3930k setup (aka pretty much my setup) draws at least 1300W if not more from the wall, and that was on a PSU with 91+% efficiency while pulling that much wattage. So I'd recommend a hefty unit of at least 1300W. I updated to a LEPA G 1600W. I do NOT recommend this PSU: its wires suck butt, plugs are stiff, & you can actually cut yourself trying to unplug them if you're not careful, cuz they're that hard to unplug.


----------



## Mega Man

Quote:


> Originally Posted by *eqc6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F4ze0ne*
> 
> I would not buy the game right now. FC4 has lots of issues .
> 
> http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review
> 
> 
> 
> That sucks, pretty frustrating

fixed, Ubisoft sucks

also guys

regarding the x750 being able to power a CPU and 295x2

i am sorry for the time it took

here are some benches




i also was able to bench valley and prime at the same time

3dm gave poor scoring but i am in the middle of updating it ( prompted an update )

long story short i think this is quite doable with a quality PSU

i did power 4 pumps from my 750 but not my 2xaq6s ( aquaero 6 xts ) it was too hard to transfer the wiring

total load on the x750

cpu 24pin/8pin
295x2
4 ddcs

several fans

total load powered *not* on the x750
4pin ( GPU at bottom of mobo )

2xaq6s and a few fans
ssdsx3/hddsx2

please note when i say several fans my system has 5x480s

i do feel this was a normal load for most people

my system is
3930k/RIVBE @ 4.8ghz 2400ram
259x2 ( second one was disabled via dip switches

x750 as described above and
1kw as described above

also to note all benches were run in eyefinity at stock on 295x2, however have the rad fans powered by the aq6 ( ap30s pwm modded ) !~

http://www.3dmark.com/fs/3769779


----------



## kayan

Just a quick question:

Do you guys think it'd be worth selling both of my watercooled 290x's and picking up a single 295x2? I figure I can get 250-300 for each reference 290x + block and backplate, and spend around 100 for the 295x2. Think I should bother?


----------



## joeh4384

Quote:


> Originally Posted by *kayan*
> 
> Just a quick question:
> 
> Do you guys think it'd be worth selling both of my watercooled 290x's and picking up a single 295x2? I figure I can get 250-300 for each reference 290x + block and backplate, and spend around 100 for the 295x2. Think I should bother?


I wouldn't. You already have a better setup.


----------



## Mega Man

Agreed unless you desperately need a single card solution


----------



## silencespr

I have both an R9 295X2 and two water-cooled R9 290Xs, and the single card performs better in certain games.


----------



## fat4l

Finally, my beauty is plugged in and fully fitted with fans









Mora 3 420 + 8x 230mm BitFenix Spectre Pro Red LED, push/pull

Cools my 295X2 incredibly!

MSI 295X2, Sapphire OC BIOS, 1030/1300MHz

Water temp at idle: 22C (deltaT = 3C)
Water temp at load: 25C (deltaT = 6C)

GPU1 load: 40C
GPU2 load: 40C

Looping Heaven 4.0 for ~30 mins.

Full: http://postimg.org/image/azlyyyuqr/




































Also......
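For reference, deltaT here is just water temperature minus ambient; a tiny sketch with my numbers (ambient ~19C, as implied by the figures above):

```python
def delta_t(water_c, ambient_c):
    """Loop deltaT: how far the water temperature sits above ambient."""
    return water_c - ambient_c

idle_dt = delta_t(22, 19)  # 3C at idle
load_dt = delta_t(25, 19)  # 6C under load
```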


----------



## arieldeboca

Hi guys.
I'm playing Far Cry 4 on my Sapphire 295X2.
I'm a little disappointed by the low fps I'm getting. The card is at 1080p ultra settings, but with HBAO+ and Enhanced Godrays enabled.
Overall average is 50 fps, with parts at 40 and even down to 30.
Is the problem Ubisoft? The AMD driver?
Does this game not work well with CF?
I played Call of Duty: Advanced Warfare at up to 100 fps, but not FC4.
The rest of my rig is very powerful: 4820K and 16GB of RAM.


----------



## kayan

Quote:


> Originally Posted by *arieldeboca*
> 
> Hi guys.
> I'm playing Far Cry 4 on my Sapphire 295X2.
> I'm a little disappointed by the low fps I'm getting. The card is at 1080p ultra settings, but with HBAO+ and Enhanced Godrays enabled.
> Overall average is 50 fps, with parts at 40 and even down to 30.
> Is the problem Ubisoft? The AMD driver?
> Does this game not work well with CF?
> I played Call of Duty: Advanced Warfare at up to 100 fps, but not FC4.
> The rest of my rig is very powerful: 4820K and 16GB of RAM.


FC4 has no Crossfire Support, which is extremely sad! I hover around 45-55 @ 1440p Ultra settings with dual 290x (even though it only sees one).


----------



## arieldeboca

Quote:


> Originally Posted by *kayan*
> 
> FC4 has no Crossfire Support, which is extremely sad! I hover around 45-55 @ 1440p Ultra settings with dual 290x (even though it only sees one).


But my capture shows 2 cores running; isn't that CF?
If CF weren't working, shouldn't it show one core active and the other at 0%?


----------



## ssiperko

Quote:


> Originally Posted by *lowgun*
> 
> Do you mean like put a waterblock on it and add it to a custom waterloop? There is no way to change the radiator while continuing to use the stock AIO solution.


Sure ya could.
A reservoir, some line and fittings, and bingo, yer done.

I'm gonna do one with 5 AIO systems I have, using 3 Seidon pumps.
H110, 240M, 2x 120Ms and an H55 in a 750D; just need to sort out the hose routes.









SS


----------



## ssiperko

Quote:


> Originally Posted by *joeh4384*
> 
> I wouldn't. You already have a better setup.


Yeap.

My Red Modded 290s at 1075/1350 in CF score better than a 295X2 in most tests I've seen.

SS


----------



## Mega Man

Quote:


> Originally Posted by *arieldeboca*
> 
> Hi guys.
> I'm playing Far Cry 4 on my Sapphire 295X2.
> I'm a little disappointed by the low fps I'm getting. The card is at 1080p ultra settings, but with HBAO+ and Enhanced Godrays enabled.
> Overall average is 50 fps, with parts at 40 and even down to 30.
> Is the problem Ubisoft? The AMD driver?
> Does this game not work well with CF?
> I played Call of Duty: Advanced Warfare at up to 100 fps, but not FC4.
> The rest of my rig is very powerful: 4820K and 16GB of RAM.


FC4 is known to be buggy for red or green; another rushed Ubisoft product that frankly is trash.

People just need to stop buying Ubi products until they clean up their act. But as it stands, they can rush out products that are not complete or optimized and still sell a lot of them, so they will just keep doing it.


----------



## F4ze0ne

Quote:


> Originally Posted by *arieldeboca*
> 
> But my capture shows 2 cores running; isn't that CF?
> If CF weren't working, shouldn't it show one core active and the other at 0%?


The drivers probably initialize the 2nd card, but the game is ignoring it.


----------



## joeh4384

Quote:


> Originally Posted by *arieldeboca*
> 
> But my capture shows 2 cores running; isn't that CF?
> If CF weren't working, shouldn't it show one core active and the other at 0%?


It isn't using the 2nd GPU. If you were monitoring core clocks, it would be at 300MHz. I have been getting 45-65 at the ultra preset with no Nvidia goodies at 1440p on just one GPU.


----------



## fat4l

Guys, what's your default voltage at load for both cores? I know it may fluctuate, but on average?


----------



## HoneyBadger84

Question for y'all, as I'm testing my first card now and I've never had these before; wondering what y'all's experiences have been. What are your usual idle temps?

Right now I'm idling at 34-38C & the highest loads I've seen thus far are in the high 50s (keep in mind my case airflow is godly).

I have the radiator mounted at the top of my case; once I go QuadFire I plan to have one mounted on the back and one mounted up top, with the top card on the top mount & the bottom card on the back mount.



Upside down pic ftw. I'll post up some installed pictures once they're in their final form. Right now I'm testing each card individually before I go Quad.


----------



## joeh4384

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Question for y'all as I'm testing my first card now and I've never had these before, wondering what y'all's experiences have been. What is your usual idle temps?
> 
> Right now I'm idling at 34-38C & the highest loads I've seen thus far are in the high 50s (keep in mind my case flow is godly).
> 
> I have the radiator mounted at the top of my case as once I go QuadFire I plan to have one mounted on back, one mounted up top, with the top card being the one mounted on top & the bottom card mounted on back.
> 
> 
> 
> Upside down pic ftw. I'll post up some installed pictures once they're in their final form. Right now I'm testing each card individually before I go Quad.


Mine idles at 32/34 with max load temps of 68/70 in a game that really cranks both GPUs close to 100%, such as Shadow of Mordor, or a benchmark like Valley. I switched my fans to push/pull Corsair SP120 Performance, with the one fan connected to my board just running at a standard fan level. Most of the time it is in the 60s.


----------



## F4ze0ne

Quote:


> Originally Posted by *joeh4384*
> 
> Mine idles at 32/34 with max load temps at 68/70 on a game that really cranks both GPUs close to 100% such as Shadow of Mordor or a benchmark like Valley. I switched my fan for push/pull corsair 120 SP performances with the one fan connected to my board just running in a standard fan level. Most of the times it is in the 60s.


My temps are similar. Highest I've seen is 73 after letting Valley run for a while.


----------



## HoneyBadger84

Well so far I haven't even seen 60C yet, and that's through quite a few runs of the various 3DMarks. I'm fixing to put on Watch_Dogs & give 'er a look-see, see if we can't level some serious load at these bad boys.

I did come up with an interesting (messy looking but very efficient) way of mounting them though, check it out lol:



Edit: Power draw peaked at about 1200W at the wall for those curious, so probably about 1050W actual, and that's with my CPU @ 4.2GHz, so not full bore. But that was on 3DMark Fire Strike Extreme, so definitely not a light load on the GPUs lol


----------



## cliptags

So I'm considering going water cooling on my 295X2 (and just leaving air cooling on my 290X). Never done water cooling before, so I'm looking for advice.

My case is a HAF X, so I should be able to fit everything in. I have specced up the following WC components after doing some online research and wanted to make sure I wasn't going down totally the wrong path! If you have any advice on different components etc., that would be much appreciated as well!

- EK Water Blocks EK-FC R9-295X2 - Acetal+Nickel
- XSPC D5 Dual Bay Reservoir/Pump Combo V2
- XSPC RX360 Triple Fan Radiator V3 (Noctua fans in push/pull)
- Koolance CPU-380A Water Block (AMD FX-9370 processor)
- Monsoon 16/11mm (ID 7/16" OD 5/8") Free Center Compression Fitting Six Pack - Green
- Mayhems Ultra Pure H2O 5L
- XSPC HighFlex Hose 7/16" ID, 5/8" OD (15.9/11.1mm), 2m, Green/UV Green
- Mayhems Fine Silver Kill Coil

As I say, totally new to water cooling, so if I've made any glaring errors I'm happy to learn


----------



## HoneyBadger84

Anyone have Watch_Dogs & experience the issue of the game just plain not starting? It worked fine when I launched it 2 days ago with the R9 290 I was running, on the same drivers. I already did a "verify files & repair", and I've disabled Crossfire to see if maybe QuadFire was the issue.

I click play, it acts like the game is going to load... then a few moments later another "Loading" cursor appears & it acts like the game closed. Anyone experienced this & know how to fix it? I tried a quick Google search but nothing is coming up that helps (update drivers, etc., all of which I've done for the card install)


----------



## joeh4384

Do you have afterburner running? I remember one time it just stopped loading for me with afterburner running in the background for no reason on that game.


----------



## HoneyBadger84

Quote:


> Originally Posted by *joeh4384*
> 
> Do you have afterburner running? I remember one time it just stopped loading for me with afterburner running in the background for no reason on that game.


What I did was: after the repair of the files, I closed Uplay, disabled Crossfire, restarted Afterburner etc. (which stops working whenever you mess with Crossfire settings), ran the game, closed everything out again, re-enabled Crossfire & it worked after that.

I'm having an issue with what seems to be the GPUs going into a low power state even though they're nowhere near overheating (I'm playing at 144Hz with VSync on, so they're only hitting the high 40s/low 50s in game). I was about 3/4 of the way through the first mission, exiting the stadium, when my FPS went from pretty much always pegged at 144 down to the 60s & 70s, with the GPU core speeds in the 500-700MHz range, like when you're watching a video or something.

I had the same issue with just 2-way Crossfire & Metro Last Light as well. Guess I'll have to do some research & see if there's an easy fix for it, other than disabling ULPS, which I obviously don't wanna do since these things suck a lotta power when they're not allowed to drop into full idle mode.


----------



## fat4l

Quote:


> Originally Posted by *cliptags*
> 
> So I'm considering going water cooling on my 295x2 (and just leaving air cooling on my 290x). Never done water cooling before so looking for advice.
> 
> My case is a HAF X so I should be able to fit everything in. I have speced up the following WC components after doing some online research and wanted to make sure I wasn't going down totally the wrong path! If you have any advice on different components etc. it that would be much appreciated as well!
> 
> - EK Water Blocks EK-FC R9-295X2 - Acetal+Nickel
> - XSPC D5 Dual Bay Reservoir/Pump Combo V2
> - XSPC RX360 Triple Fan Radiator V3 (Noctura fans in Push/Pull)
> - Koolance CPU-380A Water Block (AMD FX9370 Processor)
> - Monsoon 16/11mm (ID 7/16 OD 5/8) Free Center Compression Fitting Six Pack - Green
> - Mayhems Ultra Pure H2O 5Ltr
> - XSPC HighFlex Hose 7/16" ID, 5/8" OD, 15.9/11.1mm, 2m, Green/UV Green
> - Mayhems Fine Silver Kill Coil
> 
> As I say, totally new to water cooling so if I've made any glaring errors I'm happy to learn


Hi mate.
I was doing my water cooling in the past few months, so if you go back in the thread you will see my pics + discussion about cooling the 295X2.

From my point of view, the Aquacomputer GPU block is the best one for the 295X2, plus they have an active backplate, which is also the best one. I also think it looks the best.




One 360mm rad will not handle a CPU + 295X2; your water temp will be really high. I think 2x 360mm rads is the minimum for "acceptable" water temps.
Also, do not use a silver kill coil with nickel-plated blocks, as it may corrode (wash away) the nickel plating on the blocks. Rather, use Mayhems X1, which contains the needed inhibitors and biocides to protect your loop.
I also believe that EK makes the best CPU blocks: the EK Supremacy EVO, or even the older EK Supremacy.
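A rough way to sanity-check the 2x 360mm figure: a common rule of thumb is very roughly 100W of heat dissipated per 120mm of radiator at moderate fan speeds (an assumption, not a spec; thick rads and fast fans do considerably better):

```python
import math

def rad_sections_needed(heat_watts, watts_per_120mm=100):
    """Rough 120mm-section count for a heat load, using a common
    ~100W-per-section rule of thumb (assumed; varies with rad thickness
    and fan speed)."""
    return math.ceil(heat_watts / watts_per_120mm)

# ~500W for a 295X2 under load + ~150W for an overclocked CPU (estimates):
sections = rad_sections_needed(500 + 150)  # 7 sections, e.g. a 360 + a 480
```

Which lands in the same ballpark as 2x 360mm (6 sections) for "acceptable" temps.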


----------



## electro2u

Considering they ship it with a single 120mm rad, and you can cool a 4790K adequately with a single 120mm rad, a single 360 can at least handle the situation. Water temps might get a bit into the 50s, but that's still going to give better core temps than a stock 295X2. I run trifire with a 295X2 and a 4820K on a 360 and a 280; it's not ideal, but the 295 doesn't throttle like it does stock.

That said, obviously more rads are better. Long as they have fans on 'em.


----------



## Orivaa

Speaking of poorly optimized games that won't start, Assassin's Creed Unity won't open up at all for me. I launch it, it gives me the little splash logo in the middle of the screen, and then nothing ever happens. Anyone familiar with this?


----------



## fat4l

Quote:


> Originally Posted by *electro2u*
> 
> Considering the fact they ship it with a single 120mm rad and you can cool a 4790k adequately with a single 120mm rad, a single 360 can at least handle the situation. Water temps might get a bit into the 50s but that's still going to give better core temps than stock 295x2. I run trifire with a 295 x2 and a 4820k on a 360 and a 280 and it's not ideal but the 295 doesn't throttle like it does stock.
> 
> That said obviously more rads is better. Long as they have fans on em.


There's no point in custom water cooling and spending a ****load of money to have 50C water and 5C better temps on the core lol.
When I had my stock cooler, push/pull SP120 Performance Edition, the card never throttled unless I ran Furmark. Even so, the rad temps spiked to the high 40s, which I don't consider "ok". Imho the stock cooler doesn't have enough cooling capacity considering how hot the card runs. If it was a 290X it would be ideal, but not for a 295X2, and I'm not even talking about the VRMs. Before I added the Mora to my system I ran 2x 240mm push/pull rads, and the water temp was about 32C looping Valley, which I also don't consider "ok" given an ambient temp of 19-22C.

Well, this is all just personal opinion and it may differ, so...


----------



## electro2u

Some people just want to go H2O for the looks. I am tapped out as far as radiator space and unwilling to go bigger. This is also why I'm building an SLI rig with 980s. I'll take that beer though!
/cheers


----------



## m00ter

Just caught up on last few pages....

Regarding orientation of the rad, I found it made a big difference to orient it the right way, i.e. with the tubes at the bottom. I initially had it so the tubes were on the side, and I kept getting throttled. After reorientation, temps dropped and no more throttling, so I guess there's some air in there?

I also added a couple of Cooler Master JetFlos hooked up in push/pull, connected to my fan controller with a Y-splitter. If I dial them all the way up (2160rpm) it sounds like a jet engine, but I don't top 50C!

As such I've now overclocked to the Sapphire spec (1030/1350, no change to voltage) and everything is cool, literally!


----------



## ssiperko

Quote:


> Originally Posted by *fat4l*
> 
> Theres no point in custom wcooling and spend ****load of money to have 50C water and 5C better temps on core lol.
> When I had my stock cooler, push pull SP120 performance ed, the card was never throttling unless I run furmark. Even tho, the rad temps spiked to high 40 which I dont consider "ok". Imho the stock cooler has not nuff cooling capacity considering how hot the card is. If it was 290X it would be ideal but not for 295X2 and I', not even talking about VRMs. Before I added mora to my system I run 2x240mm push pull rads and water temp was about 32C looping valley which I also dont believe is "ok" considering the fact that ambient temp is 19-22C.
> 
> Well this all is just about personal opinion and it may differ so...


What would you consider to be "ok" at 19-22C ambient............ 18-21C?









Adding 10-15C to ambient would be a nicely tuned system, unless you're running TECs.

SS


----------



## electro2u

Quote:


> Originally Posted by *m00ter*
> 
> Just caught up on last few pages....
> 
> Regarding orientation of the rad, I found it made a big difference for me to orientate it the right way - i.e. with tubes at the bottom. I initially had it so the tubes were on the side and I kept on getting throttled. After reorientation temps dropped and no more throttling, so I guess there's some air in there?
> 
> I also added a couple of Coolermaster Jetflo's hooked up in push/pull, connected to my fan controller with a Y splitter. If I dial them all the way up (2160rpm) it sounds like a jet engine but I don't top 50C!
> 
> As such I've now overclocked to the Sapphire spec (1030 / 1350, no change to voltage) and everything is cool, literally!


24C lower than what I was at stock... something don't add up. 50C max under prolonged load is not going to be possible with the 295X2 on stock; it idles at 40C.


----------



## m00ter

Jetflos? Carbide 540 case with a gazillion fans running through it?










I just set the fans at 1980rpm and had a couple of endurance races in GRID Autosport (running 3 monitors in Eyefinity, all candy turned up to max), and she topped out at 56C.

I'm idly wondering how far I can push the OC now. I just stuck it on "Sapphire settings" as a test, and as it worked I left it there. Hmmmm.


----------



## cliptags

Thanks for the tips everyone. I've been looking at how to fit more radiator space in after your comments because the HAF X really only allows for 1x 360mm and 1x120mm. I've planned out two potential loops with differing effects. What do you all reckon to the following:



So in this one I would remove the current HDD bay at the bottom right of the picture and put a 240mm radiator at the front of the case (where one of the intake fans currently is). I'd move the HDDs up into the 5.25"/hot-swappable bays. I can then move the displaced intake fan onto the back and keep some intake on the case. With this I could always add a 120mm rad at the back, but then I'd just be left with 1 intake fan, which wouldn't be great.

The other option as I see it is:



In this I'd keep the pump, res and 360mm radiator in the same spot, but mount the other radiator externally at the back of the case. This would keep the intake fans and HDDs where they are and would also allow me to put potentially a 360mm rad on the outside, but it obviously sacrifices the case's look a bit.

In both of these I've got the 290X as well as the 295X2 and the CPU in the loop. Do you think either of these setups would be enough for that, or should I remove the 290X from the loop altogether (as GPU 3, its temps don't seem to go above 65 anyway at the moment)?

Any advice on which of the two above would be most suitable/easiest to do would be appreciated! Or if I've totally got it wrong, let me know that as well









I've also had a look at the Aquacomputer GPU block and it looks pretty damn nice! I may well have been swayed! I think this is going to end up costing much more than I anticipated... Oh well!


----------



## joeh4384

Quote:


> Originally Posted by *m00ter*
> 
> Jetflos? Carbide 540 case with a gazillion fans running through it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just sat the fans at 1980rpm and had a couple of endurance races in GRID Autosport (running 3 monitors in eyefinity, all candy turned up to max) and she topped out at 56C.
> 
> I'm idly wondering how far I can push the OC now. I just stuck it on "Sapphire settings" as a test and as it worked I left it there. Hmmmm.


Are you sure crossfire is on? 56 load temp is what I get in Far Cry 4 on single GPU.


----------



## m00ter

Yup. If I enable the logo in CCC, it appears when the game loads. Both GPUs are definitely active too, according to HWiNFO64 anyway.


----------



## malik22

Hey guys, I'm about to receive my Sapphire 295X2. Can I use ASUS GPU Tweak with it? I really like that program.


----------



## Mega Man

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *electro2u*
> 
> Considering the fact they ship it with a single 120mm rad and you can cool a 4790k adequately with a single 120mm rad, a single 360 can at least handle the situation. Water temps might get a bit into the 50s but that's still going to give better core temps than stock 295x2. I run trifire with a 295 x2 and a 4820k on a 360 and a 280 and it's not ideal but the 295 doesn't throttle like it does stock.
> 
> That said obviously more rads is better. Long as they have fans on em.
> 
> 
> 
> Theres no point in custom wcooling and spend ****load of money to have 50C water and 5C better temps on core lol.
> When I had my stock cooler, push pull SP120 performance ed, the card was never throttling unless I run furmark. Even tho, the rad temps spiked to high 40 which I dont consider "ok". Imho the stock cooler has not nuff cooling capacity considering how hot the card is. If it was 290X it would be ideal but not for 295X2 and I', not even talking about VRMs. Before I added mora to my system I run 2x240mm push pull rads and water temp was about 32C looping valley which I also dont believe is "ok" considering the fact that ambient temp is 19-22C.
> 
> Well this all is just about personal opinion and it may differ so...
Click to expand...

Ummm silence? Some people just want silence


----------



## fat4l

Quote:


> Originally Posted by *ssiperko*
> 
> What would you consider to be "ok" at 19-22C ............ 18-21C?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Adding 10-15C to would be a nicely tuned system unless you're running TEC's.
> 
> SS


I'm getting deltaT = 6-7C.
I'd be happy with up to 10C...


----------



## electro2u

I'm happy with my 295x2 not throttling.


----------



## m00ter

Quote:


> Originally Posted by *electro2u*
> 
> 24C lower than what I was at stock... Something don't add up. 50C max under prolonged load is not going to be possible with the 295x2 on stock. It idles at 40c


Well, it is possible, and mine idles at 28-29C. I literally just booted her up, but here's a shot at idle. I'm about to fire up GRID Autosport; I'll leave HWiNFO running in the background and post another screenie in an hour or so


----------



## joeh4384

Quote:


> Originally Posted by *m00ter*
> 
> Well, it is possible, and mine idles at 28 - 29C. I literally just booted her up, but here's a shot at idle. I'm about to fire up GRID Autosport, I'll leave HWiNFO running in the background and post another screenie in an hour or so


Can you run valley or heaven? Is your PC in an igloo?


----------



## m00ter

In a bit, I'm still racing. And kind of, I guess:



Note the external h100i







It was the only way to get a 2nd fan on the 295x2 rad.


----------



## m00ter

Here's the evidence, m'lud. An hour's gaming in Eyefinity @ 1100/1350 with the fans at 1980rpm, and both cores topped out at 54C. Which I'm very, very happy with.



Valley coming next...


----------



## caste1200

Hey guys, have any of you replaced the stock fan on the radiator?


----------



## m00ter

Yes. 2x Cooler Master JetFlos in push/pull at full whack is mental.

Oh, and here's Valley @ 1920x1080 single screen (45/46C hahah!):

And here's Valley @ 6100x1080 (56/58C, not bad for an OC, methinks?):

I guess Valley pushes the card a little more than gaming does, but the Mrs has put the heating on in the last hour, so maybe the extra couple of degrees is that


----------



## joeh4384

Quote:


> Originally Posted by *m00ter*
> 
> Yes. 2 x Coolermaster Jetflos in push/pull at full whack is mental
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, and here's Valley @ 1920 x 1080 single screen (45 / 46C hahah!):
> 
> 
> 
> And here's Valley @ 6100x1080 (56 / 58C, not bad for an OC me thinks?):
> 
> 
> 
> I guess Valley pushes the card a little more than gaming does, but the Mrs has put the heating on in the last hour so maybe the extra couple of degrees is that


Nice. I do not get anywhere close with Corsair SP120 Performance Editions, but then again my computer doesn't sound like an airplane.


----------



## steezebe

HELP:

I'm getting a 'red screen of death' when I benchmark in Valley. It happened yesterday, so I rebooted, and only one GPU core was showing up. So I rebooted again and switched to the stock GPU BIOS; the same thing happened. Now only one core is showing in GPU-Z.

What happened? And more importantly, what do I do???


----------



## m00ter

Clean driver uninstall and reinstall?

RMA?

And Joeh, I don't wind it up to airplane when gaming, more lawnmower


----------



## joeh4384

Quote:


> Originally Posted by *steezebe*
> 
> HELP:
> 
> I'm getting a 'red screen of death' when I benchmark in Valley. It happened yesterday, so I rebooted, and only one GPU core was showing up. so I did a reboot, and then switched to the stock GPU bios-- same thing happened again. Now only one core is showing in GPU-z.
> 
> What happened? And more importantly, what do I do???


Did you flash a different bios on the card?


----------



## steezebe

Quote:


> Originally Posted by *joeh4384*
> 
> Did you flash a different bios on the card?


Yeah, I got the XFX card and put the Sapphire OC BIOS on it.


----------



## joeh4384

Quote:


> Originally Posted by *steezebe*
> 
> Yeah I got the XFX card and put the Sapphire OC bios on it.


You flashed both master and slave bioses?


----------



## caste1200

Hey man, where did you plug both of the fans in? Does the card have enough juice to power two ~4W fans?


----------



## m00ter

I connected both fans to a Y-splitter and then connected that to a single channel on my fan controller. I don't use the card connector at all.

You could probably connect a Y-splitter to the card header itself without hassle, but I wanted to control the speeds.

I also ordered a cable from ModDIY that will give me control over the VRM fan, which will be nice. I still don't understand why they locked it.


----------



## caste1200

I'm just worried that the fans are a bit too powerful for the card; my case fan controller can't handle them.




2x Noiseblocker NB-eLoop B12-4 (120mm)... 4W, 12V




I'm guessing I'll have to get a fan controller or use the Corsair thingy from my H100i...

EDIT:

As I thought, the card can't even handle one of them haha. Next week I'm getting my new ASUS ROG Rampage V (X99) board and my new 5930K; I'll leave my fans at full speed until then and use the OC Panel that comes with the board.


----------



## m00ter

I'm running the new Lamptron FC5 V3 with a couple of LED JetFlos on a single channel and it's fine, but according to the specs it maxes out at 30W per channel, so not surprising really.

X99 sounds fun though. I've been loyal to AMD for years, but even I'm considering Intel for the next build. Although I fear it'll be a bit of a learning curve in terms of OC'ing and temps etc.
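The math on fan channels is just summed fan wattage against the channel rating; a quick check with the numbers in this thread (30W per FC5 channel, ~4W per fan; the 295X2's own fan header rating isn't published as far as I know, so that side stays a guess):

```python
def channel_ok(fan_watts, count, channel_limit_watts):
    """True if `count` fans of `fan_watts` each fit under a channel's rating."""
    return fan_watts * count <= channel_limit_watts

# Two ~4W fans on a 30W Lamptron FC5 channel: plenty of headroom.
two_fans_fit = channel_ok(4, 2, 30)  # True
```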


----------



## steezebe

Quote:


> Originally Posted by *joeh4384*
> 
> You flashed both master and slave bioses?


Oh gosh, no. I have the stock BIOS on 2, with the Sapphire on 1.

I'm guessing it's a driver issue; the cores are back, and without issues. I'm re-installing Windows at this point though, just because a bit too much Windows rot has accumulated. Plus I know there are Nvidia DLLs still on the computer, particularly in the System32 folder, but I'm not sure exactly which ones they are, and deleting willy-nilly in that folder usually necessitates a clean install anyway...


----------



## malik22

What programs can I use to OC the Sapphire 295X2? Does GPU Tweak work?


----------






## caste1200

Quote:


> Originally Posted by *m00ter*
> 
> I'm running the new Lamptron FC5 V3 with a couple of LED Jetflo's on a single channel and it's fine, but according to the specs it maxes out at 30w per channel so not surprising really.
> 
> X99 sounds fun though, I've been loyal to AMD for years but even I'm considering Intel for the next build. Although I fear it'll be a bit of a learning curve in terms of oc'ing and temps etc.


Same here, good times since the Athlon X2, but I finally decided to go to Intel... I'm excited about it!


----------



## magicase

Does anyone have trouble playing H1Z1 with their 295x2?

I can't get above 40fps even on the lowest settings, while others with Nvidia are getting 100+ fps.


----------



## fat4l

Quote:


> Originally Posted by *steezebe*
> 
> Oh gosh no. I have the stock bios on 2, with the Sapphire on 1.
> 
> I'm guessing it's a driver issue; the cores are back, and without issues. I'm re-installing windows at this point though, just because my windows rot has accumulated for a bit too much. Plus I know there are Nvidia dll's still on the computer, particularly in the System32 folder, but I'm not sure exactly which ones they are, and deleting willy-nilly in that folder usually constitutes a clean install anyway...


You prolly don't get it, mate.

Each GPU requires its own BIOS: a master and a slave.
When flashing a dual-GPU card (the 295X2), you need to use two BIOSes (a master and a slave BIOS) for each BIOS switch position (1 and 2).
However, you don't need to flash both switch positions, so on position 1 you can have the stock BIOS (master and slave) and on position 2 the Sapphire OC BIOS (master and slave).

I hope it makes sense...


----------



## steezebe

Quote:


> Originally Posted by *fat4l*
> 
> u prolly dont get it mate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Each gpu require its own bios. Master and slave.
> When flashing a dualchip card(295x2) u need to use 2 bioses(master and slave bios) for each bios switch position(1 and 2).
> However u dont need to flash both bios switches so on one position u have stock bios(master and slave) and on position 2 u have sapphire OC bios(master and slave).
> 
> I hope it makes sense.....


Absolument! I flashed both the master and slave. I was referring to the switch position.


----------



## Coppy

Hello everybody !

May i join the Club ?











Today with EK Waterblocks :


----------



## veaseomat

Add me to the club. I'm rocking an XFX R9 295X2 with 2 Sapphire R9 290 Tri-X OCs for QuadFire across three cards. Currently waiting on my new 1600W PSU to arrive, so right now I'm using a ghetto 2-PSU setup.


----------



## HoneyBadger84

Quote:


> Originally Posted by *veaseomat*
> 
> Add me to the club. I'm rocking an xfx r9 295x2 with 2 sapphire r9 290 tri-x oc for a 3way quadfire. Currently waiting on my new 1600w psu to arrive so right now I'm using a ghetto 2psu setup.


Heh, looks more of a monstrosity than when I first set up a dual-PSU quadfire rig before swapping to a 1600W... back in the day (about 6 months ago lol)! This round's quadfire was much simpler. Even though I hate the LEPA G 1600W due to its stiff plugs & wires, it definitely handles 2 295X2s no problem.


----------



## meankeys

Hi all,
I wanted to try a little experiment and run three or four of these R9 295X2s on the bench. I can't get my board to POST; it just loops on Q-code 94/95 (PCI enumeration). I tried on both my RVE and my RIVBE. Any input would be appreciated.

I know I can't oct-fire, but I should be able to run these cards independently. I want to do a RealBench run with them.
Thanks

http://hwbot.org/user/meankeys/


----------



## DividebyZERO

Quote:


> Originally Posted by *meankeys*
> 
> Hi all,
> I wanted to try a little experiment and run three or four of these R9 295X2s on the bench. I can't get my board to POST; it just loops on Q-code 94/95 (PCI enumeration). I tried on both my RVE and my RIVBE. Any input would be appreciated.
> 
> I know I can't oct-fire, but I should be able to run these cards independently. I want to do a RealBench run with them.
> Thanks
> 
> http://hwbot.org/user/meankeys/


can you try adding one at a time and test boot? Maybe its some sort of power issue?


----------



## HoneyBadger84

I'm assuming this is normal but thought I'd post up out of curiosity:

GPU 1 is running about 2C cooler than GPU 2 during gaming & heavy use (1 is peaking between 52-58C depending on use, 2 is peaking at 54-60C)

My thought process is liquid goes from one GPU to the other & is slightly warmer when it gets there, so herp derp, it would run warmer, just wanting to confirm. lol


----------



## joeh4384

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm assuming this is normal but thought I'd post up out of curiosity:
> 
> GPU 1 is running about 2C cooler than GPU 2 during gaming & heavy use (1 is peaking between 52-58C depending on use, 2 is peaking at 54-60C)
> 
> My thought process is liquid goes from one GPU to the other & is slightly warmer when it gets there, so herp derp, it would run warmer, just wanting to confirm. lol


It seems like GPU 2 for me is always 1-2 degrees warmer.


----------



## meankeys

You would be correct in that assumption on a dual-card water-cooled setup. The R9 295X2s each have their own rad and run independently from one another. Even when pushing them, the temps stayed around 32C.

http://hwbot.org/submission/2693878_meankeys_3dmark___fire_strike_extreme_2x_radeon_r9_295x2_18168_marks


----------



## meankeys

Quote:


> Originally Posted by *DividebyZERO*
> 
> can you try adding one at a time and test boot? Maybe its some sort of power issue?


Two boot fine. It's when I add the third card that I get the boot loop. I did manage to boot 3 cards on my RIVBE, but GPU-Z reported 3 GPUs, not 6, so no joy. I do have a modded driver to run 6+ GPUs on Win 7 64.
Power is not a problem: I am using a 1200W on the main bench rig and run the other two cards on a 1000W.

The lights dim a bit when I fire it up, but lol jk.
Thanks for the input


----------



## steezebe

Whelp, an update on the RSoD with my 295... I've tried everything, but to no avail. I've re-installed Windows and a fresh driver set from a stock MB, and I'm still getting them. Plus, the card won't boot with both cores; the only way I've found to get both cores to boot is to clear the CMOS. Needless to say, it's an RMA...

Has anyone else experienced this? Part of me is thinking that it may be due to me using a PCIe riser card; would that make a difference? Reviews have shown the 295 doesn't use MB power, so I'm thinking that's not the case. My other thought would be that it's my MB itself, but I'm not sure why.

Insight is welcome!


----------



## Lordevan83

Delete


----------



## Lordevan83

My 2x 295X2s seem to be running close to the 75C throttle point, most of the time 72C+ when gaming. Should I go push/pull? If so, what fans should I go for? I would like something quieter than stock.

At this point, do you think a waterblock is worth it since the new gen is only a few months away?


----------



## meankeys

Quote:


> Originally Posted by *steezebe*
> 
> Whelp an update to the RSoD with my 295... I've tried everything, but to no avail. I've re-installed windows and a fresh driver set from a stock MB, and I'm still getting them. Plus, the card won't boot with both cores; the only way I've found to get both cores to boot is to clear the CMOS. Needless to say, it's a RMA...
> 
> Has anyone else experienced this? Part of me is thinking that it may be due to me using a PCI-e riser card--- would that make a difference? Reviews have shown the 295 doesn't use MB power, so I'm thinking that's not the case. My other thought would be that it's my MB itself, but I'm not sure why.
> 
> Insight is welcome!


Have you tried installing without the riser card? And what PCIe slot is it in?


----------



## meankeys

Quote:


> Originally Posted by *Lordevan83*
> 
> My 2x 295X2s seem to be running close to the 75C throttle point, most of the time 72C+ when gaming. Should I go push/pull? If so, what fans should I go for? I would like something quieter than stock.
> 
> At this point, do you think a waterblock is worth it since the new gen is only a few months away?


Are you pushing air from inside your PC through the rads, or are you pulling cool air into the PC through the rads of the video cards (preferred)?


----------



## Mega Man

Quote:


> Originally Posted by *meankeys*
> 
> Quote:
> 
> 
> 
> Originally Posted by *steezebe*
> 
> Whelp an update to the RSoD with my 295... I've tried everything, but to no avail. I've re-installed windows and a fresh driver set from a stock MB, and I'm still getting them. Plus, the card won't boot with both cores; the only way I've found to get both cores to boot is to clear the CMOS. Needless to say, it's a RMA...
> 
> Has anyone else experienced this? Part of me is thinking that it may be due to me using a PCI-e riser card--- would that make a difference? Reviews have shown the 295 doesn't use MB power, so I'm thinking that's not the case. My other thought would be that it's my MB itself, but I'm not sure why.
> 
> Insight is welcome!
> 
> 
> 
> Have you tried installing with out the riser card/ and what pcie slot is it in?
Click to expand...

+1

This is why I don't trust riser cards, as most people buy the eBay ones (not saying you did or didn't)!


----------



## HoneyBadger84

Quote:


> Originally Posted by *Lordevan83*
> 
> My 2x 295X2s seem to be running close to the 75C throttle point, most of the time 72C+ when gaming. Should I go push/pull? If so, what fans should I go for? I would like something quieter than stock.
> 
> At this point, do you think a waterblock is worth it since the new gen is only a few months away?


The Corsair SP120 fans I just got seem to be doing an excellent job thus far. I have one set up as push and the stock fan set up as pull; I imagine getting a 2-pack like I did and going push/pull with both would work very well. SP fans are static-pressure designs, so they're made for radiator use and have solid CFM. Don't know about noise levels though; my system has a general hum to it because of all the fans, so I wouldn't really notice if it were "louder" than the stock one... I'd wager it's not, as most Corsair fans are marketed as "quieter" or "low-noise".


----------



## HemantThakur

Hi Guys,

Just received my XFX 295x2 yesterday

Setup completed on a 1080p Dell monitor via DP to HDMI.

The question I wanted to ask: even at max resolution, the display is always in windowed mode? Games, benchmarks, the normal home screen, everything...

Is this normal or am I doing something wrong?

Do advise.

Cheers


----------



## Lordevan83

I will try pulling air from outside the case instead of pushing from inside to see if it gets better results. Thanks.


----------



## Lordevan83

If I use a 3-pin splitter for push/pull, will the card be able to manage the fan speed, or will the fans be at max speed all the time?


----------



## ramos29

Quote:


> Originally Posted by *HemantThakur*
> 
> Hi Guys,
> 
> Just received my XFX 295x2 yesterday
> 
> Setup completed on 1080 p Dell monitor via DP to HDMI.
> 
> The question i wanted to ask is that even at max resolution display is on Windowed mode always??? Games, Benchmarks, normal home screen everything...
> 
> Is this normal or i am doing something wrong?
> 
> Do advice.
> 
> Cheers


If you press Alt-Enter, does it fix your problem?
I was going to ask about borderless gaming; it looks like the game runs smoother in windowed fullscreen mode. There's a 20 fps drop, but the game is silky smooth, and since I am using a TV this fixes part of my input lag.


----------



## doctakedooty

Well guys, the 295X2 has been a fun card, but sadly I am going to replace it and my power supply now. Tonight I was gaming and smelled a horrible smell, and realized my 295X2 had caused my Seasonic 1250W PSU wires to catch on fire. Luckily no major damage was done; the PSU is fine, but 2 PCIe connectors on it were destroyed. At this point this card is too dangerous for me, so I am going to sell my EK waterblock and move to either 980 SLI or back to 780 Ti SLI. Sad day, I really liked the 295X2 too.


----------



## ramos29

Quote:


> Originally Posted by *doctakedooty*
> 
> Well guys the 295x2 has been a fun card but sadly I am going to replace it and my power supply now. Tonight I was gaming and smelt a horrible smell and realized my 295x2 had caused my seasonic 1250w psu wires to catch on fire luckily no hard damage was done the psu is fine but 2 pcie connectors were destroyed on the psu. At this point this card for me is to dangerous so I am going to sell my ek waterblock and move to either 980 sli or back to 780 ti sli. Sad day I really liked the 295x2 too.


Sad to hear that :/ If the PSU caught fire, maybe it's the one responsible. I think you plugged the GPU in the wrong way, that's why. Your 295X2 is innocent, and if you still think it's guilty I can buy it from you.


----------



## DividebyZERO

Quote:


> Originally Posted by *ramos29*
> 
> Sad to hear that :/ If the PSU caught fire, maybe it's the one responsible. I think you plugged the GPU in the wrong way, that's why. Your 295X2 is innocent, and if you still think it's guilty I can buy it from you.


I had something like this happen a few years ago when I was running dual PSUs and quad GTX 480s. I found out it was my fault for not using the extra PCIe power plug on the main board. Mine melted the 12V lines over my 24-pin ATX and part of the socket on the mainboard. Since it was modular, I was able to replace the cable and fix the mobo connector. Didn't have any more issues after that.


----------



## doctakedooty

Quote:


> Originally Posted by *ramos29*
> 
> Sad to hear that :/ If the PSU caught fire, maybe it's the one responsible. I think you plugged the GPU in the wrong way, that's why. Your 295X2 is innocent, and if you still think it's guilty I can buy it from you.


Could have been me, but I highly doubt it: the 8-pins were not daisy-chained on the same PCIe connector on the PSU. Each 8-pin had its own dedicated PCIe plug on the PSU and was not Y-split. The board also had the extra power plugged into the PCIe extra-power connector on the board.


----------



## Ramzaiii

Morning everyone. New to the forum! I just finished my rebuild! I bought my 295X2 two weeks ago and all my water cooling stuff got here last night. The gist of my rig: the R9 in an open loop with XSPC GPU and CPU waterblocks, in my Corsair 250D.

I'm about 95% done. Just need to get one more thing in and clean up my cables.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Lordevan83*
> 
> if I use a 3 pin splitter for push pull, will the card be able to manage fan speed, or will fans be max speed all the time?


I'd recommend not plugging both fans into the card, as it already has a high power profile. If you want fan-speed management, it would be better to plug them into a motherboard header whose speed you can manage. TBH, it's best to run at least the push fan at full speed. Unless you get an obnoxiously loud fan, I doubt you'll notice it.


----------



## meankeys

Quote:


> Originally Posted by *HemantThakur*
> 
> Hi Guys,
> 
> Just received my XFX 295x2 yesterday
> 
> Setup completed on 1080 p Dell monitor via DP to HDMI.
> 
> The question i wanted to ask is that even at max resolution display is on Windowed mode always??? Games, Benchmarks, normal home screen everything...
> 
> Is this normal or i am doing something wrong?
> 
> Do advice.
> 
> Cheers


You should be able to run full screen, not windowed. There's got to be a setting somewhere. What are your system specs?


----------



## F4ze0ne

Quote:


> Originally Posted by *doctakedooty*
> 
> Well guys the 295x2 has been a fun card but sadly I am going to replace it and my power supply now. Tonight I was gaming and smelt a horrible smell and realized my 295x2 had caused my seasonic 1250w psu wires to catch on fire luckily no hard damage was done the psu is fine but 2 pcie connectors were destroyed on the psu. At this point this card for me is to dangerous so I am going to sell my ek waterblock and move to either 980 sli or back to 780 ti sli. Sad day I really liked the 295x2 too.


Sorry to hear about your card and psu. Maybe seasonic will help reimburse your loss?

Not sure if you have this particular version... but I've read online that the older X-1250 has 4 rails, and it can cause problems with the 295X2 because of the amps required. You're not the first guy to have a Seasonic burn up a video card; this also happened to a forum member with a single 290X card.

http://www.overclock.net/t/1461040/so-seasonic-saved-me-from-a-fire-or-did-it

Found this article from JG showing the XFX version based on the X-1250 has the same issue. *It was advertised as a single-rail design*.


----------



## wermad

Helping a friend plan a rig. He's interested in the 295X2 due to the CLC. What temps are typical? He didn't wanna go custom water due to the added costs. Most reviews have it at 70-75°C, but I hear it throttles down at 75+?


----------



## F4ze0ne

Quote:


> Originally Posted by *wermad*
> 
> Helping a friend with planning a rig. He's interested in the 295x2 due to the cls. What temps are typical? He didnt wanna go custom water due to the added costs. Most reviews have it at 70-75°, but I hear it throttles down at 75+, ???


It'll throttle if it goes above 75°C.

Mine runs around 68-70°C when benching. The VRMs also get really hot in the center of the card, so I have a side fan pushing air across it.


----------



## wermad

Thanks! He wants a 4k screen (I told him a wqhd might be better) and to run SoM, so a strong card is in order. How are temps in gaming? Curious if there's any way to bypass this thermal limit?


----------



## F4ze0ne

Quote:


> Originally Posted by *wermad*
> 
> Thanks! He wants a 4k screen (I told him a wqhd might be better) and to run SoM, so a strong card is in order. How are temps in gaming? Curious if there's any way to bypass this thermal limit?


When I was playing DAI, I saw temps in the range of 65-71°C. This is after playing for an hour or so.

*Edit: Temps after running Metro LL bench maxed 10x looped:*



Spoiler: Warning: Spoiler!







As far as I know, the thermal limit of 75°C is locked down and can't be changed.

I'd vote for WQHD at this time. I think we need more power for 4K, although it's doable on this card. I'm planning to move up to 1440p this year when FreeSync monitors come out.
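The fixed 75°C throttle behavior discussed in this thread can be sketched as a tiny check. Only the limit value comes from the thread; the 5°C warning margin is an arbitrary choice for illustration:

```python
# Minimal sketch of a throttle check against the 295X2's fixed 75 C limit.
# The limit is the value reported in this thread; the warning margin is arbitrary.
THROTTLE_LIMIT_C = 75

def throttle_status(core_temp_c, warn_margin_c=5):
    """Classify a core temperature relative to the throttle point."""
    if core_temp_c >= THROTTLE_LIMIT_C:
        return "throttling"
    if core_temp_c >= THROTTLE_LIMIT_C - warn_margin_c:
        return "near limit"
    return "ok"

# A 71 C gaming peak sits inside the 5 C warning band:
print(throttle_status(71))  # → near limit
```

Something like this is handy in a monitoring script: log "near limit" events and you know when case airflow, not the card, is the problem.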


----------



## doctakedooty

Quote:


> Originally Posted by *F4ze0ne*
> 
> Sorry to hear about your card and psu. Maybe seasonic will help reimburse your loss?
> 
> Not sure if you have this particular version... But I've read online that the older x-1250 has 4 rails and it can cause problems with the 295x2 because of the amps required. You're not the first guy to have a seasonic burn up a video card. This also happened to a forum member on a single 290x card.
> 
> http://www.overclock.net/t/1461040/so-seasonic-saved-me-from-a-fire-or-did-it
> 
> Found this article from JG that shows the XFX version based on the x-1250 has the same issue. *It was advertised as a single rail design*.


That's the one I have, or had; it was split on 2 rails for the GPU. I've got some 980s on the way now, should be here Thursday. Now I just have to sell my EK 295X2 waterblock and backplate.


----------



## m00ter

I'm eagerly awaiting freesync monitors too. Nice ultrawide 21:9 will be lovely.

My 295x2 has no trouble playing any game flawlessly with my 3 x 1920x1080 monitors in eyefinity, and I imagine performance will be similar if not better on a single wide screen.


----------



## Mega Man

yea i been thinking the same


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Thanks! He wants a 4k screen (I told him a wqhd might be better) and to run SoM, so a strong card is in order. How are temps in gaming? Curious if there's any way to bypass this thermal limit?


I get up to 72C on the second core during summer, but I solved that by slapping an NF-F12 Industrial 3000 rpm fan on it. At 2000 rpm I never go over 70C, and at 2500 rpm I don't go over 65C playing DA:I.

I have Shadow of Mordor but haven't fired it up yet, and this card is perfect for 1440p 120Hz+ gaming imo... absolutely loving mine


----------



## wermad

Quote:


> Originally Posted by *F4ze0ne*
> 
> When I was playing DAI, I saw temps in the range of 65*-71*. This is after playing for an hour or so.
> 
> *Edit: Temps after running Metro LL bench maxed 10x looped:*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> As far as I know, the thermal limit of 75* is locked down and can't be changed.
> 
> I'd vote for WQHD at this time. I think we need more power for 4K. Although its doable on this card. I'm planning to move up to 1440 this year when freesync monitors come out.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I get up to 72c on the second core during summer but i solved that by slapping a NF-F12 Industrial 3k rpm fan on it, at 2000rpm i never go over 70c and at 2500rpm i dont go over 65c playing DA:I
> 
> I have Shadows of Mordor but havent fired it up yet and this card is perfect for 1440p 120hz+ gaming imo......absolutely loving mine
Click to expand...

Thanks guys. I'll relay my findings to my friend. I tried to gently persuade him toward WQHD with the upcoming Acer 144Hz IPS 1440p or the Swift (or the Monoprice 120Hz 30" 1600), but he wants to jump on the 4K bandwagon.

I've been tempted to get a couple myself since I have three Monsta 480s to tackle the heat. Prices on eBay seem to hover around $600 for new ones (a new one with a new block sold for $660).


----------



## xarot

Hmm, I think I am seeing better scaling on my 295X2 than on my Ares III, but I don't understand why.

The difference, I believe, is that the Ares III isn't actually a 295X2, but rather dual 290X Matrix Platinum cards on one PCB. Also, the Ares III has an option to enable/disable CF and the 295X2 does not.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xarot*
> 
> Hmm I am thinking I am seeing better scaling on my 295X2 than my Ares III. But I don't understand why.
> 
> The difference is that I believe Ares III isn't actually an 295X2, but rather dual 290X Matrix Platinum cards on one PCB. Also the Ares III has an option to enable/disable CF and 295X2 does not.


That is interesting indeed. I wish I had the option to disable CF for some games (I know I can disable it through profiles, but it's a pain). Are you able to see VRM temps as well, by chance?


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I get up to 72c on the second core during summer but i solved that by slapping a NF-F12 Industrial 3k rpm fan on it, at 2000rpm i never go over 70c and at 2500rpm i dont go over 65c playing DA:I
> 
> I have Shadows of Mordor but havent fired it up yet and this card is perfect for 1440p 120hz+ gaming imo......absolutely loving mine


What kind of ambient temperatures do you have for it to get to 72c? Is it possible to change the stock fan out?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I get up to 72c on the second core during summer but i solved that by slapping a NF-F12 Industrial 3k rpm fan on it, at 2000rpm i never go over 70c and at 2500rpm i dont go over 65c playing DA:I
> 
> I have Shadows of Mordor but havent fired it up yet and this card is perfect for 1440p 120hz+ gaming imo......absolutely loving mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What kind of ambient temperatures do you have for it to get to 72c? Is it possible to change the stock fan out?
Click to expand...

Quote:


> Location: NSW, Australia


Ambient was 35C today :/

Hence the reason I swapped out the stock fan. Only running one Industrial on it atm; might get another at some point, or just get a full-cover block for it, but I'm more worried about raising my CPU temp during summer.


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ambient was 35c today :/
> 
> hence the reason i swapped out the stock fan, Only running one industrial on it atm, might get another at some point or just get a fullcover block for it but more worried about raising my CPU Temp during summer though


Damn, is the industrial at 2k rpm loud?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Ambient was 35c today :/
> 
> hence the reason i swapped out the stock fan, Only running one industrial on it atm, might get another at some point or just get a fullcover block for it but more worried about raising my CPU Temp during summer though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn, is the industrial at 2k rpm loud?
Click to expand...

Nope. At 2500 it's quite noticeable, and at 3000 it's very, very loud, but the amount of air it pushes through is amazing... thank you, headset!!


----------



## shcsha

Do you connect your noctua fan to the motherboard?

I have an NF-F12 Industrial that won't work on the R9 295X2's GPU fan header. When I use a motherboard fan header, I can't control it using SpeedFan because my motherboard is an Asus Z87I-Pro. The Noctua is much quieter than the stock fan.

Do you have any advice for me to make the most of my situation?


----------



## Sgt Bilko

Quote:


> Originally Posted by *shcsha*
> 
> Do you connect your noctua fan to the motherboard?
> 
> I have a NF-F12 industrial, that won't work on the r9295x2 GPU fan header. When I use the motherboard fan header, I can't control it using speedfan because my motherboard is an asus z87ipro. The noctua is much quieter than the stock fan.
> 
> Do you have any advice for me to make the most of my situation?


I'm using a fan controller actually and yeah, the 295x2's header can only take a 3 pin connection.

Not sure why you can't control it via the motherboard though...


----------



## shcsha

I can control it using Fan Xpert 2, but not SpeedFan, so I can't link the fan speed to GPU temps.


----------



## wermad

So the stock cooler hits its thermal limit pretty easily? Hmmmm... might have to warn my friend. Maybe two 290Xs or 980s with non-reference coolers. Seems like this beast is most at home on a bigger rad or a custom loop/block.

Curious if anyone has tried adding a larger aluminum rad (240, 360, 280)? I know it would void your warranty, but some like to roll like that


----------



## F4ze0ne

Quote:


> Originally Posted by *wermad*
> 
> So the stock cooler hits its thermal limit pretty easily? Hmmmm...might have to warn my friend. Maybe two 290x or 980s w/ non-reference coolers. Seems like this beast is best at home on a bigger rad or custom loop/block.


Mine hasn't throttled yet. Even after hours of playing games it never hit 75°C.

If someone is having throttling issues, there must be airflow problems inside the case causing overheating. I've test-mounted the rad on the back and the top of the case and get pretty much the same thermal results.


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> So the stock cooler hits its thermal limit pretty easily? Hmmmm...might have to warn my friend. Maybe two 290x or 980s w/ non-reference coolers. Seems like this beast is best at home on a bigger rad or custom loop/block.
> 
> Curious if anyone tried adding a larger aluminum rad (240, 360, 280)? I know it would void your warranty but some like to roll like that


I use my AQ to control them, and actually they hardly ever go above 50C.


----------



## wermad

The issue is that my friend lives in El Centro, and it's an oven over there even during the winters. Colorado is a nice place to be during the winter to dispel the heat from your rig. I'll leave it up to him, but I think this card just won't have enough stock cooling where he lives. There's always Pirate Islands if it comes close to a 295X2. I may sell him two of my Sapphire Tri-Xs if he wants.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> So the stock cooler hits its thermal limit pretty easily? Hmmmm...might have to warn my friend. Maybe two 290x or 980s w/ non-reference coolers. Seems like this beast is best at home on a bigger rad or custom loop/block.
> 
> Curious if anyone tried adding a larger aluminum rad (240, 360, 280)? I know it would void your warranty but some like to roll like that


The only time mine ever throttled was with 38C ambients, and I was overclocked to 1150/1500 with 2 R9 290s jammed in there as well, hehe.

But no, I don't have temp issues after I sorted out my case airflow and actually did some cable management.


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Only time mine ever throttled was with 38c ambients and i was overclocked 1150/1500 with 2 R9 290's jammed in there as well hehe.
> 
> But no, i dont have temp issues after i sorted out my case airflow and actually did some cable management


You have pics? Thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Only time mine ever throttled was with 38c ambients and i was overclocked 1150/1500 with 2 R9 290's jammed in there as well hehe.
> 
> But no, i dont have temp issues after i sorted out my case airflow and actually did some cable management
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have pics? Thanks
Click to expand...

Pics of my rig or temps?


----------



## malik22

Hey guys, will my Thermaltake 1200W TP be enough for this card and an OC'd 4770K? Here are the stats of the PSU; 12V4 and 12V3 have 36A each.

AC Input: 100V-240V, 15A, 47-63 Hz
DC Output: +12V1 20A, +12V4 36A, +3.3V 30A, +12V2 20A, +12V3 36A, +5V 30A, -12V 0.8A, +5VSB 3.5A
Combined: 600W + 600W across the 12V rails, 9.6W (-12V), 17.5W (+5VSB); 1200W total


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Pics of my rig or temps?


rig, so I can show my friend on your setup. Thanks








Quote:


> Originally Posted by *malik22*
> 
> Hey guys, will my Thermaltake 1200W TP be enough for this card and an OC'd 4770K? Here are the stats of the PSU; 12V4 and 12V3 have 36A each.
> 
> AC Input: 100V-240V, 15A, 47-63 Hz
> DC Output: +12V1 20A, +12V4 36A, +3.3V 30A, +12V2 20A, +12V3 36A, +5V 30A, -12V 0.8A, +5VSB 3.5A
> Combined: 600W + 600W across the 12V rails, 9.6W (-12V), 17.5W (+5VSB); 1200W total


For a single 295X2, you're fine with that PSU. Per TT's site, I would recommend setting it up on the second rail (12V2).


Spoiler: Warning: Spoiler!



Compliance with Intel ATX 12V 2.3 & SSI EPS 12V 2.92 standards.
80 PLUS Gold certified - with 87-93% extreme high efficiency @ 20-100% load to cut down electric cost.
24/7 @ 50°C: Guaranteed to deliver 1200W continuous power.
Pure aesthetic design with uncompromising performance.
Proprietary dual-ball-bearing 14cm flower-shape fan enables longer lifespan and lowers overall noise output by dramatically reducing bearing friction.
100% 105°C (221°F) Japanese-made electrolytic & solid-state capacitors: provide uncompromised performance and reliability under the harshest operating environments.
Double-forward switching circuitry: offers low power loss and high reliability.
Unparalleled DC-to-DC converter provides highest efficiency, most stable performance, and perfect regulation.
3oz PCB design reduces heat generation and allows greater efficiency.
*Robust & dedicated +12V output: comes with dual +12V rail design providing up to 40A for 12V1 and 85A for 12V2.*
FanDelayCool Technology allows the 14cm fan to continue to operate 15-30 sec after system shut-down to ensure all components are properly cooled.
Multi-GPU ready: comes with 8 x PCI-E 6+2pin connectors for cutting-edge gaming machines.
Auto-switching circuitry for universal AC input from 90-264V.
Active Power Factor Correction (PFC) with PF value of 0.95 at full load.
High reliability: MTBF > 120,000 hours.
DIMENSION: 5.9"(W) x 3.4"(H) x 7.1"(L); 150mm(W) x 86mm(H) x 180mm(L)
Built-in industry-grade protections: Over Current, Over Power, Over Voltage, Under Voltage, Over Temperature and Short-Circuit protection. Safety / EMI Approvals: CE, TUV, FCC, UL, CUL, GOST and BSMI certified.

http://www.thermaltakeusa.com/Power_Supply/Toughpower_Series_/Toughpower_Grand/C_00001718/Toughpower_Grand_1200W/Design.htm
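As a sanity check on those rail numbers: assuming the commonly cited ~500W board power for the 295X2 (a figure not from this thread), some quick 12V arithmetic shows why spreading the two 8-pins across separate rails matters:

```python
# Quick 12V rail arithmetic for a 295X2 on a multi-rail PSU.
# The ~500 W board power is the commonly cited figure, not measured here.
CARD_POWER_W = 500
RAIL_VOLTAGE = 12.0

def rail_watts(amps, volts=RAIL_VOLTAGE):
    """Maximum wattage a rail can deliver at its rated amperage."""
    return amps * volts

def card_amps(power_w, volts=RAIL_VOLTAGE):
    """Total 12 V current a given board power implies."""
    return power_w / volts

# Each 36 A rail can supply 432 W, but the card alone wants ~42 A total,
# so putting both 8-pins on one rail leaves no headroom; split them.
print(rail_watts(36))            # → 432.0
print(round(card_amps(500), 1))  # → 41.7
```

In other words, the card's draw exceeds any single 36A rail's budget once overhead is counted, which is exactly why daisy-chaining both 8-pins off one rail (or one cable) is the risky configuration.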


----------



## ebhsimon

@Sgt Bilko What are your VRM temps like?


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Pics of my rig or temps?
> 
> 
> 
> rig, so I can show my friend on your setup. Thanks
Click to expand...

Here's a couple:




There are no fans on the bottom or on the side panel... been thinking about it, but I'm pretty sure I'd butcher my case.








Quote:


> Originally Posted by *ebhsimon*
> 
> @Sgt Bilko What are your VRM temps like?


I have no idea; afaik you can't read the VRM temps on a 295X2, which is why I was asking if @xarot could or not, since they have the option to disable Crossfire unlike the rest of us.


----------



## wermad

Thanks! Big fans btw! I gave him the info and I'll wait for his reply. I think he might go with a 980 and a G-Sync monitor, since my cousin has a Maxwell setup and I told him about G-Sync.

I have my SP120s on 5V as I can't stand them at 12V. I may sell my rig and go small, since I've been itching to go SFF for a while. Big rigs are fun and you have a ton of room, but they're cumbersome to move around when you have to, and they take up a lot of space. Right now my 900D is sitting under my desk since we downsized this room to make it more kiddie-friendly. Eventually the kids will take over completely and I won't have room for it. Dreams of a 5x1 return are over.

Hopefully my pal can join you guys; if not, I may. Don't hold me to it


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Thanks! Big fans btw! I gave him the info and I'll wait for his reply. I think he might go w/ a 980 and a gsync monitor since my cousin has a Maxwell setup and i told him about Gysnc.
> 
> I have my sp120s on 5v as I can't stand them at 12v. I may sell my rig and go small since I've been itching to go sff for a while. Big rigs are fun and you have a ton of room but they're cumbersome to move around when you have to and they take up a lot of room. Right now I have my 900D is sitting under my desk since we downsized this room to make it more kiddie friendly. Eventually the kids will take over completely and i won't have room for. Dreams of 5x1 return are over (
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> Hopefully my pal can join you guys if not, I may. Don't hold it to me


lol, all good mate, those are only 120mm fans, running 5 of them atm, 4 on 2 x 240mm rads and one for the 295x2's 120mm


----------



## xarot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here's a couple:
> 
> 
> 
> 
> There is no fans on the bottom or on the sidepanel....been thinking about it but i'm pretty sure i'd butcher my case
> 
> 
> 
> 
> 
> 
> 
> 
> I have no idea, afaik you can't read the vrm temps on a 295x2 which is why i was asking if @xarot could or not since they have the option to disable Crossfire unlike the rest of us


I don't have my Ares III installed right now, so I cannot remember...I became a father, and a bit before that I got rid of my WC setup to make life easier for the first months, going to install it sooner or later though









But I think I found the answer to your question, which is yes; scroll down to the GPU-Z screenies: http://www.overclex.net/articles/test-de-lasus-ares-iii-bi-gpu-r9-290x/4/


----------



## Sgt Bilko

Quote:


> Originally Posted by *xarot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Here's a couple:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is no fans on the bottom or on the sidepanel....been thinking about it but i'm pretty sure i'd butcher my case
> 
> 
> 
> 
> 
> 
> 
> 
> I have no idea, afaik you can't read the vrm temps on a 295x2 which is why i was asking if @xarot could or not since they have the option to disable Crossfire unlike the rest of us
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't have my Ares III installed right now, so I cannot remember...I became a father, and a bit before that I got rid of my WC setup to make life easier for the first months, going to install it sooner or later though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I think I found answer to your question, which is yes, scroll down to GPU-Z screenies: http://www.overclex.net/articles/test-de-lasus-ares-iii-bi-gpu-r9-290x/4/

Interesting, so the Ares 3 reads as Hawaii while the 295x2 reads as Vesuvius. Maybe the Ares has more in common with the Devil 13 than it does with the 295x2? And congrats on becoming a Daddy!


----------



## xarot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Interesting, so the Ares 3 reads as Hawaii while the 295x2 reads as Vesuvius, maybe the Ares has more in common with the Devil 13 than it does the 295x2? and congrats on becoming a Daddy!


Thanks for the kind words









Yeah, I think Ares III is even being sold as 'dual 290X' rather than 295X2:

http://www.asus.com/Graphics_Cards/ROG_ARESIII8GD5/

"Dual R9 290X factory-overclocked to 1030MHz" and "Powered by dual AMD Radeon™ Hawaii XT GPUs factory-overclocked at 1030MHz"


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There is no fans on the bottom or on the sidepanel....been thinking about it but i'm pretty sure i'd butcher my case
> 
> 
> 
> 
> 
> 
> 
> 
> I have no idea, afaik you can't read the vrm temps on a 295x2 which is why i was asking if @xarot could or not since they have the option to disable Crossfire unlike the rest of us


I tried cutting a hole in my windowed side panel. It works, but it wasn't very pretty... Not sure how it would fare with 2 cards, but I get lower temps with my side panel on rather than off, so it'd probably screw up the air flow if i chucked it on now.
It's unfortunate that the VRM temps are hidden.. Oh well. Also nice fans you have there. Are they all 3000 rpm models? Do you let them spin up to 3000 rpm? I wanted to get some Noctuas, but 5 of them costs ~$200 - $40 a pop for the 140mm variants. Ended up getting Cougar fans which are quiet, but also don't move that much air especially through a radiator. The ones on my radiator occasionally click though, which is annoying.


----------



## joeh4384

Quote:


> Originally Posted by *ebhsimon*
> 
> I tried cutting a hole in my windowed side panel. It works, but it wasn't very pretty... Not sure how it would fare with 2 cards, but I get lower temps with my side panel on rather than off, so it'd probably screw up the air flow if i chucked it on now.
> It's unfortunate that the VRM temps are hidden.. Oh well. Also nice fans you have there. Are they all 3000 rpm models? Do you let them spin up to 3000 rpm? I wanted to get some Noctuas, but 5 of them costs ~$200 - $40 a pop for the 140mm variants. Ended up getting Cougar fans which are quiet, but also don't move that much air especially through a radiator. The ones on my radiator occasionally click though, which is annoying.


I suspect AMD didn't want everyone to see how hot they run. I pointed my IR temp gun at them and temps ranged from 80C to 95C.


----------



## ebhsimon

Quote:


> Originally Posted by *joeh4384*
> 
> I suspect AMD didn't want everyone to see how hot they run. I pointed my temp ir gun at them and temp ranges were from 80c to 95c.


Interesting, I'll do that to my 290s now. Should I be overclocked or stock? Are you overclocked or stock? What's your ambient temp? Shoot the thermometer at the wall next to you.

I'll shoot it just above where the VRMs are and compare it to the GPU-Z reading


----------



## joeh4384

Quote:


> Originally Posted by *ebhsimon*
> 
> Interesting, I'll do that to my 290s now. Should I be overclocked or stock? Are you overclocked or stock? What's your ambient temp? Shoot the thermometer at the wall next to you.
> 
> I'll shoot it just above where the VRMs are and compare it to the GPU-Z reading


That was on stock but with the OC bios that has the core at 1030. I am at work now but have no idea how accurate my ir gun is as it was pretty cheap.


----------



## ebhsimon

With an ambient temp of 24C (actually this is irrelevant here), I concluded that I can't accurately measure the VRM temps since I don't know exactly where the VRMs are (I think they're close to the power connectors), but the backplate temps don't reflect them. I don't know if it's because of the back plate or what, but I'm confused.

According to GPU-Z:
VRM1: 62C
VRM2: 71C

I only got close to these numbers when I aimed the ir gun directly at the PCB through a little hole in the back plate (as shown in pictures). While I pointed at the back plate I got temps ranging from 48 to 67.








@joeh4384 my ir gun was pretty cheap too, but I think it is quite accurate (although I have no way to calibrate it).


----------



## Sploosh

Question. If I put a 295x2 and a 290 together in trifire, will the 290 throttle the 295x2? I set up my 290 with its own rad via the G10 bracket, but I'm having issues making the radiator fit (the radiator interferes with the power supply wiring when I try to mount it to the bottom vents, and the tubing isn't long enough to put the rad in a different orientation). Before I start looking into custom WC for a solution to last the next two years until the eventual upgrade, I'd like to know for sure that the 290 won't drag things down.


----------



## SLADEizGOD

Quote:


> Originally Posted by *xarot*
> 
> I don't have my Ares III installed right now, so I cannot remember...I became a father, and a bit before that I got rid of my WC setup to make life easier for the first months, going to install it sooner or later though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I think I found answer to your question, which is yes, scroll down to GPU-Z screenies: http://www.overclex.net/articles/test-de-lasus-ares-iii-bi-gpu-r9-290x/4/


Nice setup. And congratz on being a dad. I just can't wait to get my second XFX R9 295x2.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> There is no fans on the bottom or on the sidepanel....been thinking about it but i'm pretty sure i'd butcher my case
> 
> 
> 
> 
> 
> 
> 
> 
> I have no idea, afaik you can't read the vrm temps on a 295x2 which is why i was asking if @xarot could or not since they have the option to disable Crossfire unlike the rest of us
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tried cutting a hole in my windowed side panel. It works, but it wasn't very pretty... Not sure how it would fare with 2 cards, but I get lower temps with my side panel on rather than off, so it'd probably screw up the air flow if i chucked it on now.
> It's unfortunate that the VRM temps are hidden.. Oh well. Also nice fans you have there. Are they all 3000 rpm models? Do you let them spin up to 3000 rpm? I wanted to get some Noctuas, but 5 of them costs ~$200 - $40 a pop for the 140mm variants. Ended up getting Cougar fans which are quiet, but also don't move that much air especially through a radiator. The ones on my radiator occasionally click though, which is annoying.

I don't have the 290 in there anymore so it's just the 295x2 and the fans are on a static 2k rpm all the time, i bump them up to 3k when stressing and they push a lot of air through the rads.....noise is...well, loud lol

Quote:


> Originally Posted by *Sploosh*
> 
> Question. If I put together an 295x2 and a 290 together in trifire, will the 290 throttle the 295x2? I set up my 290 with its own rad via the G10 bracket, but I'm having issues making the radiator fit (radiator interferes with the power supply wiring when I try to mount to the bottom vents, and tubing isn't long enough to put the rad in a different orientation). Before I start looking into custom WC for a solution to last the next two years until the eventual upgrade, I'd like to know for sure that the 290 won't drag things down.


I was running a 295x2 + XFX DD R9 290 and the 290 would auto clock itself up to 1018/1250 to match the 295x2's speeds and it was great. Never had any issues outside of games that didn't support trifire, so you'll be fine with it I imagine. There wasn't really a temp increase either from what I noticed; just make sure you have some airflow going and you'll be set


----------



## malik22

Hello guys, I just bought a used Sapphire 295x2 without accessories and I have a few questions.

1. What type of mounting screws does the rad use?

2. Can I use a Phobya 4-pin PWM to 3-pin adapter to connect a Corsair SP120 PWM (4-pin) to the rad and the GPU itself?

3. And I have to buy a Mini DisplayPort to DisplayPort adapter to connect a DisplayPort cable to a 4K screen, correct?


----------



## F4ze0ne

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i was running a 295x2 + XFX DD R9 290 and the 290 would auto clock itself up to 1018/1250 to match the 295x2 speeds and it was great


My XFX 290x doesn't do this...


----------



## Orivaa

Heyo. I recently decided to play some "older" games with my 295x2, and I booted up Far Cry 3, 'cause I love me some Vaas.
At first, I had a bug in which I was stuck at, like, 7 FPS. Found a fix where you just use the Bioshock profile for the game instead of the normal one.
However, I wasn't able to play the game at Ultra. Every option that went up to Ultra (Post FX, Shadows, and Geometry) had to be turned down to 1 below (Post FX even had to go down to "high," instead of "very high") before it stayed consistently on 60.
This doesn't match what I've seen in benchmarks, or even in a



 (Note: The YouTube guy is using MSAA x8 and a higher resolution, which is probably why it isn't stable at 60.)

Do you guys think this is because of the Bioshock profile, and if so, anyone know of a fix for the FPS bug on the normal profile? Or hell, does anyone else even _have_ that bug?


----------



## m00ter

My Moddiy cable turned up today so I just took the card apart and got the vrm fan rigged up to my fan controller!

I also have a temp sensor probe tucked under the backplate. Before the mod it was reading 57C. I'll turn the fans up a bit, say 2850 rpm, and have a game now before reporting back


----------



## m00ter

Interesting results!

The external probe shows backplate temps down ~8C, but only if I turn the fan up to 3250 rpm. But HWiNFO is showing overall core temps down to 53C, and that's with the 2 x Jetflo's @ 1800rpm.

I'm going to stick my earphones in (!) and turn them both up to full and see what temps are like then. Regardless, the probe stuck under the backplate doesn't really tell me what the vrm temps are like exactly (and no fancy laser heat gun for me), but they must be loving the extra airflow if the backplate and overall temps are down by a decent margin?

edit - So backplate doesn't really get much colder (turns out the vrm fan maxes at 3300 so I guess it was there already in the last test), but the overall core temps are down to 51C max each with the Jetflo's turned up - not bad for a 1109 + 1350 OC!

Two fingers up to whoever at AMD decided I couldn't control the VRM fan


----------



## m00ter

Quote:


> Originally Posted by *malik22*
> 
> hello guys I just bought a used sapphire 295x2 without accesories and i have a few questions.
> 
> 1.What type of mounting screws does the rad use?
> 
> 2.Can I use a PHOBYA Adapter 4Pin PWM to 3Pin to connect a corsair sp120 pwm (4pin) to the rad and the gpu itself?
> 
> 3.and I have to buy a Mini DisplayPort - DisplayPort Adapter to connect a displayport cable to a 4k screen correct?


1. 25mm (?) M3 screws

2. Probably, not tried it though so give it a go and see, or wait for someone who has tried it to come along! If you're only connecting one fan I'm not sure I see the problem.

3. I think so, yes. HDMI 1.4 is limited to 30Hz at 4K, and DVI depends on a single- or dual-link cable, but even then DisplayPort is the better option I think.
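For anyone wondering where the 30 Hz cap comes from, the link-bandwidth arithmetic is easy to sanity-check. A rough sketch (uncompressed 24-bit colour, blanking overhead ignored, so real requirements are a bit higher than shown):

```python
# Why HDMI 1.4 tops out at 30 Hz at 4K while DisplayPort 1.2 does 60 Hz.
# Payload capacities are the usable data rates after line coding.

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel-data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_60 = video_gbps(3840, 2160, 60)   # ~11.9 Gbit/s
uhd_30 = video_gbps(3840, 2160, 30)   # ~6.0 Gbit/s

HDMI_1_4 = 8.16    # Gbit/s usable (10.2 Gbit/s TMDS, 8b/10b coding)
DP_1_2   = 17.28   # Gbit/s usable (HBR2, 4 lanes)

print(f"4K60 needs ~{uhd_60:.1f} Gbit/s -> fits HDMI 1.4? {uhd_60 <= HDMI_1_4}")
print(f"4K30 needs ~{uhd_30:.1f} Gbit/s -> fits HDMI 1.4? {uhd_30 <= HDMI_1_4}")
print(f"4K60 fits DP 1.2? {uhd_60 <= DP_1_2}")
```

So 4K60 overshoots HDMI 1.4 by a wide margin but sits comfortably inside DP 1.2, which is why mini DP to DP is the way to go on this card.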


----------



## joeh4384

Quote:


> Originally Posted by *Orivaa*
> 
> Heyo. I recently decided to play some "older" games with my 295x2, and I booted up Far Cry 3, 'cause I love me some Vaas.
> At first, I had a bug in which I was stuck at, like, 7 FPS. Found a fix where you just use the Bioshock profile for the game instead of the normal one.
> However, I wasn't able to play the game at Ultra. Every option that went up to Ultra (Post FX, Shadows, and Geometry) had to be turned down to 1 below (Post FX even had to go down to "high," instead of "very high") before it stayed consistently on 60.
> This doesn't match what I've seen on benchmarks , or even in a
> 
> 
> 
> (Note: The YouTube guy is using MSAA x8 and a higher resolution, which is probably why it isn't stable at 60.)
> 
> Do you guys think this is because of the Bioshock profile, and if so, anyone know of a fix for the FPS bug on the normal profile? Or hell, does anyone else even _have_ that bug?


I get roughly 60fps with 8x MSAA and everything maxed at 1440p. I didn't change the profile for Far Cry 3.


----------



## veaseomat

Yo, I got a 295x2 with two R9 290s for quadfire. Is the 295x2 legit 8GB, or is it still just 4GB? Would I be better off with an 8GB R9 290X in my #1 slot for 4K res, or will it not make a difference with my current setup? If there is ANY performance increase I'll get a new 8GB card for the main vram slot.


----------



## wermad

^^^^^It's two 290X cores with 4GB of effective vram.

Has anyone run 8x with a 295x2? The pcper review says there's a bandwidth limitation at 4K, and they recommend 16x.
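For a rough sense of the headroom being argued about, the per-direction slot bandwidth works out directly from the spec line rates and encodings (a back-of-the-envelope sketch; real-world throughput is lower due to protocol overhead):

```python
# Per-direction PCIe bandwidth in GB/s for the slot configs discussed.
# Line rates (GT/s) and encoding efficiencies per the PCIe 2.0/3.0 specs.

def pcie_gbps(gen, lanes):
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # (GT/s, coding efficiency)
    gt, eff = rates[gen]
    return gt * eff * lanes / 8  # bits -> bytes

for gen, lanes in [(2, 16), (3, 8), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_gbps(gen, lanes):.1f} GB/s each way")
```

Note that PCIe 3.0 x8 lands almost exactly where 2.0 x16 does (~8 GB/s), which is why 8x 3.0 is usually considered plenty for a single card; the 4K question is whether the cards actually flood that.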


----------



## veaseomat

So you're saying that there would be a performance boost with an 8GB card in the primary slot?


----------



## Mega Man

Quote:


> Originally Posted by *malik22*
> 
> hello guys I just bought a used sapphire 295x2 without accesories and i have a few questions.
> 
> 1.What type of mounting screws does the rad use?
> 
> 2.Can I use a PHOBYA Adapter 4Pin PWM to 3Pin to connect a corsair sp120 pwm (4pin) to the rad and the gpu itself?
> 
> 3.and I have to buy a Mini DisplayPort - DisplayPort Adapter to connect a displayport cable to a 4k screen correct?


1. Either normal fan screws or M3 (iirc) screws, either 3-5mm (direct to rad) or 25mm (for attaching through a fan)

2. You can do a lot with the fan

3. No, not necessarily

You can just get a mini DP to DP cable (no adapter),
or (for 30 fps)
DVI (dual-link)
Quote:


> Originally Posted by *veaseomat*
> 
> Yo I got a 295x2 with 2 r9 290 for a quad fire. Is the 295x2 legit 8gb? Or is it still just 4gb? Would I be better off with an 8gb r9 290x in my #1 slot for 4k res or will it not make a difference with my current setup. If there is ANY performance increase I'll get a new 8gb card for the main vram slot.


Total memory is 8GB but it is 4GB per chip


----------



## wermad

Quote:


> Originally Posted by *veaseomat*
> 
> So you're saying that there would be a preformance boost with an 8gb card in the primary slot?


I believe your 8gb 290x will lower to 4gb to match the 295x2.


----------



## veaseomat

Quote:


> Originally Posted by *wermad*
> 
> I believe your 8gb 290x will lower to 4gb to match the 295x2.


Can anyone confirm this?


----------



## F4ze0ne

Quote:


> Originally Posted by *veaseomat*
> 
> Can anyone confirm this?


I can't find a specific article on this, but I've read in crossfire/sli articles that they need to match each others framebuffer to work together.

Afterburner will show the effective VRAM being used in crossfire if you run the riva tuner OSD while gaming.


----------



## veaseomat

Quote:


> Originally Posted by *F4ze0ne*
> 
> I can't find a specific article on this, but I've read in crossfire/sli articles that they need to match each others framebuffer to work together.
> 
> Afterburner will show the effective VRAM being used in crossfire if you run the riva tuner OSD while gaming.


Roger, I just googled it and came to the same conclusion. It will downgrade to 4GB if it's paired with a 4GB card. Lame.


----------



## Mega Man

Correct, and it isn't lame, it's protecting you.
Card memory has to be the same for all cards in CFX or SLI,

so it will be a mirror of the rest of the cards
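The mirroring rule boils down to a min() across the group: alternate-frame rendering keeps a full copy of the working set on every GPU, so the usable framebuffer is the smallest one present, not the sum. A trivial sketch with the setups discussed in this thread (figures illustrative):

```python
# Effective VRAM in a CrossFire/SLI group: each GPU mirrors the full
# working set, so the usable framebuffer is the minimum across GPUs.

def effective_vram_gb(gpus_gb):
    """gpus_gb: per-GPU framebuffer sizes in GB."""
    return min(gpus_gb)

# 295x2 (two 4GB GPUs) plus an 8GB 290X in trifire:
print(effective_vram_gb([4, 4, 8]))  # 4 -> the 8GB card is capped to 4GB

# Two 8GB cards would keep the full 8GB framebuffer:
print(effective_vram_gb([8, 8]))     # 8
```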


----------



## malik22

Quote:


> Originally Posted by *Mega Man*
> 
> 1 either normal fan screws or m3 ( iirc ) screws either3-5mm ( direct to rad ) or 25mm ( for attaching through a fan )
> 
> 2 you can do alot with the fan
> 
> 3 no not nessisarrily
> 
> you can just get a mini dp to dp cable ( no adapter )
> or ( for 30 fps )
> DVI ( DL ))
> total memory is 8gb but it is 4 gb each chip


Thanks for the reply, Mega Man. So I can use the Corsair SP with the adapter mentioned?
And is this cable good enough quality for 4K?
https://www.digitec.ch/fr/s1/product/roline-mini-displayport-displayport-kabel-2m-midrange-noir-cable-video-2584203


----------



## veaseomat

Quote:


> Originally Posted by *Mega Man*
> 
> correct, and it isnt lame it is protecting you
> cards mem has to be the same for all cards in CFX or SLI
> 
> so it will be a mirror of the rest of the cards


My dual 290 and 295x2 quad setup shows up as x2 next to both cards. And in MSI Afterburner I have to OC both pairs separately... Weird, but they work.


----------



## wermad

Clocks run independently now, so the old ways of clock synchro are long gone.

Btw, anyone have info on running at 8x 3.0? I was told a year ago that it would run, but the pcper quad review says they were advised by amd (not 100% on this tbh) to run at 16x and avoid plx boards. I'm heading over to amd's site to get the recommended requirements. I'm toying with the idea of an mATX build w/ quads and a few x79 mATX boards are out there (x99 will need a 5930k for 40 lanes). I was thinking a Sniper M5 to run stock coolers as it gives you a slot in between. Also, it comes with the Core3D audio (love this and it's a helluva lot better than Realtek). Though it limits my mATX cases to five slots (think I found a good one). If I can't make this happen w/in my budget I'll just keep my current setup


----------



## Mega Man

Quote:


> Originally Posted by *malik22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> 1 either normal fan screws or m3 ( iirc ) screws either3-5mm ( direct to rad ) or 25mm ( for attaching through a fan )
> 
> 2 you can do alot with the fan
> 
> 3 no not nessisarrily
> 
> you can just get a mini dp to dp cable ( no adapter )
> or ( for 30 fps )
> DVI ( DL ))
> total memory is 8gb but it is 4 gb each chip
> 
> 
> 
> thanks for the reply mega man so I can use the corsair sp with the adapter mentioned?
> and is this cable good enough quality for 4k
> https://www.digitec.ch/fr/s1/product/roline-mini-displayport-displayport-kabel-2m-midrange-noir-cable-video-2584203

I don't know that brand sorry. Just don't skimp on dp cable


----------



## wermad

I purchased some OEM (HP) DP cables, 6' & 10', off eBay. One of the 6' went bad but the rest worked properly. Think they were ~$5-6 a piece. I ran three 6' w/ an MST hub and two 10' directly to the gpu. I used the supplied DP to mini DP adapters that came w/ my old Lightnings.


----------



## veaseomat

Okay, slightly off topic, but what is better for 4K res: 4x 4GB cards or 2x 8GB cards? Is the GPU power of four enough to overcome the ram shortcoming at 4K? I saw a lot of people saying 4GB is enough, based off of a Linus Tech Tips video I have yet to find.


----------



## F4ze0ne

Quote:


> Originally Posted by *wermad*
> 
> Btw, any one have info on running at 8x 3.0? I was told a year ago that it will run but the pcper quad review says they were advised by amd (not 100% on this tbh) to run at 16x and avoid plx boards. Im heading over to amd's site to get the recommended requirements. Im toying with the idea of an matx build w/ quads and few x79 matx boards are out there (x99 will need a 5930k for 40 lanes). I was thinking a sniper m5 to run stock coolers as it gives you a slot in between. Also, it comes with the core3d audio (love this and helluva better then realtek). Though it limits my matx cases to five slots (think I found a good one). If I cant make this happen w/in my budget ill just keep me current setup


[H] got great trifire performance out of the 3770K w/ an Asus Maximus Extreme PLX motherboard at 4K.


----------



## wermad

Quote:


> Originally Posted by *F4ze0ne*
> 
> [H] got great trifire performance out of the 3770K w/ an Asus Maximus Extreme PLX motherboard at 4K.


That's weird that they've mentioned this (pcper and amd). I think it's mainly to reduce the latency of three plx chipsets vs two or one. A tricky one tbh....gonna mull over it. 8x 3.0 is pretty fat for a 295x2, as I've come to learn a while back.

Did some more research:


Spoiler: Warning: Spoiler!



Quote:


> Why this move? Especially with standards like PCI-E Gen 3.0 there's plenty of bandwidth there, but even at Gen 2.0, it really should not be an issue. For the R290/290X/295x2 cards setup in Crossfire, PCIE Gen 3.0 is recommended. We'll be performing some bus flood tests over Gen 3.0 in a later stage. What if you do not have PCIE 3.0 compatibility? Well depending on your montherboard chipset and processor, the bus will revert to Gen 2.0 which will probably not make more then a marginal difference as it is really hard to flood even two x8 Gen 2.0 ports. BTW, it is a myth that Crossfire with 290, 290X and 295 X2 cards would not work on Gen 2.0 slots.


http://www.guru3d.com/articles-pages/amd-radeon-r9-295x2-review,8.html



Thanks for the info. I'm leaning a bit on the "go for it" side for now







.


----------



## Sgt Bilko

Quote:


> Originally Posted by *F4ze0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> i was running a 295x2 + XFX DD R9 290 and the 290 would auto clock itself up to 1018/1250 to match the 295x2 speeds and it was great
> 
> 
> 
> My XFX 290x doesn't do this...

It didn't do it all the time, just the majority of it.

Even when it ran at 980/1250 alongside the 295x2's 1018/1250 it didn't slow it down or become a hindrance though


----------



## Sculptor-Skulls

Hi

I'm planning on buying an XFX R9 295x2 and I'll buy 3x PG278Qs

Will the R9 295x2 run them at 35~60 FPS in games like DAI or Shadow of Mordor?

Or should I wait for the R9 3xx?

Thanks


----------



## Orivaa

Quote:


> Originally Posted by *joeh4384*
> 
> I get roughly 60fps with 8xmsaa and every maxed at 1440p. I didn't change the profile for Far Cry 3.


I guess the results would be the same for me if the default profile actually worked for me. I really don't understand why I get this weird bug.


----------



## Lordevan83

Anyone else having problems with their 295x2 and Dragon Age Inquisition? My game doesn't run. I have the latest driver.


----------



## Mega Man

Quote:


> Originally Posted by *Sculptor-Skulls*
> 
> Hi
> 
> i'm planing on buying XFX R9 295x2 and i'll buy 3x PG278Q's
> 
> will the R9 295x2 going to run it on 35 ~ 60 FPS on games like DAI OR Shadow of Mordor?
> 
> or should i wait for the R9 3xx ?
> 
> Thanks


I can check for you when I get home


----------



## F4ze0ne

Quote:


> Originally Posted by *Lordevan83*
> 
> Anyone else having problems with their 295x2 and Dragon Age Inquisition? My game dont run. i have latest driver.


Try repairing the game.


----------



## steezebe

Quote:


> Originally Posted by *meankeys*
> 
> Have you tried installing with out the riser card/ and what pcie slot is it in?


Sry for the delay; spent a few days in Vegas and 100 posts went up...

I got a mITX case & board, so my options for what slot I use are rather limited. As is the use of a riser card, until I get my next case built up. Thing is, I see many successful builds with riser cards, so why would this be suspect?

I didn't purchase a cheap ebay one, but it's not a $100 3M Shielded riser card either....


----------



## Roxycon

Have any of you had your 295x2 stop working (driver or other software related issue) and then recover to the Windows desktop when trying to open games on an Eyefinity array, making it impossible to start games and play them at 5760x1080?

It gave me a 116 bsod code once.

My first week ever owning an amd card has sadly made me miss my previous nvidia setup and their drivers.

Software:
Windows 7u (the install was done 3-5 weeks before x-mas 2014)
CCC 14.12

Games:
Borderlands sequel
the crew
grid sequel


----------



## wermad

I've had zero major issues since I switched in '13 after struggling w/ Nvidia drivers back then (tri 780 SCs). Occasionally, new drivers may have issues, and that's true for the green team too, so a bit of troubleshooting will help:

-go back to stock clocks (cpu and gpu's)
-If you're on the latest omega, roll back to 14.9. I've heard some issues w/ omega in the 290/290X club. I've postponed the update for now even though the pesky auto update notification comes up frequently.
-Check your mb. I had a bunch of crashing and nothing but bsod 1033. Updated the bios and its been stellar.
-Also, the 295x2 has a lower thermal limit (75°C). Try running some intense gpu benchmarks and record your temps and usage (Afterburner, gpuz, etc.).
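To check that last point without staring at an overlay mid-game, a monitoring log can be scanned afterwards. A minimal sketch, assuming a CSV export from GPU-Z or Afterburner with a 'GPU Temperature [°C]' column (the exact column name varies by tool and version, so adjust it to match your export):

```python
import csv

# Scan a hardware-monitor CSV log for samples at or above the 295x2's
# ~75°C throttle point. The column name below is an assumption; check
# your tool's actual header.

THROTTLE_C = 75.0

def hot_samples(path, column="GPU Temperature [°C]"):
    """Return (samples at/above the throttle point, total valid samples)."""
    hot, total = 0, 0
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f):
            try:
                temp = float(row[column])
            except (KeyError, ValueError, TypeError):
                continue  # skip malformed rows
            total += 1
            if temp >= THROTTLE_C:
                hot += 1
    return hot, total

# Usage (hypothetical log file name):
# hot, total = hot_samples("gpu_log.csv")
# print(f"{hot}/{total} samples at or above {THROTTLE_C}°C")
```

If a meaningful fraction of samples sit at the limit during a benchmark, throttling is the likely culprit rather than drivers.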


----------



## TheReciever

Hey guys, recently gained interest in the red team and the 295x2.

How well would quadfire push 120fps in modern titles at 1080p?


----------



## Orivaa

Quote:


> Originally Posted by *TheReciever*
> 
> Hey guys, recently gained interest in the red team and the 295x2.
> 
> How well would quadfire push 120fps in modern titles at 1080p?


Depends on whether or not the game is well-optimized for CrossFire.
If it's a greatly optimized game like Tomb Raider 2013, I imagine it would get there quite easily.
If it's a badly optimized game like The Evil Within, I imagine it wouldn't even use all the GPUs.


----------



## Mega Man

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sculptor-Skulls*
> 
> Hi
> 
> i'm planing on buying XFX R9 295x2 and i'll buy 3x PG278Q's
> 
> will the R9 295x2 going to run it on 35 ~ 60 FPS on games like DAI OR Shadow of Mordor?
> 
> or should i wait for the R9 3xx ?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can check for you when I get home

@Sculptor-Skulls
sorry for the delay

used
http://www.flawlesswidescreen.org/


Spoiler: Warning: Spoiler!











hope that helps and again this is at ultra everything


----------



## SLADEizGOD

Does anyone have The XFX hardware Rep's email? My card died on me & I submitted a Ticket like 2 days ago on their site. No response. SMH. Need a little help.


----------



## F4ze0ne

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Does anyone have The XFX hardware Rep's email? My card died on me & I submitted a Ticket like 2 days ago on their site. No response. SMH. Need a little help.


Try contacting Warsam (XFX rep).


----------



## Sgt Bilko

Quote:


> Originally Posted by *F4ze0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SLADEizGOD*
> 
> Does anyone have The XFX hardware Rep's email? My card died on me & I submitted a Ticket like 2 days ago on their site. No response. SMH. Need a little help.
> 
> 
> 
> Try contacting Warsam (XFX rep).

Warsam is the AMD rep; for XFX gear it's best to go through their support site, I've always found them quite helpful









Can take them a few days to respond unfortunately but rest assured they'll help out.


----------



## F4ze0ne

I always get them mixed up for some reason.









SLADEizGOD, I think it takes about 2-3 business days to process a ticket. I'd wait a bit longer and send a follow up message if you don't hear from them. XFX is very good with warranty service on their cards.


----------



## Ov3Rk1ll

After a few weeks of research for my new rig I went ahead and purchased the Sapphire 100360SR R9 295X2 8GD5 last night. This machine will be used for 75% work (Visual Studio Ultimate, AutoCAD LT 2015, Adobe Creative Suite 2015, lots of VMware Workstation) and 25% gaming (not sure what game(s) yet).

Was this a good choice for me to go with? I couldn't really justify a workstation class card, and I really wanted to be able to play some gaming titles on it as well. After all the NVidia issues going on right now and my past excellent experience with AMD/ATI I figured this was a good choice. I also ordered 3 new monitors and have read over and over again that Eyefinity is great.

Any comments, advice, or observations would be greatly appreciated. Hope to have some pics up in the Intel build log tomorrow evening (4790K proc). All my parts shipped this evening and are scheduled for delivery tomorrow.


----------



## caste1200

Quote:


> Originally Posted by *Ov3Rk1ll*
> 
> After a few weeks of research for my new rig I went ahead and purchased the Sapphire 100360SR R9 295X2 8GD5 last night. This machine will be used for 75% work (Visual Studio Ultimate, AutoCAD LT 2015, Adobe Creative Suite 2015, lots of VMware Workstation) and 25% gaming (not sure what game(s) yet).
> 
> Was this a good choice for me to go with? I couldn't really justify a workstation class card, and I really wanted to be able to play some gaming titles on it as well. After all the NVidia issues going on right now and my past excellent experience with AMD/ATI I figured this was a good choice. I also ordered 3 new monitors and have read over and over again that Eyefinity is great.
> 
> Any comments, advice, or observations would be greatly appreciated. Hope to have some pics up in the Intel build log tomorrow evening (4790K proc). All my parts shipped this evening and are scheduled for delivery tomorrow.


FYI the card effectively has only 4GB (4GB per GPU); when using multiple GPUs the ram gets cloned onto each card, so you have only 4GB.

I'm using CS6 with it and it works great! Super fast! But your cpu will make a huge difference too..


----------



## Sculptor-Skulls

thank you so much for your help

i just ordered my r9 295x2


----------



## Sculptor-Skulls

Quote:


> Originally Posted by *Mega Man*
> 
> @Sculptor-Skulls
> sorry for the delay
> 
> used
> http://www.flawlesswidescreen.org/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hope that helps and again this is at ultra everything


thank you very much for your help

what do you think of the vram temp under load ? is it a problem ?


----------



## Mega Man

Not that I notice


----------



## ranjeetsodhi

Got a Sapphire R9 295x2 inside a SFF chassis (Elite 130).

Does anyone know how I can turn the red LED light OFF. Prefer a stealth approach - no lights









----------



## Orivaa

Quote:


> Originally Posted by *ranjeetsodhi*
> 
> Got a Sapphire R9 295x2 inside a SFF chassis (Elite 130).
> 
> Does anyone know how I can turn the red LED light OFF. Prefer a stealth approach - no lights
> 
> 
> 
> 
> 
> 
> 


You could take off the shroud and unplug the LED light cable.


----------



## axiumone

Quote:


> Originally Posted by *Orivaa*
> 
> You could take off the shroud and unplug the LED light cable.


Yep, that's probably the only option. Fortunately, it's super easy to do: there are 10 screws holding the shroud in place and no further disassembly is required.


----------



## Orivaa

Or maybe he's unlucky like me, and one of the screws got stuck, not budging no matter the screwdriver or force applied.


----------



## HoneyBadger84

Brought the QuadFire back after running single 295x2 for ... well since the day after I got both cards in.

Have to say, despite some minor issues that pop up occasionally, QuadFire does actually run Watch_Dogs very well, albeit the game still has "issues" with running all-Ultra settings even with such a beast setup, that have still not been fixed. I've resorted to running mixed high/ultra settings with some AA (keep in mind I'm at 1080p so in reality a single 295x2 SHOULD be able to run this game on Ultra with high FPS, but alas), and it runs smoothly, both while walking & driving anywhere.

And the major plusses are of course, while the power draw is higher (much less high than one would think actually), it does allow all the GPUs to run cooler as they're not working as hard, and it keeps my FPS up above 80 almost all the time. Currently I'm running with it VSync capped at 144Hz, and for the most part it's very smooth overall, no complaints considering I'm only running my CPU @ 4.2GHz so "bottlenecking" would definitely be a thing with 2 of these cards in any application one would wager.

I just got Hitman Absolution in yesterday; I've heard good things about its scaling & the game in general, looking forward to playing that after I get done with my Watch_Dogs replay. All in an effort to "stall" until GTA V finally comes out X_X

Currently I have both cards setup with their radiators mounted on top of my case, in push/pull with the push fan being Corsair SP 120MM fans (static pressure fans) & the exhaust being the stock R9 295x2 fans. Both are mounted as exhaust & are pretty much being fed directly by the side-fan flow of my case.

Temps are... well, ridiculously low, but that's at least partly because the GPUs aren't ever being fully tasked. I'd have to run FireStrike Extreme or something to get constant high GPU load I think, I'll have to run through that again see what the temps are like with the Push/Pull setup.

It ain't the prettiest setup in the world in terms of internal visuals, but the tubing actually doesn't mess with the air profile that much inside the case, since it's right near the side fans, and overall I haven't noticed much of a difference in noise.... but then again, I do game on headphones so yeah :-D

I'll run some benchmarks tomorrow, maybe crank up to 4.6GHz & see if I notice marked improvements in actual game benchmarks. Gotta sleep before work now so no time to run anything else.


----------



## TheReciever

Wouldn't 4 GPUs struggle at that resolution? It's rather small...


----------



## xarot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Brought the QuadFire back after running single 295x2 for ... well since the day after I got both cards in.
> 
> Have to say, despite some minor issues that pop up occasionally, QuadFire does actually run Watch_Dogs very well, albeit the game still has "issues" with running all-Ultra settings even with such a beast setup, that have still not been fixed. I've resorted to running mixed high/ultra settings with some AA (keep in mind I'm at 1080p so in reality a single 295x2 SHOULD be able to run this game on Ultra with high FPS, but alas), and it runs smoothly, both while walking & driving anywhere.
> 
> And the major plusses are of course, while the power draw is higher (much less high than one would think actually), it does allow all the GPUs to run cooler as they're not working as hard, and it keeps my FPS up above 80 almost all the time. Currently I'm running with it VSync capped at 144Hz, and for the most part it's very smooth overall, no complaints considering I'm only running my CPU @ 4.2GHz so "bottlenecking" would definitely be a thing with 2 of these cards in any application one would wager.
> 
> I just got Hitman Absolution in yesterday, have heard good things about it's scaling & the game in general, looking forward to playing that after I get done with Watch_Dogs replay. All in an effort to "stall" until GTA V comes out finally X_X
> 
> Currently I have both cards setup with their radiators mounted on top of my case, in push/pull with the push fan being Corsair SP 120MM fans (static pressure fans) & the exhaust being the stock R9 295x2 fans. Both are mounted as exhaust & are pretty much being fed directly by the side-fan flow of my case.
> 
> Temps are... well, ridiculously low, but that's at least partly because the GPUs aren't ever being fully tasked. I'd have to run FireStrike Extreme or something to get constant high GPU load I think, I'll have to run through that again see what the temps are like with the Push/Pull setup.
> 
> It ain't the prettiest setup in the world in terms of internal visuals, but the tubing actually doesn't mess with the air profile that much inside the case, since it's right near the side fans, and overall I haven't noticed much of a difference in noise.... but then again, I do game on headphones so yeah :-D
> 
> I'll run some benchmarks tomorrow, maybe crank up to 4.6GHz & see if I notice marked improvements in actual game benchmarks. Gotta sleep before work now so no time to run anything else.


Try Sleeping Dogs 30 minutes without Vsync if you want to see max temps...


----------



## m00ter

My cards were "running hot" at idle recently and I couldn't work it out at first, they were a good 10C over normal (for me) 28-29C.

Then I realised I had ULPS disabled and the 2nd core was running at max while the 1st was down at 300/150. Doh. Flicked the switch in MSI Afterburner and now the 2nd core is back to normal and so are my temps. 40C wasn't bad as such, but it wasn't right either, haha!


----------



## joeh4384

I have seen it happen where my 2nd card will run at 3D clocks. I can get them back to normal by rendering something in Crossfire and then closing it. I think my issue has something to do with using an active DisplayPort adapter, as it seems to have gone away since I hooked up just my DVI-D monitor first and then connected the DisplayPort one.


----------



## Ov3Rk1ll

How do I join this club? Specs


----------



## jackalopeater

Same here







quite excited to have my R9 295x2 even if I am late to the game!


----------



## Sgt Bilko

Quote:


> Originally Posted by *jackalopeater*
> 
> Same here
> 
> 
> 
> 
> 
> 
> 
> quite excited to have my R9 295x2 even if I am late to the game!


Nice looking build there mate


----------



## cliptags

Quote:


> Originally Posted by *fat4l*
> 
> Hi mate.
> I was doing my wcooling in the past few months so if u go back in the thread u will see my pics + discussion about cooling 295x2.
> 
> From my point of view, Aquacomputer gpu block is the best one for 295X2 + they have active backplate which is also the best one. I also think it looks the best
> 
> 
> 
> 
> 
> 
> 
> 
> One 360mm rad will not handle cpu+295X2. Ur water temp will be rly high. I think 2x360mm rad is minimum for "acceptable" water temps.
> Also, do not use silver kill coil with nickel plated blocks as it may corrode the nickel plating on blocks(wash it away). Rather use Mayhems X1 which contains the needed inhibitors and biocides to protect ur loop.
> I also believe that EK makes the best CPU blocks, EK Supremacy EVO or even older EK Supremacy.


Thanks for the advice dude! Got my water-cooling loop installed! No more throttling at 74 degrees! Getting 50 degrees when running Tomb Raider in Ultra 4K. Pretty awesome! Got 2x360mm rads (60mm thick), one externally mounted with 3x Noctua 120mm fans and one in the top cooled by 3x Corsair SP120s, all in pull config. I went for the Aquacomputer block for the 295x2 as well








My 290x isn't working now though (included it in the loop) which is a bit rubbish! Not had a chance to test it on another motherboard to see if it's the card or the PCI slot.

Loving the temps and it was a blast to build! Going to run some temp tests soon!

Did you do any overclocking? If so, what did you get? This is what it looks like:


----------



## steezebe

Quote:


> Originally Posted by *cliptags*
> 
> Thanks for the advice dude! Got my water-cooling loop installed! No more throttling at 74 degrees! Getting 50 degrees when running Tomb Raider in ultra 4K. Pretty awesome! Got 2x360mm (60mm thickness rads), one externally mounted with 3x Noctura 120mm fans and one rad in the top cooled by 3xCorsair 120sp all in pull config. I went for the Aquacomputer block for the 295x2 as well
> 
> 
> 
> 
> 
> 
> 
> 
> My 290x isn't working now though (included it in the loop) which is a bit rubbish! Not had a chance to test it on another motherboard to see if it's the card or the PCI slot.
> 
> Loving the temps and it was a blast to build! Going to run some temp tests soon!
> 
> Did you do any overlocking? If so what did you get doing it? This is what it looks like:


Do you use the Aquacomputer backplate as well? I'm trying to decide between the EK, XSPC and Aquacomputer blocks, but I'm on the fence as none really address the VRM heat very discretely, with Aquacomputer being the half-exception.


----------



## cliptags

Quote:


> Originally Posted by *steezebe*
> 
> Do you use the Aquacomputer backplate as well? I'm trying to decide between the EK, XSPC and Aquacomputer blocks, but I'm on the fence as none really address the VRM heat very discretely, with Aquacomputer being the half-exception.


Yeah, I went for Aquacomputer on the recommendation here. I'm liking it so far. I haven't used the other blocks so can't really compare, but with the VRMs on the AC you don't even need to use thermal pads: they've managed to get the block ridiculously close to the VRMs, so just like with the GPUs, cooling seems to be pretty good!


----------



## fat4l

Quote:


> Originally Posted by *cliptags*
> 
> Thanks for the advice dude! Got my water-cooling loop installed! No more throttling at 74 degrees! Getting 50 degrees when running Tomb Raider in ultra 4K. Pretty awesome! Got 2x360mm (60mm thickness rads), one externally mounted with 3x Noctura 120mm fans and one rad in the top cooled by 3xCorsair 120sp all in pull config. I went for the Aquacomputer block for the 295x2 as well
> 
> 
> 
> 
> 
> 
> 
> 
> My 290x isn't working now though (included it in the loop) which is a bit rubbish! Not had a chance to test it on another motherboard to see if it's the card or the PCI slot.
> 
> Loving the temps and it was a blast to build! Going to run some temp tests soon!
> 
> Did you do any overlocking? If so what did you get doing it? This is what it looks like:


cool stuff mate








No, I still haven't had time to do any OCing... busy days, you know


----------



## joeh4384

Quote:


> Originally Posted by *xarot*
> 
> Try Sleeping Dogs 30 minutes without Vsync if you want to see max temps...


I finally set up AFR-friendly mode for Sleeping Dogs Definitive Edition and you are correct, it had my card scorching hot. I was maxing out with the GPUs at 71/73C. It was also the first time I noticed my PC pulling more than 800 watts from the wall.


----------



## wanderingwaldo

How do I change a .txt to a .rom? I tried renaming it but it didn't work.


----------



## steezebe

Quote:


> Originally Posted by *wanderingwaldo*
> 
> how do i change a .txt to a .rom i tried renameing it but it didnt work


On a Windows 8 machine, open File Explorer, go to the View tab, check "File name extensions" in the Show/hide section, then rename the file, deleting 'txt' and replacing it with 'rom'.

hope that helps
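For anyone the Explorer route fails for: Windows hides known extensions by default, so renaming in Explorer often just produces firmware.rom.txt instead of actually changing the extension. A two-line script sidesteps that entirely; the filenames below are just placeholders for the downloaded BIOS dump:

```python
import os

# Placeholder stand-in for the BIOS file that was saved as .txt
with open("firmware.txt", "w") as f:
    f.write("dummy BIOS data")

# Renaming in code changes the real extension,
# not just the visible part of the name
os.rename("firmware.txt", "firmware.rom")
```

Same idea works from a command prompt with `ren firmware.txt firmware.rom`, once extensions are visible.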


----------



## steezebe

Just got my RMA replacement in: booted it up, and the center fan doesn't run! Help!?


----------



## Mega Man

Are you on cfx ( 2 cards)

If so, and it is your bottom card, ULPS (on phone, but I think I spelled it right) will shut off that fan


----------



## steezebe

Nah, I've got one card. The fan on the rad is running and the LEDs work, but the red fan just sits there.


----------



## Mega Man

As you just got it back, I would only be able to suggest removing the shroud (should not violate the warranty, but do it at your own risk) and seeing if they plugged in the fan


----------



## steezebe

Yeah the fan is dead; i sent a PM to the xfx rep to see if only the fan can be replaced. Such is my luck with these things... In the meantime I'll put another fan over it and not push the card.


----------



## Noyjitat

Has anyone connected the VRAM fan and the pump to Corsair Link? I don't know any of the technical data about the pump, like what RPM it should be set to, as I'm assuming either Catalyst Control Center or the card itself controls the pump based off load or heat. But I'd really like to add this to Corsair Link so I could manually control everything with the Corsair software.


----------



## Mega Man

It is just a typical Asetek unit, TRASH

Fairly disappointed in AMD for using the patent troll's junk (the pumps), just like the H100 (if I am not mistaken they use that trash too)


----------



## kayan

What is the best way for me to go about replacing the fans on my 295x2? I'd like to go push pull, but what screws do I need? I'm planning on using two nzxt 120mm white fans to match my build theme.


----------



## Ov3Rk1ll

Not sure about the screws, but I'm more interested to know if you plan to have the card power the fans and, if not, what you chose in that regard for speed control, etc. I have Noctuas to put on my rads (2x 295x2's), but they are 4-pin PWMs. I have some adapters waiting at home to try out; no idea if it's going to work the way I want it to. Let me know how that turns out for you.


----------



## ReV2ReD

Hello all,

I recently picked up a XFX 295x2 to replace my GTX 680s SLI. So far, the card is running quite well in my aging 2500k system with the exception of a select few games that just refuse to play nice with Crossfire.

Anyway, I booted up the Battlefield: Hardline (don't judge) "beta" yesterday along with MSI Afterburner and noticed that Hardline rarely uses more than 60% of either core. As a result the game sometimes drops to around 37 fps in 6048 x 1080 on the Ultra setting. This certainly never happens in BF4 where it rarely ever dips below 80 fps maxed out with each core being 100% utilized.

Is anyone else with a 295x2 seeing this? Is the game just not optimized for Crossfire yet?

System Specs are (sorry I can't create a rig list because @ work):
i5 2500k (OC'd to 4.5)
ASRock Z77 Extreme 4
XFX 295x2
Kingwin Mach 1 1220W PSU
12GB Mushkin DDR3 1600
Intel 730 Series SSDs (RAID 0)


----------



## kayan

Quote:


> Originally Posted by *ReV2ReD*
> 
> Hello all,
> 
> I recently picked up a XFX 295x2 to replace my GTX 680s SLI. So far, the card is running quite well in my aging 2500k system with the exception of a select few games that just refuse to play nice with Crossfire.
> 
> Anyway, I booted up the Battlefield: Hardline (don't judge) "beta" yesterday along with MSI Afterburner and noticed that Hardline rarely uses more than 60% of either core. As a result the game sometimes drops to around 37 fps in 6048 x 1080 on the Ultra setting. This certainly never happens in BF4 where it rarely ever dips below 80 fps maxed out with each core being 100% utilized.
> 
> Is anyone else with a 295x2 seeing this? Is the game just not optimized for Crossfire yet?
> 
> System Specs are (sorry I can't create a rig list because @ work):
> i5 2500k (OC'd to 4.5)
> ASRock Z77 Extreme 4
> XFX 295x2
> Kingwin Mach 1 1220W PSU
> 12GB Mushkin DDR3 1600
> Intel 730 Seris SSDs (RAID 0)


I played last night for about 30 mins on Dustbowl. I did not notice any slowdown, but I'm only on 1440p. I'm not sure what exactly the fps was, as FRAPS didn't want to run in-game, but I'll give it a shot later.


----------



## ReV2ReD

Quote:


> Originally Posted by *kayan*
> 
> I played last night for about 30 mins on Dustbowl. I did not notice any slowdown, but I'm only on 1440p. I'm not sure what exactly the fps was, as FRAPS didn't want to run in-game, but I'll give it a shot later.


Much appreciated!

Just an FYI for the future (or anytime you may want): Battlefield 4 and Hardline (and maybe BF3) have a console command that displays fps just like FRAPS does. The command is "perfoverlay.drawfps 1" (minus the quotes). This can be entered in the console which is accessed by pressing the "~" key. I use the command all the time in BF4 when I'm testing fps while using Mantle.

(Sorry if this was already known. No intention to offend)
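On the same note, Frostbite games will also read console commands from a plain-text user.cfg in the game's install folder at launch, so the overlay can be made permanent instead of retyped every session. A minimal sketch (the drawgraph line adds a frame-time graph; I believe it's supported but verify on your build):

```
perfoverlay.drawfps 1
perfoverlay.drawgraph 1
```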


----------



## Mega Man

Quote:


> Originally Posted by *kayan*
> 
> What is the best way for me to go about replacing the fans on my 295x2? I'd like to go push pull, but what screws do I need? I'm planning on using two nzxt 120mm white fans to match my build theme.


Depends

25mm or 30mm M3 screws should work: 25mm is just for the fan, and if you have a super thick case you may need 30mm. Even with my Case Labs, though, I can use 25mm.
Quote:


> Originally Posted by *ReV2ReD*
> 
> Hello all,
> 
> I recently picked up a XFX 295x2 to replace my GTX 680s SLI. So far, the card is running quite well in my aging 2500k system with the exception of a select few games that just refuse to play nice with Crossfire.
> 
> Anyway, I booted up the Battlefield: Hardline (don't judge) "beta" yesterday along with MSI Afterburner and noticed that Hardline rarely uses more than 60% of either core. As a result the game sometimes drops to around 37 fps in 6048 x 1080 on the Ultra setting. This certainly never happens in BF4 where it rarely ever dips below 80 fps maxed out with each core being 100% utilized.
> 
> Is anyone else with a 295x2 seeing this? Is the game just not optimized for Crossfire yet?
> 
> System Specs are (sorry I can't create a rig list because @ work):
> i5 2500k (OC'd to 4.5)
> ASRock Z77 Extreme 4
> XFX 295x2
> Kingwin Mach 1 1220W PSU
> 12GB Mushkin DDR3 1600
> Intel 730 Seris SSDs (RAID 0)


It's a beta; I wouldn't expect it to be optimized for anything


----------



## caste1200

So can I join the club?
Sapphire R9 295x2


----------



## joeh4384

Quote:


> Originally Posted by *ReV2ReD*
> 
> Much appreciated!
> 
> Just an FYI for the future (or anytime you may want): Battlefield 4 and Hardline (and maybe BF3) have a console command that displays fps just like FRAPS does. The command is "perfoverlay.drawfps 1" (minus the quotes). This can be entered in the console which is accessed by pressing the "~" key. I use the command all the time in BF4 when I'm testing fps while using Mantle.
> 
> (Sorry if this was already known. No intention to offend)


Thanks, I didn't know that. I was using BF4 as a temp test of sorts. It would be nice to see FPS.


----------



## SLADEizGOD

Quote:


> Originally Posted by *F4ze0ne*
> 
> I always get them mixed up for some reason.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SLADEizGOD, I think it takes about 2-3 business days to process a ticket. I'd wait a bit longer and send a follow up message if you don't hear from them. XFX is very good with warranty service on their cards.


Thanks, I ended up getting someone. Hopefully I can get a quick turnaround on my card.


----------



## digitalwanderer

Hey, do you let anyone join this club who has an R9 295x2 card?


----------



## xer0h0ur

Quote:


> Originally Posted by *digitalwanderer*
> 
> Hey, do you let anyone join this club who has an R9 295x2 card?


Just PM the OP. I believe he doesn't keep track of this thread anymore.


----------



## ReV2ReD

Does anyone know which connections to use on a 2x 295x2 (Quadfire) setup for three screen (3x 1080p - 6048 x 1080) Eyefinity?

I know that with my GTX 680 SLI setup, there were a number of specific ways that required certain monitors to plug into certain connectors on each of the two cards. I honestly have no idea if this is the case for a 295x2 Quadfire setup.

I apologize if this has been answered before. I ran a quick search both through Google and here on Overclock, but couldn't find anything specific.


----------



## joeh4384

I think you can just connect all the monitors to one card.


----------



## caste1200

I have 3 screens with one card: one is DP, the rest are DP passive adapters to DVI



All you need is one DP device or a DP active adapter to DVI or HDMI


----------



## ReV2ReD

Quote:


> Originally Posted by *caste1200*
> 
> I have 3 screens with one card, one is DP, the rest are dp passive adapter to DVI
> 
> 
> 
> all you need is a dp device or a dp active adapter to dvi or hdmi


Sweet! Thanks to both you and joeh for the help!


----------



## Ov3Rk1ll

Quote:


> Originally Posted by *ReV2ReD*
> 
> Does anyone know which connections to use on a 2x 295x2 (Quadfire) setup for three screen (3x 1080p - 6048 x 1080) Eyefinity?
> 
> I know that with my GTX 680 SLI setup, there were a number of specific ways that required certain monitors to plug into certain connectors on each of the two cards. I honestly have no idea if this is the case for a 295x2 Quadfire setup.
> 
> I apologize if this has been answered before. I ran a quick search both through Google and here on Overclock, but couldn't find anything specific.


Thanks for asking this ReV2ReD







I believe the 2nd card will sleep until it's activated by going into a 3D app (game), so you would want to connect to the active card. But that's just what I've read; I'm out of the country until the 17th (Yay!) and my 2nd R9 295x2 is still in its box. (You don't realize how important it is to be with your family until you're 2000 miles away for weeks at a time. Thank you to everyone who serves their country on extended duty doing whatever tasks they're assigned; I couldn't imagine being away from your loved ones for 9 months to a year+.)


----------



## Mirdain

Quote:


> Originally Posted by *caste1200*
> 
> I have 3 screens with one card, one is DP, the rest are dp passive adapter to DVI
> 
> 
> 
> all you need is a dp device or a dp active adapter to dvi or hdmi


Prior to the Omega drivers you could use 3 passive adaptors (I used Mini-DP to HDMI cables) in eyefinity on the 295x2.

I stayed away from the Omega drivers since eyefinity did not work (only supports two outputs for timings - probably when they redid the driver sets).

This is why you need what Caste1200 mentioned.

Now that I am adding my 4th monitor I have to get two active adaptors to use with the two passive cables I currently have.

This also allows me to update my drivers


----------



## Mega Man

Quote:


> Originally Posted by *ReV2ReD*
> 
> Does anyone know which connections to use on a 2x 295x2 (Quadfire) setup for three screen (3x 1080p - 6048 x 1080) Eyefinity?
> 
> I know that with my GTX 680 SLI setup, there were a number of specific ways that required certain monitors to plug into certain connectors on each of the two cards. I honestly have no idea if this is the case for a 295x2 Quadfire setup.
> 
> I apologize if this has been answered before. I ran a quick search both through Google and here on Overclock, but couldn't find anything specific.


any way you can ( serious )
Quote:


> Originally Posted by *Ov3Rk1ll*
> 
> thank you to everyone who serves their country on extended duty doing whatever tasks they're assigned, I couldn't imagine being away for 9 months to a year+ from your loved ones).


+12
Quote:


> Originally Posted by *Mirdain*
> 
> Quote:
> 
> 
> 
> Originally Posted by *caste1200*
> 
> I have 3 screens with one card, one is DP, the rest are dp passive adapter to DVI
> 
> 
> 
> all you need is a dp device or a dp active adapter to dvi or hdmi
> 
> 
> 
> Prior to the Omega drivers you could use 3 passive adaptors (I used Mini-DP to HDMI cables) in eyefinity on the 295x2.
> 
> I stayed away from the Omega drivers since eyefinity did not work (only supports two outputs for timings - probably when they redid the driver sets).
> 
> This is why you need what Caste1200 mentioned.
> 
> Now that I am adding my 4th monitor I have to get two active adaptors to use with the two passive cables I currently have.
> 
> This also allows me to update my drivers

I never needed anything, but I use DP, so it's possible this is accurate? Either way, the big jump from the 79xx to the 2xx series was that the 79xx could only run 2 passive connections before needing an active adapter

the 2xx can do 3


----------



## ReV2ReD

Quote:


> Originally Posted by *Mirdain*
> 
> Prior to the Omega drivers you could use 3 passive adaptors (I used Mini-DP to HDMI cables) in eyefinity on the 295x2.
> 
> I stayed away from the Omega drivers since eyefinity did not work (only supports two outputs for timings - probably when they redid the driver sets).
> 
> This is why you need what Caste1200 mentioned.
> 
> Now that I am adding my 4th monitor I have to get two active adaptors to use with the two passive cables I currently have.
> 
> This also allows me to update my drivers


Yeah, I found out about the active adapters the hard way when I was hooking up my monitors to my first 295x2 for Eyefinity. I had 3 mini DP to HDMI passive adapters going from the card to the monitors, but the third monitor would never work until I finally discovered the need for at least one active adapter in addition to the passive ones. As of now, I have the monitors connected with one DVI and two passive mini DP to HDMI connectors. Hopefully the second 295x2 will be okay with my connection method when it arrives tomorrow, as I keep forgetting to pick up an active mini DP to HDMI adapter.


----------



## caste1200

Quote:


> Originally Posted by *Mega Man*
> 
> any way you can ( serious )
> +12
> i never needed anything but i use DP so it is possible this is accurate ? either way the big jump 79xx to 2xx was the 79xx could only run 2 passive things before needing a active adapter
> 
> the 2xx can do 3


Yep, actually you only require DP: you can use 3 DP connections and Eyefinity will work. You need at least one!


----------



## Mega Man

But still, that was one of the upgrades of the 2xx series...


----------



## Mirdain

*was*

They broke it when they went with a unified driver approach, since I think it was software emulated for the 2xx series.

- I guess you can't really say "broke" since they always stated the two signal restriction with eyefinity.

I spent some time trying to get the omega drivers to work and the whole time I didn't realize it was the lack of active adaptors I was using.

I could easily get 2 monitors to work fine but when the 3rd was plugged in CCC would lock up and on a reboot would not display anything on any monitor until you unplugged the 3rd.

I had to roll back my drivers to get it to work again.

Not until I needed my 4th did this all start to make sense.......


----------



## m00ter

I upped my mem OC to 1500MHz today (1100MHz on the core) and after 2hrs of GRID Autosport on Ultra (6100x1080) she topped out at 59/60C. +10 on the voltage, and +50% on the power limit.

The game was noticeably smoother with the increase, with not a stutter or a tear! I guess I should see what it can really do, but I suspect my arbitrary slider-moving is probably pretty close to the limit?

I'd be interested to learn what OCs others have been able to achieve...


----------



## Sgt Bilko

Quote:


> Originally Posted by *m00ter*
> 
> I upped my mem OC to 1500Mhz today (1100Mhz on the core) and after 2hrs of GRID Autosport on Ultra (6100x1080) she topped out at 59/60C. +10 on the voltage, and +50% on the power limit.
> 
> The game was noticeably smoother with the increase, with not a stutter or a tear! I guess I should see what it can really do, but I suspect my arbitrary slider moving is probably pretty close to the limit?
> 
> I'd be interested to learn what OC's have others been able to achieve....


I get about the same for overclocks: +9mV for 1100/1500, but I need +100mV for 1150/1500.


----------



## MR KROGOTH

How is the build quality on most reference 295x2 cards?

Considering moving to another dual-GPU card platform, just getting some ideas.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MR KROGOTH*
> 
> How is the build quality on most reference 295x2 cards?
> 
> Considering moving to another dual-GPU card platform, just getting some ideas.


It's an extremely solid card if I'm honest; everything is well put together and I'm quite impressed with mine.


----------



## MR KROGOTH

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's an extremely solid card if i'm honest, everything is well put together and i'm quite impressed with mine.


Fan noise? How about heat spread across both cores, are they about even?
My current cards are wonderful to me. I love that I can typically keep below 70*c in most games with an aggressive fan profile.

How does CrossFireX scale these days? SLI in this rig, when I don't have VSync on, will put me at around 97-99% usage evenly across all cards.

Also, driver issues?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MR KROGOTH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's an extremely solid card if i'm honest, everything is well put together and i'm quite impressed with mine.
> 
> 
> 
> Fan noise? How about heat spread across both cores, are they about even?
> My current cards are wonderful to me. I love that I can typically keep below 70*c in most games with an aggressive fan profile.
> 
> How does CrossFireX scale these days? SLI in this rig, when I dont have VSync on will put me around 97-99% usage evenly across all cards.
> 
> Also, driver issues?

I replaced the stock rad fan with a single Noctua Industrial, and with a 1100/1500 overclock it keeps the card at a max of 68C; the max variance I've seen between the cores is about 4C.

I can't compare Crossfire to SLI as i've not used SLI before but i can say that i've had little to no issues with games scaling and performing well.

As for drivers, there have been a couple of ups and downs since Hawaii launched, but they have been working just fine since I've had the 295x2
(I had a 290X and 2x 290s before that)


----------



## MR KROGOTH

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I replaced the stock rad fan with a single Noctua Industrial and with a 1100/1500 overclock keeps it at a max 68c, max variance i've seen between the cores is about 4c.
> 
> I can't compare Crossfire to SLI as i've not used SLI before but i can say that i've had little to no issues with games scaling and performing well.
> 
> As for drivers there have a been a couple of ups and downs since Hawaii launched but they have been working just fine since i've had the 295x2
> (Had a 290x and 2 x 290's before that)


Can you not monitor usage per card in game?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MR KROGOTH*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I replaced the stock rad fan with a single Noctua Industrial and with a 1100/1500 overclock keeps it at a max 68c, max variance i've seen between the cores is about 4c.
> 
> I can't compare Crossfire to SLI as i've not used SLI before but i can say that i've had little to no issues with games scaling and performing well.
> 
> As for drivers there have a been a couple of ups and downs since Hawaii launched but they have been working just fine since i've had the 295x2
> (Had a 290x and 2 x 290's before that)
> 
> 
> 
> Can you not monitor usage per card in game?
Click to expand...

Yep, I usually have Afterburner running on my keyboard's LCD for core usage, temps, VRAM usage and the like.


----------



## kayan

Quote:


> Originally Posted by *Ov3Rk1ll*
> 
> Not sure about the screws, but more interested to know if you plan to have the card power the fans and if not, what you choose in that regard for speed control, etc. I have Noctua's to put on my rads (2x 295x2's) but they are 4 pin PWMs, I have some adapters waiting when I get home to try out, no idea if it's going to work they way I want it to. Let me know how that turns out for you.


I do not plan on using the card to power the fan(s). My case has a built-in fan controller that sometimes works....







If not, I have an NZXT Sentry 3 around that I could hook them up to. Or heck, even just plug them into my mobo.
Quote:


> Originally Posted by *m00ter*
> 
> I upped my mem OC to 1500Mhz today (1100Mhz on the core) and after 2hrs of GRID Autosport on Ultra (6100x1080) she topped out at 59/60C. +10 on the voltage, and +50% on the power limit.
> 
> The game was noticeably smoother with the increase, with not a stutter or a tear! I guess I should see what it can really do, but I suspect my arbitrary slider moving is probably pretty close to the limit?
> 
> I'd be interested to learn what OC's have others been able to achieve....


I too would be interested in everyone's overclocks, and also what program (CCC, AB, or Trixx) you used to achieve them.


----------



## m00ter

Well, it just managed a couple of hours of BF4 @ 1100/1500 and topped out at 63°C without a stutter; I might see what it's actually capable of now.


----------



## Roxycon

Here's my rig







Like the look of the GPU, but it'll look better with a waterblock.


----------



## Ov3Rk1ll

Quote:


> Originally Posted by *ReV2ReD*
> 
> Does anyone know which connections to use on a 2x 295x2 (Quadfire) setup for three screen (3x 1080p - 6048 x 1080) Eyefinity?
> 
> I know that with my GTX 680 SLI setup, there were a number of specific ways that required certain monitors to plug into certain connectors on each of the two cards. I honestly have no idea if this is the case for a 295x2 Quadfire setup.
> 
> I apologize if this has been answered before. I ran a quick search both through Google and here on Overclock, but couldn't find anything specific.


Thanks for asking this ReV2ReD
Quote:


> Originally Posted by *Roxycon*
> 
> Here's my rig
> 
> 
> 
> 
> 
> 
> 
> like the look of the gpu but it will look better with a waterblock




lol guys haha - we're going to need to start an EVOLV R9 295x2 club...


----------



## ReV2ReD

So, UPS had my second 295x2 waiting on me when I got home from work on Friday, but they didn't have my new EVGA 1600W power supply to feed these monsters in Crossfire (Quadfire). It's been wonderful just staring longingly at my second card's unopened box all weekend....

Anyway, just as an FYI to anyone interested, I did put the rad for my single operating 295x2 in push/pull with some Noctua NF-P12s I had lying around. I ran them both off a Y-splitter from the card itself, and the card had no problems powering both fans. Even so, stress testing with FurMark revealed that the card heated up much faster using my push/pull configuration than with the stock cooling fan. I have the radiator mounted at the top of my HAF 932 exhausting out of the case, with two 140mm front intake fans, a 240mm side intake fan, and my CPU closed loop exhausting out the back in push/pull through its own 120mm rad. The 295x2 reaches steady state under full GPU load at 68°C both with the Noctua push/pull and with the stock fan, but it heated up much faster with the push/pull. I should also mention that I tried connecting the Noctuas directly to the power supply to ensure that I wasn't underpowering them through the Y-splitter, but that produced the same result. YMMV


----------



## steezebe

Quote:


> Originally Posted by *ReV2ReD*
> ...
> Anyway, just as an FYI to anyone interested, I did put the rad for my single operating 295x2 in push/pull with some Noctua NF-P12s I had laying around. I ran them both off a Y-splitter from the card itself, and the card had no problems powering both fans. Even so, stress testing with Furmark revealed that the card heated up mush faster using my push/pull configuration than with the stock cooling fan. ...


A push/pull heated up faster than just a push... what? Something's not right...


----------



## kayan

Hmm, so I am unable to overclock my 295x2 :/ When I bump anything up in CCC I get artifacts at anything above stock. Also, when benching, 3DMark returns an error stating, "Time measurement data not available..."

It also shows me as having 0GB of RAM, along with a 0 turbo clock. Scores also dropped when trying to overclock, which, judging by my experience with 2 x 290Xs, usually means a lack of voltage. Should I go back to AB or Trixx?

Edit: Whoops, forgot the link. http://www.3dmark.com/3dm/5830629?


----------



## Mega Man

Quote:


> Originally Posted by *ReV2ReD*
> 
> So, UPS had my second 295x2 waiting on me when I got home from work on Friday, but they didn't have my new Evga 1600W power supply to feed these monsters in Crossfire (Quadfire). It's been wonderful just staring longingly at my second card's unopened box all weekend....
> 
> Anyway, just as an FYI to anyone interested, I did put the rad for my single operating 295x2 in push/pull with some Noctua NF-P12s I had laying around. I ran them both off a Y-splitter from the card itself, and the card had no problems powering both fans. Even so, stress testing with Furmark revealed that the card heated up mush faster using my push/pull configuration than with the stock cooling fan. I have the radiator mounted at the top of my HAF 932 exhausting out of the case with two 140mm front intake fans, a 240mm side intake fan, and my CPU closed loop exhausting out the back in push/pull through it's own 120mm rad. The 295x2 reaches steady state under full GPU load at 68C both with the Noctua push/pull, and the stock fan, but it heated up much faster with the push/pull. I should also mention that I tried to connect the Noctuas directly to the power supply as well to ensure that I wasn't underpowering them through the Y-splitter, but that provided the same result. YMMV


Possibly you are using lower static pressure fans; my AP-30s have no issues cooling it (NOT connected to the card, as they draw 3A).


----------



## Mirdain

Went ahead and put my CPU and MOSFETs under water.

4790K at 4.6/4.8GHz.

I've been slowly upgrading my water setup, and I'm very happy since my water temps don't really get above 42°C.

The extra 140mm rad really helped, though I had a hell of a time with air bubbles (even before the upgrade).

I spent a long time trying to get them out, then I had to put a drop of soap in.

Wow... wish I'd done that half a year ago, as now it's dead silent and temps came down another 5 degrees.

My mobo is around 29-32°C since almost all the heat is pulled outside the case.

Hahaha, I'm making my 295x2 work... now, after stress testing my mobo/CPU at the same time, I can't get the
hottest GPU core above 69°C.


----------



## ReV2ReD

Quote:


> Originally Posted by *Mega Man*
> 
> possible you are using lower static pressure fans my ap30s have no issues cooling it ( NOT CONNECTED TO CARD AS IT IS 3A)


This was the conclusion I came to as well. I just had the Noctuas already lying around, so I decided to give them a try. The card has never hit higher than 68°C, so I'm in no real rush for a better cooling solution atm. I'll definitely look into a pair of AP-30s in the future, though. Out of curiosity, what was your delta between your push/pull and the stock rad fan?


----------



## Mega Man

My AP-30s run at 4250rpm; judging by the Noctuas you have, you won't like them. Mine are PWM modded.

Running Eyefinity they seldom break 50°C.


----------



## ReV2ReD

Wow, 50°C is awesome! I'm guessing at 4250rpm those fans are pretty loud?

Generally, noisy fans aren't really a problem for me as I game with headphones. My wife, however, might not appreciate my rig sounding like an industrial blender.

Also, I ran a quick online search for AP-30s, and it looks like Scythe has discontinued them. All I can find from Scythe now are much lower rpm (and much lower cfm) fans. Does anyone know of some current-model high-rpm, high-cfm fans for push/pull radiator duty?


----------



## kayan

Quote:


> Originally Posted by *ReV2ReD*
> 
> Wow, 50C is awesome! I'm guessing at 4250rpm, those fans are pretty loud?
> 
> Generally, noisy fans aren't really a problem for me as I game with headphones. My wife, however, might not appreciate my rig sounding like an industrial blender.
> 
> Also, I ran a quick online search for AP-30s , and it looks like Scythe has discontinued them. All I can find from Scythe now are much lower rpm (and much lower cfm) fans. Does anyone know of some current model high-rpm, high-cfm fans for push/pull radiator duty?


You don't necessarily need high-rpm or high-cfm fans for radiators. The most important spec is static pressure; that's what determines how much air actually gets pushed through the rad. I use some NZXT high-SP fans on my rads, and they work well. I'm at work now, but I believe it's their FX line of fans.


----------



## ReV2ReD

I think I read somewhere ages ago, while doing some fairly light internet case-fan research, that 3.0 mm H2O minimum is needed for a successful push/pull setup. After a look online at my Noctua NF-P12s, the specs show that each fan is capable of 1.68 mm H2O. The Gentle Typhoon AP-30s that Mega Man is using put out a whopping 9.652 mm H2O each. It would be truly sweet to find some lower-rpm (quieter) fans that can perform half as well as those AP-30s.
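For anyone comparing datasheets, here's a trivial sketch of the comparison being made above. The pressure figures are the ones quoted in this post; the 3.0 mm H2O cutoff is just the forum rule of thumb, not a hard spec, and push/pull pressures don't simply add, so treat this as a rough sanity check only.

```python
# Static pressure specs (mm H2O) as quoted above, compared against the
# ~3.0 mm H2O forum rule of thumb for dense radiators.
FAN_STATIC_PRESSURE = {
    "Noctua NF-P12": 1.68,
    "Scythe GT AP-30": 9.652,
}
RULE_OF_THUMB = 3.0  # mm H2O; a rough guideline, not a hard spec

for name, pressure in FAN_STATIC_PRESSURE.items():
    verdict = "meets" if pressure >= RULE_OF_THUMB else "falls short of"
    print(f"{name}: {pressure} mm H2O {verdict} the {RULE_OF_THUMB} mm H2O guideline")
```

By that crude yardstick the NF-P12s fall short and the AP-30s clear it three times over, which lines up with the temperatures people are reporting in this thread.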


----------



## Roxycon

Anyone here using four or more displays on their GPU?


----------



## Elmy

Quote:


> Originally Posted by *Roxycon*
> 
> anyone here using four or more display's on their gpu?







Here is a video of my setup. This was a video taken at ExtravaLANza @ AMD Headquarters about 8 months ago or so. Still running the same setup right now.


----------



## HoneyBadger84

Quote:


> Originally Posted by *TheReciever*
> 
> Wouldnt 4 gpus struggle at that resolution? Its rather small...


Indeed it is, but I don't plan on upgrading resolution too soon. I might go with an Eyefinity 3x1080p setup, but I doubt it, since I'd need two more 144Hz (or at least 120Hz) monitors to get the most out of it, so to speak. I think the main issue when running smaller resolutions is CPU throughput, and I'm not too hot on a platform upgrade yet, though I'm sure X99 would net me 10-15% gains in games where CPU throughput is the bottleneck, which I'm sure it is in at least some games that otherwise support QuadFire and do scale with it, just not ideally performance-wise.
Quote:


> Originally Posted by *ReV2ReD*
> 
> Hello all,
> 
> I recently picked up a XFX 295x2 to replace my GTX 680s SLI. So far, the card is running quite well in my aging 2500k system with the exception of a select few games that just refuse to play nice with Crossfire.
> 
> Anyway, I booted up the Battlefield: Hardline (don't judge) "beta" yesterday along with MSI Afterburner and noticed that Hardline rarely uses more than 60% of either core. As a result the game sometimes drops to around 37 fps in 6048 x 1080 on the Ultra setting. This certainly never happens in BF4 where it rarely ever dips below 80 fps maxed out with each core being 100% utilized.
> 
> Is anyone else with a 295x2 seeing this? Is the game just not optimized for Crossfire yet?
> 
> System Specs are (sorry I can't create a rig list because @ work):
> i5 2500k (OC'd to 4.5)
> ASRock Z77 Extreme 4
> XFX 295x2
> Kingwin Mach 1 1220W PSU
> 12GB Mushkin DDR3 1600
> Intel 730 Seris SSDs (RAID 0)


One would assume that since it's a beta, it's not fully optimized yet... it could also be that the newer programming in the updated engine Hardline uses is a bit more CPU intensive, and you're finally experiencing issues with that aging 2500k/Z77 setup.

After playing Watch_Dogs (again) and its expansion all the way through, I have to say that while it does run a lot better than it did at release, it still has issues that haven't been solved. Overall I enjoyed it, and the performance from QuadFire, even though I was only at 1080p, was quite nice. The only time I ever went below 80FPS was when driving, which is one of the aforementioned issues that was never fully "fixed".

I'll be moving on; gonna finally have a crack at Hitman: Absolution, and I plan on replaying both Metro 2033 and Last Light, both of which scale stupidly well on this setup even at this low a resolution.


----------



## Mega Man

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReV2ReD*
> 
> Wow, 50C is awesome! I'm guessing at 4250rpm, those fans are pretty loud?
> 
> Generally, noisy fans aren't really a problem for me as I game with headphones. My wife, however, might not appreciate my rig sounding like an industrial blender.
> 
> Also, I ran a quick online search for AP-30s , and it looks like Scythe has discontinued them. All I can find from Scythe now are much lower rpm (and much lower cfm) fans. Does anyone know of some current model high-rpm, high-cfm fans for push/pull radiator duty?
> 
> 
> 
> You don't necessarily need high rpm or high cfm fans for radiators. The most important is the static pressure number. That is what determines how much overall air gets pushed through the rad. I use some NZXT high SP fans as my rad fans, and they work well. I'm at work now, but I believe it's the FX line of fans.
Click to expand...

I recommend reading this:
http://martinsliquidlab.org/2013/02/18/why-static-pressure-max-flow-specs-are-poor-measures-of-fan-performance/
Quote:


> Originally Posted by *ReV2ReD*
> 
> Wow, 50C is awesome! I'm guessing at 4250rpm, those fans are pretty loud?
> 
> Generally, noisy fans aren't really a problem for me as I game with headphones. My wife, however, might not appreciate my rig sounding like an industrial blender.
> 
> Also, I ran a quick online search for AP-30s , and it looks like Scythe has discontinued them. All I can find from Scythe now are much lower rpm (and much lower cfm) fans. Does anyone know of some current model high-rpm, high-cfm fans for push/pull radiator duty?


Quote:


> Originally Posted by *ReV2ReD*
> 
> I think I read somewhere ages ago while doing some fairly light internet case fan research that 3.0 mm H20 minimum is needed for a successful push/pull setup. After a look online at my Noctua NF-P12s, the specs show that each fan is capable of 1.68 mm H2O. The Gentle Typhoon AP-30s that Mega Man is using put out a whopping 9.652 mm H2O each. It would be truly sweet to find some lower rpm (quieter) fans that can perform half as well as those AP-30s.


The awesome thing is, if you PWM mod them (I wanted the AP-31s, but at the time I could not find them), i.e. you solder on the PWM wire, you get a PWM fan that goes from ~1k rpm to max speed, and at 1k they are quite quiet. Most of the GTs are also quite readily available (the AP-15 is the low-speed, non-PWM-moddable one; the AP-30 and AP-31 off the top of my head, and I've seen the others pretty commonly as well, but they are not cheap). I can point you to where to go.

They don't get that high in rpm as they are hooked up to my AQ6.


----------



## ReV2ReD

Thanks again Mega Man!

So I finally got my second 295x2 and my new EVGA 1600GS power supply installed last night, and I must say that I am not impressed so far. While it did improve some games like Crysis 3 and Far Cry 4, the addition of the second card has made many of the games that I frequent unplayable. Details follow:

Resolution: 6048 x 1080 Eyefinity

Crysis 3: Maxed-out settings. FPS doubled from ~33 to a stable ~62. There is a short period of stuttering when the game loads a map, but it quickly stabilizes and is ultimately an enjoyable experience.

Battlefield 4: Maxed-out settings (Ultra). Mantle shows a large increase in fps (~170 fps over the single card's ~80 fps), but the flashing textures make the game unplayable. DX11 cures the flashing, but then there is no appreciable difference in frame rates from a single 295x2. The 295x2 Quadfire article on HardOCP shows that Quadfire does not scale well in BF4, so this was not unexpected.

Far Cry 3: Maxed out settings. Second 295x2 causes an FOV problem that makes the game unplayable. The problem is not present with a single 295x2.

Far Cry 3: Blood Dragon: Maxed out settings. Second 295x2 causes highly increased load times and massive in-game stuttering making the game unplayable. Neither issue is present with a single 295x2.

Shadow of Mordor: Maxed-out settings. Very slow load times with unresponsive and jittery menus. Gameplay suffers from extreme stuttering and no appreciable fps gain over the single card. The stuttering often causes the game to drop to single-digit fps. None of these problems exist with a single 295x2.

Far Cry 4: While the second 295x2 does not provide an appreciable increase to fps over the single card (~45 - 60 fps with max settings), the flashing that had been present when forcing the game to run in Crossfire is no longer present. The game is largely playable, but the addition of the second card does cause the menus to stutter a bit. Please note that I do understand that AMD does not have an official Crossfire profile for this game.

Tomb Raider: Excellent scaling! Nearly double the fps (avg 140s) with all of the eye candy turned on. Minimal menu stuttering.

As far as troubleshooting goes:
I've uninstalled the 14.12 drivers with DDU and reinstalled them.

I've connected one of the 295x2s to my former 1220W power supply (this power supply previously powered the single 295x2 before I installed the second card for Quadfire). No change. This is not a power issue.

I've tried just about every CCC game configuration that I could think of for each of the games that are causing me issues.

When I get home from work today, I need to attempt to run the new card by itself to ensure that I didn't get a dud card, but I don't really know what to try beyond that. Has anyone else out there experienced similar issues with their 295x2 Quadfire setup?

I have to admit that I'm skeptical at this point. I'm thinking I may soon return to Team Green with a pair of GTX 980s.

Thanks, and sorry for the wall o' text.


----------



## TheReciever

Maybe try trifire and see what differences there could be?

Just taking a stab in the dark


----------



## ReV2ReD

Quote:


> Originally Posted by *TheReciever*
> 
> Maybe try trifire and see what differences there could be?
> 
> Just taking a stab in the dark


I appreciate the suggestion, but the current AMD drivers don't allow 295x2 Quadfire users to specify the number of GPUs to use. When I turn off Crossfire in CCC, only one GPU from the main card is active, and when Crossfire is turned on, all 4 GPUs are active. As far as the drivers are concerned, the second card is either on or off. It seems to be kind of an "all or nothing" situation.

I don't have a single 290 to test out a more standard trifire configuration.


----------



## Orivaa

Quote:


> Originally Posted by *ReV2ReD*
> 
> Thanks again Mega Man!
> 
> So I finally got my second 295x2 and my new EVGA 1600GS power supply installed last night, and I must say that I am not impressed so far. While it did improve some games like Crysis 3 and Far Cry 4, the addition of the second card has made many of the games that I frequent unplayable. Details follow:
> 
> Resolution: 6048 x 1080 Eyefinity
> 
> Crysis 3: Maxed out settings. FPS doubled from ~33 to stable ~62. There is a short period of stuttering when the game loads a map, but it quickly stablizes and is an ultimately enjoyable experience.
> 
> Battlefield 4: Maxed out settings (Ultra). Mantle shows a large increase in fps (~170 fps over the single card's ~80 fps), but the flashing texutres makes the game unplayable. DX11 cures the flashing, but there is no appreciable difference in frame rates from a single 295x2. The 295x2 Quadfire article on HardForum (HardOCP) shows that Quadfire does not scale well with BF4, so this was not unexpected.
> 
> Far Cry 3: Maxed out settings. Second 295x2 causes an FOV problem that makes the game unplayable. The problem is not present with a single 295x2.
> 
> Far Cry 3: Blood Dragon: Maxed out settings. Second 295x2 causes highly increased load times and massive in-game stuttering making the game unplayable. Neither issue is present with a single 295x2.
> 
> Shadow of Mordor: Maxed out settings. Very slow load times with unresponsive and jittery menus. Gameplay suffers from extreme stuttering and no appreciable fps from the single card. The stuttering often causes the game to drop to single-digit fps numbers. None of these problems exist with a single 295x2.
> 
> Far Cry 4: While the second 295x2 does not provide an appreciable increase to fps over the single card (~45 - 60 fps with max settings), the flashing that had been present when forcing the game to run in Crossfire is no longer present. The game is largely playable, but the addition of the second card does cause the menus to stutter a bit. Please note that I do understand that AMD does not have an official Crossfire profile for this game.
> 
> Tomb Raider: Excellent scaling! Nearly double the fps (avg 140s) with all of the eye candy turned on. Minimal menu stuttering.
> 
> As far as trouble shooting:
> I've uninstalled the 14.12 drivers with DDU and resinstalled them.
> 
> I've connected one of the 295x2s to my former 1220W power supply (this power supply previously powered the single 295x2 before I installed the second card for Quadfire). No change. This is not a power issue.
> 
> I've tried just about every CCC game configuration that I could think of for each of the games that are causing me issues.
> 
> When I get home from work today, I need to attempt to run the new card by itself to ensure that I didn't get a dud card, but I don't really know what to try beyond that. Has anyone else out there experienced similar issues with their 295x2 Quadfire setup?
> 
> I have to admit that I'm skeptical at this point. I'm thinking I may soon return to Team Green with a pair of GTX 980s.
> 
> Thanks, and sorry for the wall o' text.


Neither QuadFire nor Quad SLI works in the vast majority of games. You would likely experience similar problems with four GTX 980s. It's not only an AMD problem.


----------



## TheReciever

Oh wow, thats discouraging...


----------



## tsm106

I don't really have issues with quadfire. Though you 295 guys using two cards are pimping not one but two PLX chips ya know?

Quote:


> Far Cry 3: Blood Dragon: Maxed out settings. Second 295x2 causes highly increased load times and massive in-game stuttering making the game unplayable. Neither issue is present with a single 295x2.


Blood Dragon is from a different studio, and it fully supports quad GPU. It's strange that you cannot get it working.

Quote:


> Far Cry 4: While the second 295x2 does not provide an appreciable increase to fps over the single card (~45 - 60 fps with max settings), the flashing that had been present when forcing the game to run in Crossfire is no longer present. The game is largely playable, but the addition of the second card does cause the menus to stutter a bit. Please note that I do understand that AMD does not have an official Crossfire profile for this game.


I've gotten FC4 to play quite nicely using the AFR-friendly profile; in fact it's silly smooth, and scaling is not as bad as I expected.

Quote:


> Shadow of Mordor: Maxed out settings. Very slow load times with unresponsive and jittery menus. Gameplay suffers from extreme stuttering and no appreciable fps from the single card. The stuttering often causes the game to drop to single-digit fps numbers. None of these problems exist with a single 295x2.


Another game that should run well but doesn't on your rig. I notice that half the time it seems your second card is more dead weight than active.


----------



## ReV2ReD

Quote:


> Originally Posted by *tsm106*
> 
> Another game that should run well but doesn't on your rig. I notice that half the time it seems your second card is more dead weight than active.


Yep, it's really been disappointing. I didn't get around to trying the second card on its own to check for functionality, but I'm beginning to wonder if I got a dud card. That will be the first thing I do when I get home from work today. In the meantime, since I'm stuck here at work, I was wondering if my experience is normal for Quadfire. Sounds like it is not.


----------



## tsm106

Quote:


> Originally Posted by *ReV2ReD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Another game that should run well but doesn't on your rig. I notice that half the time it seems your second card is more dead weight than active.
> 
> 
> 
> Yep, it's really been disappointing. I didn't get around to trying to run the second card on its own to check for functionality, but I'm beginning to wonder if I got a dud card. That will be the first thing that I do when I get home from work today. I was wondering in the meantime, since I'm stuck here at work, if my experience is normal to Quadfire. Sounds like it is not.
Click to expand...

If you haven't confirmed QC on both cards, I would do that first; it's a crucial first step in multi-GPU setups. There are limitations to quad GPU, so we have to be realistic: don't expect great support in a game that is not coded to run quads, etc. Also, your system config/parts can have a large impact. Btw, what is your setup like? You should fill out your rig specs.


----------



## ReV2ReD

Quote:


> Originally Posted by *tsm106*
> 
> If you haven't confirmed QC on both cards, I would do that first, it's crucial first step in multi gpu setups. There are limitations to quad gpu so we have to be realistic like don't expect great support in a game that is not coded to run quads, etc. Also, your system config/parts etc can have a large impact. Btw, what is your setup like? You should fill out your rig specs.


Very true, but it sounds like my setup is having issues in games where others are not. Also, I really do need to get my specs up, but for some reason they won't post here at work (a lot of stuff is blocked, i.e. I can't see anyone else's system specs here).

I have an i5 2500k OC'd to 4.5 Ghz on a ASRock Z77 Extreme 4 mobo with 12 GB of Mushkin RAM.

It's definitely an older system, and with Sandy Bridge only supporting PCIe 2.0, I wonder if I'm clogging up the lanes with the Quadfire setup. Between PCIe 2.0, an aging processor and mobo, and the PLX chips on both 295x2s, I think I may have identified a smoking gun for my stuttering issues... I just hope that's not the case, because I really like my 2500k. It's a fighter. (Plus, X99 and 2011-v3 components are quite expensive compared to 1150 stuff.) I had convinced myself it wouldn't be an issue, given all the tech articles out there comparing PCIe revisions and showing very little performance difference.


----------



## tsm106

Quote:


> Originally Posted by *ReV2ReD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If you haven't confirmed QC on both cards, I would do that first, it's crucial first step in multi gpu setups. There are limitations to quad gpu so we have to be realistic like don't expect great support in a game that is not coded to run quads, etc. Also, your system config/parts etc can have a large impact. Btw, what is your setup like? You should fill out your rig specs.
> 
> 
> 
> Very true, but it sounds like my setup is having issues on games that others are not experiencing. Also, I really do need to get my specs up, but for some reason, they won't post here at work (they have a lot of stuff blocked, i.e. I can't see anyone else's system specs here).
> 
> I have an i5 2500k OC'd to 4.5 Ghz on a ASRock Z77 Extreme 4 mobo with 12 GB of Mushkin RAM.
> 
> It's definitely an older system, and with Sandy Bridge architecture only supporting PCIe 2.0, I wonder if I'm clogging up the lanes with the Quadfire setup. Between PCIe 2.0, and aging processor and mobo, and the PLX chips on both 295x2s, I'm thinking I may have identified a smoking gun for my stuttering issues...I just hope that's not the case because I really like my 2500k. It's a fighter. (Plus the X99 and 2011v3 components are quite expensive compared to 1150 stuff). I had convinced myself that it wouldn't be an issue with all of the tech articles out there comparing PCIe revisions and showing very little performance differences.
Click to expand...

Dude, you have only 16 lanes of PCIe 2.0, and you have to share that with 4 GPUs. It's no wonder your cards are extremely bandwidth starved. These cards communicate via the PCIe bus, and your setup has a lot of compromises on the PCIe bus. At a minimum you should go to an Ivy Bridge CPU; that way you can at least double the bandwidth and allow everything to run on the same PCIe spec.
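To put rough numbers on it, here's a back-of-the-envelope sketch of the per-GPU host bandwidth in that layout. The per-lane figures are the commonly cited effective rates after encoding overhead; the even x8/x8 split and even PLX sharing are worst-case assumptions (a real PLX switch can burst the whole slot to one GPU), so this is an illustration, not a benchmark.

```python
# Back-of-the-envelope host bandwidth per GPU (GB/s, one direction).
# Effective per-lane throughput after encoding overhead:
# PCIe 2.0 uses 8b/10b (~0.5 GB/s/lane); PCIe 3.0 uses 128b/130b (~0.985 GB/s/lane).
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985}

def per_gpu_bandwidth(gen: str, lanes_per_slot: int, gpus_per_slot: int) -> float:
    """Worst-case host bandwidth per GPU, assuming the card's PLX switch
    shares the slot's lanes evenly between the GPUs behind it."""
    return PER_LANE_GBPS[gen] * lanes_per_slot / gpus_per_slot

# Sandy Bridge quadfire: 16 CPU lanes split x8/x8, two GPUs behind each PLX.
sandy = per_gpu_bandwidth("2.0", 8, 2)  # 2.0 GB/s per GPU
# Same layout on Ivy Bridge at PCIe 3.0:
ivy = per_gpu_bandwidth("3.0", 8, 2)    # ~3.94 GB/s per GPU
print(f"Sandy Bridge: {sandy:.1f} GB/s per GPU; Ivy Bridge: {ivy:.2f} GB/s per GPU")
```

Roughly 2 GB/s per GPU on Sandy Bridge versus nearly double that on a PCIe 3.0 platform, which is the bandwidth doubling being suggested above.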


----------



## TheReciever

2.0 I believe is fine, but you're running at x8 and not the full x16, if I'm not mistaken, which was one of the main advantages of X58 when they were compared side by side.


----------



## tsm106

Quote:


> Originally Posted by *TheReciever*
> 
> 2.0 I believe is fine but your running at 8x and not full x16 *if im not mistaken* which was one of the main advantages of x58 when they were compared side by side


No, you are mistaken. SB has 16 lanes of PCIe 2.0 off the CPU, unlike X58.


----------



## TheReciever

Over multiple lanes?

I thought it was x8/x8 when in SLI/XFIRE mode.


----------



## evmota21

R9 295x2 user here.

Since I got my card (6 months ago), I've had several issues with Crossfire, which is normal. Last night I tried to bump my resolution scaling in BF4 to 150% (Mantle). No problems, everything cool. Btw, my card is watercooled with an EK block. When I try to push it to 170%, just after pressing save settings in BF4, fps drops to 10, then after some seconds (5-10) it stabilizes. After some 10 to 30 minutes of playing, stuttering starts and the game crashes with a red screen. Temps are 45-50°C.

My guess is that the VRMs are getting too hot, causing my card to crash. I have tried several OC configurations. I am sure the card crashes in BF4 even at stock settings (1018 core and 1250 mem).

Any ideas?

PS: I would also like to know any way to monitor my vrm temps.

EDIT 1: Got a 1000W EVGA PSU, and BF4 has still crashed at stock settings with resolution scaling anywhere from 170% up to 200%.


----------



## Elmy

My quad setup has worked in most of the games I play with minimal performance issues; I just have to rename the weaponchunks.sb file in BF4 and Hardline to get rid of the stutter in those two games. I've already notified DICE and AMD about it. I think it's a DICE issue though, because my buddy with quad 980s has the exact same issue.


----------



## evmota21

Quote:


> Originally Posted by *Elmy*
> 
> My quad setup has worked in most of the games i play with minimal performance issues i just have to rename the weaponchunks.sb file in BF4 and Hardline to get rid of the stutter in those 2 games. Ive already notified DICE and AMD about it. I thinks its a DICE issue though because my buddy with quad 980's has the exact same issue.


Where's that weaponchunks.sb file? What do I have to edit?


----------



## joeh4384

Quote:


> Originally Posted by *evmota21*
> 
> R9 295x2 user here.
> 
> Since I got my card (6 months), I had several issues with crossfire, which is normal. Last night, I tried to bump my resolution scaling with BF4 to 150% (Mantle). No problems, everything cool. Btw my card is watercooled with an EK block. When I try to push it to 170%, just after pressing save settings on BF4, fps drops to 10, THEN AFTER some seconds (5-10), it stabilizes. After some 10 to 30 minutes playing, stuttering would start and the game would crash with a red screen. Temps are 45-50.
> 
> My guess is that the VRMs are getting too hot, causing my card to crash. I have tried several OC configurations. I am sure that even at stock settings (1018 clock and 1250 mem) the card crashes with BF4.
> 
> Any ideas?
> 
> PS: I would also like to know any way to monitor my vrm temps.
> 
> EDIT 1: Got a 1000W EVGA PSU and BF4 HAS crashed at stock settings with 170% resolution scaling up to 200%.


I bought a cheap IR temp gun from Amazon and just point it at the card. I experienced VRM throttling/overheating when I tried to run a 290x below my card: it would work great, then just drop the clock to 300MHz for seconds at a time. Without the 290x, I have had no such issues when cranking the card.


----------



## Elmy

Quote:


> Originally Posted by *evmota21*
> 
> Where's that weaponchunks.sb file? What do I have to edit?


It's in the BF4 data folder. I renamed mine to "i hate lag".

Just be aware that about every third game in BF4 hangs when loading the map.

It's annoying, but it's the only solution right now.

It's like night and day after renaming that file.
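For anyone who wants to script the rename (and keep a backup so it's easy to revert after a patch), here's a minimal, hypothetical Python sketch. The BF4 data folder location varies by install, so it's passed in; this just illustrates the rename, it is not an official fix.

```python
# Hypothetical sketch of the workaround described above: renaming
# weaponchunks.sb out of the way so the game skips it.
from pathlib import Path

def disable_weaponchunks(bf4_data_dir: str) -> bool:
    """Rename weaponchunks.sb to weaponchunks.sb.bak; return True if renamed."""
    chunk = Path(bf4_data_dir) / "weaponchunks.sb"
    if not chunk.exists():
        return False  # already renamed, or wrong folder
    # Keep the original next to it so the change is easy to revert.
    chunk.rename(chunk.parent / (chunk.name + ".bak"))
    return True
```

Run it once against your BF4 data folder; running it again is a no-op until a patch restores the file.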


----------



## evmota21

Thanks for the tip, but I still don't think that would fix the resolution-scaling crashes. Anyhow, let me test it and I'll get back.


----------



## ReV2ReD

Quote:


> Originally Posted by *tsm106*
> 
> Dude, you have only 16 lanes at pcie 2.0 and you have to share that with 4 gpus. It's no wonder your cards are extremely bandwidth starved. These cards communicate via the pcie bus, and your setup has a lot of compromises on the pcie bus. At a minimum you should go to an Ivy cpu, that way you can at least double the bandwidth and allow everything to run on the same pcie spec.


Yep, SB is limited to 16 lanes of PCIe 2.0, which is limited further by my motherboard running two cards at 8x/8x (or one card at 16x). I've never seen a SB or IB mobo that can run a true 16x/16x without a PLX-type multiplexer chip, as 16 lanes is a limitation of the processor itself. If I were to upgrade to an IB processor, it would unlock PCIe 3.0 on my mobo, but still only run at PCIe 3.0 8x/8x.

FWIW, I was aware of the PCIe limitations before I bought the cards. I just convinced myself it wouldn't be a problem, as GPUs haven't really been able to saturate PCIe 2.0 for quite some time. It appears these 295x2s in Quadfire, with their PLX chips, may be doing just that.

I've really been fighting upgrading my mobo/processor since that 2500k has been such a wonderful overclocker and all-out underdog street fighter, but it looks like the time is nigh. I've had more fun tweaking that little i5 than with any other processor in my 20 year history of building rigs. Do you think the simple upgrade from SB to IB will cure me of the stutters, or do I need to go whole-hog on 1150 or 2011-v3?
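For context on why the IB upgrade matters, here's a rough back-of-the-envelope sketch of per-slot bandwidth. The per-lane figures are the usual approximations (~500 MB/s for PCIe 2.0, ~985 MB/s for PCIe 3.0 after 128b/130b encoding), not measured numbers.

```python
# Approximate one-direction per-lane throughput after encoding overhead.
PER_LANE_MBPS = {"2.0": 500, "3.0": 985}

def slot_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Approximate slot bandwidth in GB/s for a given PCIe gen and lane count."""
    return PER_LANE_MBPS[gen] * lanes / 1000

# SB at x8/x8 gives each card ~4 GB/s; an IB upgrade roughly doubles that.
for gen, lanes in [("2.0", 8), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{slot_bandwidth_gbps(gen, lanes):.2f} GB/s")
```

So the SB-to-IB swap takes each x8 slot from roughly 4 GB/s to roughly 8 GB/s, while a true x16 3.0 slot (X79/X99 territory) is roughly 16 GB/s per card.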


----------



## evmota21

Btw regarding PCI Lanes, I have my r9 295x2 on PCIE 3.0 and one gtx 980 on the other PCIE slot (Maximus VII HERO). Will this config slow down my r9 295x2?


----------



## tsm106

Quote:


> Originally Posted by *ReV2ReD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Dude, you have only 16 lanes at pcie 2.0 and you have to share that with 4 gpus. It's no wonder your cards are extremely bandwidth starved. These cards communicate via the pcie bus, and your setup has a lot of compromises on the pcie bus. At a minimum you should go to an Ivy cpu, that way you can at least double the bandwidth and allow everything to run on the same pcie spec.
> 
> 
> 
> Yep, SB is only limited to 16 lanes of PCIe 2.0 which is limited further by my motherboard that runs two cards at (8x/8x, or one card at 16x). I've never seen a SB or IB mobo that can run true 16x/16x without a duplex chip as 16x is a limitation of the processor itself. If I were to upgrade to an IB processor, it would unlock PCIe 3.0 on my mobo, but still only run at PCIe 3.0 8x/8x.
> 
> FWIW, I was aware of the PCIe limitations before I bought the cards. I just convinced myself it wouldn't be a problem as GPUs haven't really been able to saturate PCIe 2.0 for quite some time. I appears it may be the case that these 295x2s in Quadfire, with their PLX chips are doing just that.
> 
> I've really been fighting upgrading my mobo/processor since that 2500k has been such a wonderful overclocker and all-out underdog street fighter, but it looks like the time is nigh. I've had more fun tweaking that little i5 than with any other processor in my 20 year history of building rigs. Do you think the simple upgrade from SB to IB will cure me of the stutters, or do I need to go whole-hog on a 1150 or 2011v3?
Click to expand...

Practicality aside, the best route would be to feed each card an x16 PCIe 3.0 slot, but that would require going X79 or X99. X79, though, has become quite a bargain at used pricing, and imo there's no difference from X79 to X99 in games. The only place you see a gain with X99 is in synthetic benchmarks or compute, which can take advantage of the extra cores and threading; otherwise it's a gigantic exercise in epeen.

The pragmatic approach would be to go with an Ivy CPU. That way you get PCIe 3.0, doubling your current bandwidth, and the cost of a used 3770K or similar is not that bad after the profit from selling your 2500K. It's not ideal to run x8 per 295, but it sure beats running at PCIe 2.0.

Quote:


> Originally Posted by *evmota21*
> 
> Btw regarding PCI Lanes, I have my r9 295x2 on PCIE 3.0 and one gtx 980 on the other PCIE slot (Maximus VII HERO). Will this config slow down my r9 295x2?


That's not a good idea. You'll be put into the same situation as above; in other words, that extra card will halve the available bandwidth to your 295. Why do you need another card in your system? Are you doing it for the physx? There's a balance you want to achieve. There's also a difference between having to do it because you have no choice (like ReV2ReD) and, in your case, doing it of your own choice.


----------



## evmota21

Quote:


> Originally Posted by *tsm106*
> 
> That's not a good idea. You will be put into the same situation as above. In other words that extra card will half the available bandwidth to your 295. Why do you need another card in your system? Are you doing it for the physx? There's a balance you want to achieve. There's also a difference between having to do it because you have no choice (like ReV2ReD) and in your case doing it of your own choice.


Well, Skyrim with ENBs isn't compatible with Crossfire, unfortunately, so that's what the 980 is for. I have searched for a Crossfire fix for the NLA ENB for my 295x2, but there isn't one, I guess. I think the 980 works fine; it can max all games at 1080p, but I think my 295x2 is having problems. Also, I'm running an i7 4790k.

My question is: is it a really bad idea to have these two cards installed at the same time? Will having them increase crossfire issues?


----------



## Orivaa

Quote:


> Originally Posted by *evmota21*
> 
> Will having them increase crossfire issues?


Isn't that kind of a dumb question? If there is an issue with CrossFire, it's not like adding even _more_ cards into the mix will fix it. And since a lot of games are not made for QuadFire, yes, you will increase the amount of CrossFire-related issues. That is not to say that there's no point in having QuadFire, as in the games where it DOES work, you'll see absolutely astounding performance. And if a game doesn't work with QuadFire, there are ways to disable it for that particular game.


----------



## evmota21

Quote:


> Originally Posted by *Orivaa*
> 
> Isn't that kind of a dumb question? If there is an issue with CrossFire, it's not like adding even _more_ cards into the mix will fix it. And since a lot of games are not made for QuadFire, yes, you will increase the amount of CrossFire-related issues. That is not to say that there's no point in having QuadFire, as in the games where it DOES work, you'll see absolutely astounding performance. And if a game doesn't work with QuadFire, there are ways to disable it for that particular game.


Not trying to be rude here dude, calm down.

Obviously these cards are not in quadfire and I am not trying to run a r9 295x2 and a 980 at the same time while gaming. When I want to play Skyrim, I just switch monitor configuration and pop my 980 on. When I want to play with Crossfire (any game), change monitor config. I am not trying to run them both at the same time. I hope you read and understood what I was trying to imply. I guess if you would want to run a 980 with a 295x2 in "Quadfire" it would be Tri-SLIFIRE, lol.

EDIT: Anyway, the question is: even if the other card is disabled/not in use, will it halve the bandwidth or otherwise affect my 295x2?


----------



## Orivaa

Quote:


> Originally Posted by *evmota21*
> 
> Not trying to be rude here dude, calm down.
> 
> Obviously these cards are not in quadfire and I am not trying to run a r9 295x2 and a 980 at the same time while gaming. When I want to play Skyrim, I just switch monitor configuration and pop my 980 on. When I want to play with Crossfire (any game), change monitor config. I am not trying to run them both at the same time. I hope you read and understood what I was trying to imply. I guess if you would want to run a 980 with a 295x2 in "Quadfire" it would be Tri-SLIFIRE, lol.
> 
> EDIT: Anyway, question is, even if the other card is disabled/not in use will it half bandwith or affect my 295x2 while using it?


Sorry, I thought you were referring to having two 295x2s, not a 295x2 and a GTX 980. I apologize.


----------



## evmota21

Quote:


> Originally Posted by *Orivaa*
> 
> Sorry, thought you were referring to having 2 295x2, not a 295x2 and a 980GTX. I apologize.


No problem! I am a diehard fan of Skyrim mods; that's why I got this setup. No one has a solution for ENBs and Crossfire.


----------



## ReV2ReD

Thank you for the excellent reply, tsm106! I hadn't thought about going with X79. I really want to stay away from X99 because it seems ridiculously expensive for what it is and requires outrageously expensive DDR4 RAM. Also, just as an aside, I don't see how Z97 could be a huge upgrade over simply adding an IB processor to my current Z77 mobo. LGA 1150 still only has 16 PCIe 3.0 lanes, and, as far as I can tell, no Z97 mobo can run a TRUE 16x/16x PCIe configuration (I know some Z97 mobos have what amounts to an on-board PLX chip that virtually allows 16x/16x, but that's not the same as the true 16x/16x that X79 or X99 is capable of. Sorry if I'm wrong on that).

For now, I'm going to take the path of least resistance and try out the IB. I went ahead and ordered a new 3570k. I have no idea if it will take care of the stuttering, but it is the easiest and cheapest upgrade of all of the options. New processor should be here tomorrow, and I'll post back with the results in case anyone is curious to hear how this saga plays out.


----------



## TheReciever

I am also debating X79 for the very same reasons. I was originally planning to grab X58 on the cheap for Westmere-EP, but X79 would let me be lazy for longer into the foreseeable future.


----------



## evmota21

Probably a dumb question here but, can I disable my second PCIE 3.0 slot so my 295x2 can run @16x? Problem is that I can't take my GTX 980 out of the slot so, is there a way to just disable it and let the 295x2 run @16x?


----------



## tsm106

Quote:


> Originally Posted by *evmota21*
> 
> Probably a dumb question here but, can I disable my second PCIE 3.0 slot so my 295x2 can run @16x? Problem is that I can't take my GTX 980 out of the slot so, is there a way to just disable it and let the 295x2 run @16x?


Unplug it.

Also, if your mb has pcie dip switches, that would be best.


----------



## joeh4384

Check in your Bios.


----------



## evmota21

Quote:


> Originally Posted by *tsm106*
> 
> Unplug it.
> 
> Also if your mb has pcie dip switches that would be best.


Quote:


> Originally Posted by *joeh4384*
> 
> Check in your Bios.


Thanks very much!


----------



## steezebe

whelp ordered the EK block and backplate... single-slot dual-gpu here i come!


----------



## ReV2ReD

So the saga continues with my 295x2 Quadfire setup:

I installed the Ivy Bridge 3570k yesterday, and as promised it did provide my motherboard with PCIe 3.0 capability. This allowed both cards to run at PCIe 3.0 8x/8x on my motherboard as opposed to PCIe 2.0 8x/8x. While there has been noticeable improvement, the bulk of my problems are still there. Honestly, I really wonder how much of the problem is hardware and how much is games that just aren't meant to run on 4 GPUs (or at least 4 GPUs on two cards).

So, I'm now at a crossroads: Do I sell my system entirely with my old, trusty GTX 680s (SLI), and go for a X79 or X99 setup and hope that this irons out the majority of my issues with Quadfire, or do I keep my current setup and give up on the dream of running two 295x2s side-by-side and go crawling back to "Team Green" with a GTX 980 SLI configuration. (I don't want to sound like an Nvidia fanboy here. I honestly think that fanboyism and brand loyalty usually end up just screwing over the consumer. That said, the 980 route is particularly appealing right now compared to sticking with AMD as SLI works on games like Far Cry 4 and Crossfire does not. On the other hand, I would lose Mantle on games like Battlefield 4, and I've come to very much appreciate that API).

I would be happy to stick with my Quadfire setup if there were driver support that allowed me to enable or disable the second card as needed, but the current drivers only allow Crossfire or no Crossfire (i.e. when Crossfire is disabled, only 1 GPU from 1 card is active. When Crossfire is enabled, all 4 GPUs are active). It's an "all-or-nothing" proposition. It would be wonderful if the drivers allowed 295x2 Quadfire users to select any combination of the 4 GPUs, but that's likely asking too much.

The bottom line is this: Is there ANYONE AT ALL out there that has an X79 or X99 rig with SPECIFICALLY a 295x2 Quadfire setup? If so, I am very interested to know about your Quadfire gaming experience with Shadow of Mordor, Far Cry 3, Far Cry 4, and Battlefield 4 (I play these games most, with the biggest problems coming from Shadow of Mordor and Far Cry 3). If I could get some kind of proof that a new X79 or X99 build would provide a satisfactory (and FAST!) gaming experience with Quadfire, I will happily sink the funds into building a new system.


----------



## TheReciever

Lack of PCIe lanes and (iirc) 1080p resolution with 4 GPUs would be the red flags for me.

Though I'm sure others will have other ideas


----------



## rdr09

Quote:


> Originally Posted by *TheReciever*
> 
> Lack of PCIE lanes and (iirc) 1080p resolution with 4 GPU's would be the red flags for me.
> 
> Though Im sure others would be able to have other ideas


and an i5. For mining maybe.

Single 290 in BF4 MP 64 and an i7 @ 4.5GHz HT Off . . .



AMD is not really known for low cpu overhead.


----------



## Elmy

I run two Club3D 295X2s with Z87 and a 4770K. I play mainly BF4 @ 5400x1920 and have very few issues. I haven't tried any of the other games you have. I have played on one 1080p monitor and it has worked flawlessly @ over 200 FPS on Ultra. I'm running my 4770K @ 4.8 and my cards @ 1075/1400.


----------



## ReV2ReD

Quote:


> Originally Posted by *Elmy*
> 
> I run 2 Club3D 295X2 with Z87 and 4770K ...I play mainly BF4 @ 5400X1920 and I have very little issues. I have not tried any of the other games you have. I have played on one 1080 monitor and it has work flawlessly @ over 200 FPS in Ultra. I am running my 4770K @ 4.8 And my cards @ 1075/1400.


Thanks for the quick replies, everyone!

BF4 actually is one of the few games that doesn't give me much trouble in Quadfire. With Mantle and 6048 x 1080 resolution on Ultra, I usually get 90 - 140 fps.

With Shadow of Mordor, I get unplayable stuttering and tons of artifacting (not a temperature issue, as the GPUs and RAM are not throttling, and the temp never gets above 68C on either card). I would love to hear from anyone running two 295x2s in Quadfire with this game.

Far Cry 3 is simply unplayable on Quadfire, as it seems to have some kind of FOV problem with 4 GPUs. Blood Dragon seems to work pretty well aside from a bit of stutter.


----------



## axiumone

Quote:


> Originally Posted by *ReV2ReD*
> 
> I appreciate the suggestion, but the current AMD drivers don't allow 295x2 Quadfire users to specify the number of GPUs to use. When I turn off Crossfire in CCC, only one GPU from the main card is active, and when Crossfire is turned on, all 4 GPUs are active. As far as the drivers are concerned, the second card is either on or off. It seems to be kind of an "all or nothing" situation.
> 
> I don't have a single 290 to test out a more standard trifire configuration.


Unless something has drastically changed, that's still the case.
Quote:


> Originally Posted by *ReV2ReD*
> 
> Thanks for the quick replies, everyone!
> 
> BF4 actually is one of the few games that doesn't give me much trouble in Quadfire. With Mantle and 6048 x 1080 resolution on Ultra, I usually get 90 - 140 fps.
> 
> With Shadow of Mordor, I get unplayable stuttering and tons of artifacting (not a temperature issue as the GPUs and RAM are not throttling, and the temp never gets about 68C on either card). I would love to hear from anyone running two 295x2s in Quadfire that is running this game.
> 
> Far Cry 3 is simply unplayable on Quafire as it seems to have some kind of FOV problem with 4 GPUs. Blood Dragon seems to work pretty well aside from a bit of stutter.


Have you taken a look at how much video memory Mordor is using? I noticed that the game is an extreme memory hog. If you're hovering near or over 4gb of video memory, and I suspect that you are, you'll get massive stuttering.


----------



## ReV2ReD

Quote:


> Originally Posted by *axiumone*
> 
> Unless something has drastically changed
> Have you taken a look at how much video memory Mordor is using? I noticed that the game is an extreme memory hog. If you're hovering at near or over 4gb of video memory, and I suspect that you are, you'll get massive stuttering.


I've been wondering if that could be the culprit for Shadow of Mordor. I know the game suggests 6 GB of vRAM for the Ultra Texture Pack, but a single 295x2 runs the game fine with everything on Ultra. The addition of the second card makes the game stutter like crazy and load very slowly. All of this brings to mind a review I read on [H] about running two 295x2s in Quadfire. In the conclusion of their review, they mentioned something about a single 295x2 being too slow to suffer from the 4 GB vRAM shortcoming, but when a second is added, it becomes a clear bottleneck. I don't know if that's what's happening to me or not, but I definitely think it's a possibility. That's why I would really like someone with an X79 or X99 295x2 Quadfire setup to chime in on their system's performance in Shadow of Mordor. I want to verify or eliminate the possibility that the problem is a PCIe lane shortage.


----------



## Mega Man

I can check when I get home.


----------



## ReV2ReD

Quote:


> Originally Posted by *Mega Man*
> 
> I can check when I get home.


Thanks! I would really appreciate it!


----------



## Mega Man

although i dont buy that the 295x2 is too slow, i blame it on poor coding

in eyefinity

3930k @ 4.8 prime stable /ibyt-avx stable 2400 ram cl10

speed step ect enabled

RIVBE

30ish -40ish ( depending on settings used )

in quadfire, tested 5 different times with different settings

60-90 avg when not crossfired..

however

crossfire enabled = 4gb
disabled =2 gpu

gpus at stock


Spoiler: Warning: Spoiler!

(screenshots)

2gpu ULTRA

most pics are in no particular order

to run in eyefinity i had to up the pagefile to 36gb; i am using 32gb on average when in SoM
also my 16gb of ram is capped too
truly scary


----------



## wermad

Pulled the trigger on these bad boyz:





Ppcs.com had the blocks on sale w/ the discount ~$100 a pop. Just need to find one more 295x2 and get a new psu (probably lepa 1600 or enermax 1500).

Sammy 4k just shipped. lots of time to prep for other things.


----------



## Orivaa

Quote:


> Originally Posted by *wermad*
> 
> Pulled the trigger on these bad boyz:
> 
> 
> 
> 
> 
> Ppcs.com had the blocks on sale w/ the discount ~$100 a pop. Just need to find one more 295x2 and get a new psu (probably lepa 1600 or enermax 1500).
> 
> Sammy 4k just shipped. lots of time to prep for other things.


The Lepa 1600 is terrible; don't get it.


----------



## wermad

Quote:


> Originally Posted by *Orivaa*
> 
> The Lepa 1600 is terrible; don't get it.


Uhm, no. It's a great unit. I bought one off tsm106, and you know he loves to push his stuff to the limit (he switched to dual units for more powah!). It powered quad 7970 Lightnings with ease. If it's tsm106 certified, you know it will do the job







. Also, JG recommended (9.5/10) so I trust Oki-Wolf's opinion too.

I need a psu under 200mm in length, and it's one of the best ones out there. Not a big fan of their cables, but the revised one does have ribbon cables for the vga and accessory lines. Everything else (besides the Enermax 1500W Gold) is extremely long and won't fit.

Old rig pics:


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Orivaa*
> 
> The Lepa 1600 is terrible; don't get it.
> 
> 
> 
> Uhm, no. Its a great unit. I bought one off Tsm160, and you know he loves to push his stuff to the limit (he switched to dual units for more powah!). It powered quad 7970 Lightnings with ease. If its TSM160 certified, you know it will do the job
> 
> 
> 
> 
> 
> 
> 
> . Also, JG recommended (9.5/10) so I trust Oki-Wolf's opinion too.
> 
> I need a psu under 200mm in length and its one of the best ones out there. Not a big fan of their cables, but the revised one does have ribbon cables for the vga and accessories lines. Everything else (besides the Enermax 1500W Gold) is extremely long and i can't fit it.
> 
> Old rig pics:
Click to expand...

two footnotes

tsm does not recommend them for 290x quad

i have your old one: the PCIE cables are 16 ga. on the new ones (i have 2 others) the pcie is 18 ga. it should still work, but i wanted to let you know


----------



## wermad

He was using it for quad Tahitis, then went to Hawaii cards and sold this one to me. It's fine, since I don't oc crazy high like he does. Actually, I don't oc my gpus anymore. I'm running triple 290s with a 1kw unit. Only issue is that my psu shuts down randomly (rma time), so I might as well get a g1600 while I'm waiting.


----------



## AtomicFrost

Hello everyone!

After much deliberation and research I've decided to grab an XFX 295x2 for my new X99 build.







Hopefully I can finish my build by next week.

I was originally going to go with two 980s in SLI, but being so close to the launch of the 3xx or new Nvidia cards, it felt like I'd be better off saving ~$450-500 over the 980s. Not to mention that at the resolution I'll be using, the gap between them really closes. I'm going to be using it with a triple-monitor setup, so either 5760x1080 or 7680x1440 (still have to choose which ones to buy). Three 4k monitors won't be doable GPU-wise for several more years. Most likely I'll wait till mid-March to buy them to see what the new monitors coming out look like. Getting one that supports either G-Sync or FreeSync would be nice.

Most likely I'll throw in either another 295x2 or just an MSI 290X Lightning to help pull some more FPS, especially if I go with a 7680x1440 setup.

The last ATI/AMD card I had was a 4870, so it's been a while; it'll be fun to play around again in the red tide.


----------



## ReV2ReD

Thanks a ton Mega Man! Rep'd and I totally owe you a 6-pack of your favorite beer!

I was quickly hitting the vRAM wall with the cards in Quadfire as well, particularly on the Ultra settings. It also turned out that running Quadfire slowed down the SoM load times tremendously. I couldn't tell conclusively from the graphs, but were you also running into some crazy stutter that made the game unplayable?

(As a side note: I hadn't even thought to increase my page file size!)


----------



## Mega Man

random times, yes, but never bad and not often enough to notice

never had issues with load times though..

i still say it is that game. personally i think most "nvidia" titles with physx have issues on the amd side, and i dont think it is by accident


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> although i dont buy the 295x2 is too slow i blame it on poor coding
> 
> *in eyefinity*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3930k @ 4.8 prime stable /ibyt-avx stable 2400 ram cl10
> 
> speed step ect enabled
> 
> RIVBE
> 
> 30ish -40ish ( depending on settings used )
> 
> in quadfire tested 5 different time with different settings
> 
> 60-90 avg in not crossfired..
> 
> however
> 
> crossfire enabled = 4gb
> disabled =2 gpu
> 
> gpus at stock
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2gpu ULTRA
> 
> 
> 
> 
> 
> most pics are in no particular order
> 
> 
> 
> to run in eyefinity i had to up Pagefile to 36gb i am using 32gb on average when in SOM
> also my 16gb ram is capped too
> truely scary


Hey Mega, are you able to choose 5760x1080 as a res? Btw, I use a dedicated ssd for the pagefile. I have an old sammy 830 60gb, prolly a lil overkill, but whatever, not using it otherwise.


----------



## Mega Man

you have to edit something but yes you can
http://deadendthrills.com/forum/discussion/318/guide-middle-earth-shadow-of-mordor
Quote:


> To change aspect ratios in-game:
> 1: Open /Documents/WB Games/Shadow of Mordor/render.cfg
> 2: Set: "Render.Setting.AspectRatioLock" "0.000000"
> 3. When in-game, change to windowed mode
> 4. Change desktop resolution to desired aspect ratio
> 5. In-game, leave the video options screen (back out 1 menu)
> 6. Drag to resize the window a little bit - this forces the game to recognize the new aspect ratio
> 7. Back in the video options, you should see resolutions that scale based off your new desktop size.




yea i been thinking about starting to use an ssd solely for the PF
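If anyone wants to script step 2 of that guide, a small Python sketch along these lines should work. The key name comes from the guide above; the render.cfg path varies per machine, so it's passed in rather than hard-coded.

```python
# Sketch: force "Render.Setting.AspectRatioLock" to 0 in render.cfg,
# per the deadendthrills guide quoted above.
from pathlib import Path

def unlock_aspect_ratio(cfg_path: str) -> str:
    """Rewrite the AspectRatioLock line to 0.000000 and return the new text."""
    lines = Path(cfg_path).read_text().splitlines()
    out = []
    for line in lines:
        if "Render.Setting.AspectRatioLock" in line:
            out.append('"Render.Setting.AspectRatioLock" "0.000000"')
        else:
            out.append(line)  # leave every other setting untouched
    new_text = "\n".join(out) + "\n"
    Path(cfg_path).write_text(new_text)
    return new_text
```

Point it at /Documents/WB Games/Shadow of Mordor/render.cfg (back the file up first), then follow steps 3-7 in-game as described.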


----------



## wermad

Quote:


> Originally Posted by *AtomicFrost*
> 
> Hello everyone!
> 
> After much deliberation and research I've decided to grab an XFX 295x2 for my new X99 build.
> 
> 
> 
> 
> 
> 
> 
> Hopefully I can finish my build by next week.
> 
> I was originally going to go with two 980 in SLI, but being so close to the launch of the 3xx or new Nvidia cards; it felt like I'd be better off saving ~$450-500 over the 980's. Not to mention that at the resolution I'll be using the gap between them really closes. I'm going to be using it with a triple monitor setup so either 5760x1080 or 7680x1440 (Still have to choose which ones to buy). Three 4k won't be doable GPU wise for several more years. Most likely I'll wait till mid-march to buy them to see what the new monitors coming out look like. Getting one that supports either G-Sync or FreeSync would be nice.
> 
> Most likely I'll either throw in either another 295x2 or just a MSI 290X Lightning to help pull some more FPS. Especially if I go with a 7680x1440 setup.
> 
> The last ATI/AMD card I had was a 4870 so its been awhile so it'll be fun to play around again in the red tide.


Congrats! 4k Eyefinity is doable, even with three Hawaii cores, it can achieve some great #s. Depending on the title, quads may help. Check out Bashaa's and DeadlyDNA's "12K" Eyefinity threads. I have one incoming (xfx hydro) I'm looking for one more. Make sure you have plenty of air flow if you don't plan to custom water cool the card.

Imho, with any eyefinity/surround, ips is a must to avoid the nasty gray-out. If you wanna wait, Acer has an ips wqhd (144hz) monitor coming out. It's been generating a lot of buzz as the holy grail of monitors. I've decided not to go w/ Eyefinity this time and just go w/ a simple 4k (28") display. Once 4k prices come down, I'll look at the Acer for Eyefinity or a bigger 4k monitor (60hz).

Otherwise, grab three K-monitors (refurbs go for ~$200) and run them in Eyefinity at 7680x1440. Just make sure the monitors have DisplayPort; it's been noted these monitors will not work w/ quite a few adapters on Hawaii (you can run two dvi-d and one hdmi on a 290/290x).


----------



## PontiacGTX

Quote:


> Originally Posted by *wermad*
> 
> Congrats! 4k Eyefinity is doable, even with three Hawaii cores, it can achieve some great #s. Depending on the title, quads may help. Check out Bashaa's and DeadlyDNA's "12K" Eyefinity threads. I have one incoming (xfx hydro) I'm looking for one more. Make sure you have plenty of air flow if you don't plan to custom water cool the card.
> 
> Imho, with any eyefinity/surround, ips is a must to avoid the nasty gray out. If you wanna wait, Acer has an ips wqhd (144hz) monitor coming out. Its been generating a lot of buzz as the holy grail of monitors. I've decided not to go w/ Eyefinity this time and just go w/ a simple 4k (28) display. Once 4k comes down, i'll look for the Acer for Eyefinity or a bigger 4k monitor (60hz).
> 
> Othewise, grab three K-monitors (refurb ones go for ~$200) and run them in Eyefinity 7680x1440. Albeit, make sure the monitors have displayport. Its been noted these monitors will not work w/ quite a few adapters and Hawaii (you can run two dvi-d and one hdmi on a 290/290x).


people were already running out of vram at 7680x1440 in another thread with 4GB


----------



## wermad

Just turn down the eye candy. 5x1 1200 pushes more pixels than 3x1 1440, and my old quad 7970s had no trouble running Metro: Last Light & Crysis 3. The whole vram thing gets blown out of proportion by eager users pushing too much aa and going crazy with mods. You have to find a balance in each game for your setup, imho. Not even the most powerful setups will cover every single game at the highest possible resolution.

Found my last card. New mb & cpu arrived and ready to test

Found my last card. New mb & cpu arrived and ready to test


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Just turn down the eye candy. 5x1 1200 *pushes more pixels than 3x1 1440*, and my old quad 7970s had no trouble running metro ll & crysis 3. *The whole vram thing gets blown out of proportion over eager users pushing too much aa and going crazy with mods.* You have to find balance in each game for your setup imho. Not even the most powerful setups will cover every single game at the highest possible resolution.
> 
> Found my last card. New mb & cpu arrived and ready to test


Barely more pixels, man. Regardless, it's way over the threshold. Everyone running 11MP and higher is already over the limit, and it will only get worse, with newer AAA titles sucking down vram like it's going out of style. Not sure if you knew this, but a few of these newer AAA games can use up all your vram at 1080p, i.e. use up all 4gb. This is partly why ppl suddenly noticed that the 970 was not maxing out its full 4gb of vram. It's not just Skyrim mods. And the point is, we're on the cusp of another vram paradigm shift, going from 4gb to 6gb or maybe 8gb as a standard. I wouldn't run stupidly large resolutions just for the helluvit, because it's not fun having pixel densities that high and then having to turn down all the eye candy. It sort of defeats the point imo.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> Barely more pixels man. Regardless it's way over the threshold. Everyone running 11mp and higher is already over the limit. And it will only get worse with new and newer AAA titles sucking down vram like its going out of style. Not sure if you knew this but a few of these newer AAA games can use up all your vram at 1080p, ie. use up all 4gb vram. This is partly why ppl suddenly noticed that the 970 was not maxing out its full 4gb vram. It's not just skyrim mods. And the point is, we're on the cusp of another vram paradigm shift, going from 4gb to 6gb or maybe 8gb as a standard. I wouldn't run stupid large resolutions just for the helluvit because it's not fun having pixel densities that high then having to turn down all the eye candy. It sort of defeats the point imo.


Ah, cut the whining cus you weren't epic w/ this







. I got the same complaint from the 3x1 1440 guys all the time. 5x1 is very uncommon, more so than MMG 3x1's. It's still more:

5x1 1200: 11.52MP
3x1 1440: 11.06MP

One game uses more vram and the whole world starts to freak out. This "the sky is falling" freakout is common when games like Shadow of Mordor come out (and Skyrim). In my 5x1 setup, I was already pegged on vram regardless of high or medium settings; my frames dropped 40-60% going from medium to high. I still concentrate on gpu power, and I don't really need quad Sapphire 8gb tbh.
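The megapixel figures quoted for the two layouts can be sanity-checked with a couple of lines. This is just a sketch of the arithmetic, assuming the usual panel resolutions (1920x1200 and 2560x1440):

```python
# Sanity-check the Eyefinity pixel math from the post above.
def eyefinity_megapixels(cols, rows, width, height):
    """Total pixels pushed by a cols x rows monitor grid, in millions."""
    return cols * rows * width * height / 1_000_000

five_by_one = eyefinity_megapixels(5, 1, 1920, 1200)   # 11.52 MP
three_by_one = eyefinity_megapixels(3, 1, 2560, 1440)  # ~11.06 MP
```

So 5x1 1200 does edge out 3x1 1440, but only by about half a megapixel.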

The revolution will come w/ stacked vram imho. Only the uber cards like Titan, and AIBs extending the life of mid-life-cycle products, will continue to offer more vram. AMD and Nvidia don't care, since incremental increases make them more money in the long run. Don't ask for a Titan BE card to replace a GTS X50 because one game uses more vram in the cycle of one series.

And remember, we are part of a small minority, and when you think about it, you, Tsm, fall into an even smaller category. Not all of us are pushing for 3DMark records









I like this video:





It does give you a good perspective on real world stuff. Not what happens in the smallest of markets that we fall into.


----------



## Bludge

Guys, playing at 1440, is there any value in the 295X2 at the moment, or am I better off waiting for the 300 series? The 295X2 is still $1000 in Australia


----------



## Sgt Bilko

Quote:


> Originally Posted by *Bludge*
> 
> Guys, playing at 1440, is there any value in the 295X2 at the moment, or am I better waiting out the 3000 series? The 295X2 is still $1000 in Australia


I'm sitting around 110+ fps in nearly every game I have at Ultra settings, but rumours suggest the 390X is supposed to be ~75% of the performance of the 295x2, and it should be launching soonish (rumours, I know)

and yeah, the 295x2 is a tough pill to swallow here


----------



## AtomicFrost

Quote:


> Originally Posted by *wermad*
> 
> Congrats! 4k Eyefinity is doable, even with three Hawaii cores, it can achieve some great #s. Depending on the title, quads may help. Check out Bashaa's and DeadlyDNA's "12K" Eyefinity threads. I have one incoming (xfx hydro) I'm looking for one more. Make sure you have plenty of air flow if you don't plan to custom water cool the card.
> 
> Imho, with any eyefinity/surround, ips is a must to avoid the nasty gray out. If you wanna wait, Acer has an ips wqhd (144hz) monitor coming out. Its been generating a lot of buzz as the holy grail of monitors. I've decided not to go w/ Eyefinity this time and just go w/ a simple 4k (28) display. Once 4k comes down, i'll look for the Acer for Eyefinity or a bigger 4k monitor (60hz).
> 
> Otherwise, grab three K-monitors (refurb ones go for ~$200) and run them in Eyefinity 7680x1440. That said, make sure the monitors have DisplayPort. It's been noted these monitors will not work w/ quite a few adapters and Hawaii (you can run two dvi-d and one hdmi on a 290/290x).


Although I'd love to do Eyefinity 4k, I have a feeling that my single 295x2 wouldn't do it justice without getting at least one more 290x. I also went with the XFX Hydro version of the 295x2 because it seems like it runs cooler than the Devil version, and it was cheaper. Getting Civ: Beyond Earth and a $20 NewEgg GC was also a nice bonus. I'll be throwing it in the Corsair 900D that I still have sitting in its box. I'm going to be tearing into that box today.







My 295x2 will be here on Tuesday.

I was reading about that Acer monitor, and it seems like it has a lot of the features I'm looking for in my new monitor setup. I'm going to be mounting all three to an Obutto Revolution cockpit. I'm only worried about how much they will cost when they are released in March. The Swift monitors seemed like they would be perfect, but my X99 build blew my budget out of the water. My body couldn't say no to a 5960X.









Right now I'm looking at either going with three cheaper 60Hz 1080p monitors, like the PLS ones that Acer sells; however, it seems like the monitors would end up being the bottleneck in my system then. I'm also contemplating getting three 4k TN panels, but I'm worried about trying to drive that many pixels. I doubt that even Quadfire could pull decent frames, and the 4GB of VRAM would be murdered, along with the FPS.

I'll have to check out those threads you mentioned, and THANK YOU for the heads up about the issue with DisplayPort adapters. I was looking at some cheaper BenQ monitors that don't have DisplayPort, but now I'll cross those off the list. I also noticed that BenQ is coming out with a new FreeSync monitor which could work well.

UGH. The monitor selections are really crummy when trying to buy three at a time. The only good news is that I don't need them till the end of March so I have some time to wait and see what comes out by then.

Thanks again for the info. Hopefully both of our 295x2s work out well.


----------



## wermad

My first one shipped and should be in by the end of the week. The second may ship on Monday and hopefully arrive by Saturday. I decided to keep my 900D as a custom case would have been more money. That extra money instead went to the gpu's and a new mb/cpu.

There have been a few members who have dabbled in "12K" (3x 4k monitors in Eyefinity/Surround). One of them ran quad 290s and then 290Xs. He recommended a minimum of three Hawaiis to run 12K. It's doable, just like 3x2 1440 (six 2560x1440 monitors). A member was able to trigger 3x2 WQHD on a 7950. Performance was utter crap at the lowest possible settings (~15-18 fps), but it's something a really powerful setup can tackle. You do push fewer pixels than 12K (22M vs 25M), and you'll need one MST hub (and a multi-DisplayPort card like the 295x2).

Monitor selection can be a real headache. You have so many models to choose from, and not all of them are perfect (except maybe the upcoming Acer, speculated ~$800 USD). I had a set budget, and after consulting with a few members and reading reviews, I opted for a simple 60Hz 4k monitor for now. If the opportunity to run 3x1 1440 (probably w/ Dell or Asus IPS in portrait) comes up later on, I'll jump on it.

As far as the issue w/ DisplayPort, it's when you use a K-monitor. These are the Korean WQHDs that flood eBay. If you're going with an Eyefinity setup, make sure you opt for the DP models (basic cheaper ones only have dual-link DVI and VGA). Or if you get a 290X, for trifire, you can run dvi/dvi/hdmi granted the monitor has HDMI 1.4 support.

Just got done putting in the new mb and pulling out my old 290 triplets. Gonna catch some zzz's for now


----------



## ReV2ReD

Quote:


> Originally Posted by *Mega Man*
> 
> random times yes but never bad and was not enough times to notice
> 
> never had issue with load times though..
> 
> i still say it is that game, personally i think most "nvidia " titles with physx have issues on the amd side and i dont think it is by accident


That's good. It looks like the full-on 16x/16x PCIe lanes clears up the stuttering. I also agree it's the game, as most of my other games don't suffer to the extent of SoM. Still, most of my games are not offering up a "better all-around" experience in Quadfire than with a single 295x2 in Crossfire. Do any of the other games that you frequently play act odd in Quadfire?


----------



## PontiacGTX

Quote:


> Originally Posted by *wermad*
> 
> Just turn down the eye candy. 5x1 1200 pushes more pixels than 3x1 1440, and my old quad 7970s had no trouble running metro ll & crysis 3. The whole vram thing gets blown out of proportion over eager users pushing too much aa and going crazy with mods. You have to find balance in each game for your setup imho. Not even the most powerful setups will cover every single game at the highest possible resolution.
> 
> Found my last card. New mb & cpu arrived and ready to test


Just 0.5MP more, and people were still running vanilla games. And the current games that are console ports can require more than 3.5GB at 4K (2MP less than 3x 1440).


----------



## Mega Man

Quote:


> Originally Posted by *ReV2ReD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> random times yes but never bad and was not enough times to notice
> 
> never had issue with load times though..
> 
> i still say it is that game, personally i think most "nvidia " titles with physx have issues on the amd side and i dont think it is by accident
> 
> 
> 
> That's good. It looks like the full-on 16x/16x PCIe lanes clears up the stuttering. I also agree it's the game, as most of my other games don't suffer to the extent of SoM. Still, most of my games are not offering up a "better all-around" experience in Quadfire than with a single 295x2 in Crossfire. Do any of the other games that you frequently play act odd in Quadfire?
Click to expand...

Most titles I play, quadfire works fine. Not all, but most.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> It does give you a good perspective on real world stuff. Not what happens in the smallest of markets that we fall into.


Not really sure what you are saying, it's all over the place. I don't care about what or whoever doing whatever. I care about how I'm going to use my cards. And at 5760, I am running out of vram with the games I play. For example, Dying Light hovers around 3.7GB on any given occasion, and Mordor is pushing 14GB total (w/o hi-res textures) across quadfire (4GB x 4 = 16GB), which is 3.5GB per GPU, and so on and so forth. Do I care what some yahoo on a YouTube video says? I care what I am seeing, because I'm the one paying for it.
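Since Crossfire mirrors the frame buffer on every GPU, a "total" vram figure reported across the cards divides back down to the per-GPU footprint that actually hits the 4GB ceiling. A minimal sketch of that arithmetic, using the 14GB Mordor figure from the post:

```python
# Crossfire mirrors vram, so a reported total across N GPUs maps back
# to a per-GPU footprint, which is what matters against the 4GB cap.
def per_gpu_vram_gb(total_reported_gb, num_gpus):
    return total_reported_gb / num_gpus

mordor = per_gpu_vram_gb(14, 4)  # 3.5 GB per GPU
headroom = 4.0 - mordor          # only 0.5 GB left under the 4GB ceiling
```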


----------



## wermad

^^^If you need 12GB now, Nvidia or AMD will not care and will give you 6-8GB next year. That's how it works if you've been in the game for a while. The last card I bought before exiting the scene was a Savage S4. Came back and bought a 4870x2 + 4870, 1GB each; I was wowed (2009). I think the Savage had 64MB of vram







.

Just dropped off by the mailman, bnib







. #2 just shipped today and per the seller should be there by next week. This one was buyer's remorse: used for a day and sold for Maxwells. I totally forgot to ask if these come w/ a backplate. If they don't, I'll get some EK or HK ones. 4k monitor arriving on Friday







.



(side note: got to test my new Note 3's camera; not the latest, but sure a bit better than my old Note 2







).


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> ^^^If you need 12gb now, nvidia or amd will not care and give you 6-8gb next year. That's how it works if you been in the game for a while. My last card I bought was a Savage S4 before exiting the scene. Came back, bought 4870x2 + 4870,. 1gb each, I was wowed (2009). I think the savage had 64mb of vram
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Just dropped off by the mailman, bnib
> 
> 
> 
> 
> 
> 
> 
> . #2 just shipped today and per seller should be there by next week. This one was a buyer's remorse, used for a day and sold for Maxwells. I totally forgot to ask if these come w/ a backplate. If they don't, I'll get some ek's or hk's. 4k monitor arriving on Friday
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> (side note: got to test my new Note 3's camera; not the latest but sure a bit better then my old Note 2
> 
> 
> 
> 
> 
> 
> 
> ).


Nice!!









And yes to the backplate.....they have one from the factory









I'm waiting for DX12 and Mantle to support the "memory pooling" idea; basically, with the 295x2 as an example, you'd have 8GB of usable vram and not 4GB mirrored


----------



## wermad

^^^Yeah, that's got me more excited, the stacked memory for the future. What's the point of having quad 12GB Titan IIs when you still have 12GB overall? I think it's better to make things more efficient than to just throw more at it. It's like car engines: better and more powerful while being smaller and cleaner than the older, bigger, dirtier ones.

Thanks for the input. I'm running some errands at the moment, so I won't be able to pull the card until tonight. Gonna have to leave the rad out the door panel (removed) since I can't fit the radiator right now. I'll see how one card (and a 4690k) treat the V1000 psu, using the kill-a-watt.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> ^^^yeah, that's got me more excited, the stacked memory for the future. What's the point of having quad 12gb Titan II when you still have 12GB overall. I think its better to make things more efficient then just throw more at it. Its like car engines, better and more powerful while being smaller and cleaner then the older, bigger, dirtier ones.
> 
> Thanks for the input. I'm running some errands at the moment so I won't be able to pull the card until tonight. gonna have to leave the rad out the door panel (removed) since I can't fit the radiator right now. I'll see how one card (and a 4690k) treat the V1000 psu using the kill-a-watt).


Yup, definitely some exciting stuff coming up









V1000 will be fine with one card easy, but I am curious to see what wattage you pull


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Thanks for the input. I'm running some errands at the moment so I won't be able to pull the card until tonight. gonna have to leave the rad out the door panel (removed) since I can't fit the radiator right now. I'll see how one card (and a 4690k) treat the V1000 psu using the kill-a-watt).


I'm highly interested to see how that v1000 does when you get your second card. I'm running the same PSU and love it!


----------



## AtomicFrost

Quote:


> Originally Posted by *wermad*
> 
> ^^^If you need 12gb now, nvidia or amd will not care and give you 6-8gb next year. That's how it works if you been in the game for a while. My last card I bought was a Savage S4 before exiting the scene. Came back, bought 4870x2 + 4870,. 1gb each, I was wowed (2009). I think the savage had 64mb of vram
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Just dropped off by the mailman, bnib
> 
> 
> 
> 
> 
> 
> 
> . #2 just shipped today and per seller should be there by next week. This one was a buyer's remorse, used for a day and sold for Maxwells. I totally forgot to ask if these come w/ a backplate. If they don't, I'll get some ek's or hk's. 4k monitor arriving on Friday
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> (side note: got to test my new Note 3's camera; not the latest but sure a bit better then my old Note 2
> 
> 
> 
> 
> 
> 
> 
> ).


Congratulations!









UPS dropped mine off this afternoon from NewEgg, also an XFX. Hopefully I'll be able to start my new build tomorrow afternoon; finally have all of the main components. Just need to get some more fans and some red case lighting.









I'm also interested to see how much wattage yours pulls with one card.


----------



## wermad

Quote:


> Originally Posted by *AtomicFrost*
> 
> Congratulations!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UPS dropped mine off this afternoon from NewEgg, also an XFX. Hopefully I'll be able to start my new build tomorrow afternoon; finally have all of the main components. Just need to get some more fans and some red case lighting.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also intersted to see how much wattage your pulls with one card.
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Yup, definitely some exciting stuff coming up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> V1000 will be fine with one card easy but i am curious to see what wattage you pull
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kayan*
> 
> I'm highly interested to see how that v1000 does when you get your second card. I'm running the same PSU and love it!
> 
> 
> 
> 
> 
> Click to expand...
Click to expand...

The V1000 is acting up, so I'm hoping I can pull a few runs of 3D11. The psu kept shutting down w/ Crysis 3 and triple 290s (stock 1000mhz). I was starting to suspect it, but I talked to a few guys thinking it was probably my mb or a bad card. The symptoms do point to a psu shutdown (probably over-current protection). What's interesting is that the kill-a-watt reads well under its 1000w capability, but the unit might be defective at this point. Well, we'll see what two cores can do.

Btw, a G1600-MA is on its way. I missed out on some really awesome deals these last couple of months (a brand new v1.0 G1600 sold for $125!). But now it seems the surplus of ex-mining gear is drying up and prices are creeping up again. Rather than waiting, I just picked up a used unit for $165. More than I wanted, but the RMA'd V1000 should fetch more than what it was going for two months ago (~$80-100).


----------



## Mega Man

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Thanks for the input. I'm running some errands at the moment so I won't be able to pull the card until tonight. gonna have to leave the rad out the door panel (removed) since I can't fit the radiator right now. I'll see how one card (and a 4690k) treat the V1000 psu using the kill-a-watt).
> 
> 
> 
> I'm highly interested to see how that v1000 does when you get your second card. I'm running the same PSU and love it!
Click to expand...

it trips ( i did that ( well tried that ) )
Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AtomicFrost*
> 
> Congratulations!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UPS dropped mine off this afternoon from NewEgg, also an XFX. Hopefully I'll be able to start my new build tomorrow afternoon; finally have all of the main components. Just need to get some more fans and some red case lighting.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also intersted to see how much wattage your pulls with one card.
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Yup, definitely some exciting stuff coming up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> V1000 will be fine with one card easy but i am curious to see what wattage you pull
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kayan*
> 
> I'm highly interested to see how that v1000 does when you get your second card. I'm running the same PSU and love it!
> 
> 
> 
> 
> 
> Click to expand...
> 
> 
> 
> Click to expand...
> 
> The V1000 is acting up so I'm hoping I can pull a few runs of 3d11. The psu kept shutting down w/ Crysis 3 and triple 290s (stock 1000mhz). I was stating to suspect it, but I talked to a few guys thinking it was probably my mb or a bad card. The symptoms due point to a psu shutdown (probably over protection). What's interesting is that the kill-a-watt reads well under its 1000w capability but the unit might be defective at this point. Well, we'll see what two cores can do.
> 
> Btw G1600-MA is on its way. I missed out on some really awesome deals these last couple of months (brand new v1.0 G1600 sold for $125!). But now, its seems the surplus of ex-mining gear is drying up and prices are creeping up again. Rather then waiting, I just picked up a used unit for $165. More then what i wanted but the rma V1000 should fetch more then what it was going for two months ago (~$80-100).
Click to expand...

yea, with 3x 290s I can see you tripping a 1kW OCP


----------



## wermad

It shut down w/ one card, so that tells me it's going bad. I'm waiting for the Lepa to come in before I send it off to rma. After the benching I did a month ago, it started going. I thought it was the mb, as it was turning off frequently. Then I updated my bios and it helped; hence why I was suspecting the mb. I haven't done anything since, as I only had a G3258.

If i still have the 290s when the new psu comes in, I wanna test those before selling them to see how much power they draw from the G1600.


----------



## AtomicFrost

Quote:


> Originally Posted by *wermad*
> 
> It shut down w/ one card, so that tells me its going bad. I'm waiting for the Lepa to come in before i send it off to rma. After the benching I did a month ago, it started going. I thought it was the mb as it was turning off frequently. Then i updated my bios and it helped. Hence why I was suspecting the mb. I haven't done anything since I only had a G3258.
> 
> If i still have the 290s when the new psu comes in, I wanna test those before selling them to see how much power they draw from the G1600.


Sounds like that PSU is on its way out.

I know that with the AX1500i you have to either raise the OCP or disable it entirely, because the 295x2 pulls over 30A when under heavy load, which is actually more than it says on the box. I guess that could also be why your PSU is tripping (if its OCP is limited to 30A on the PCIe rails).

http://www.corsair.com/en-us/blog/2014/may/setting-up-ocp-on-the-ax1500i
Quote:


> When benchmarking AMD R9-295X2 graphics cards, we found that the cards can overload the preset OCP of the AX1500i and cause it to shut down. So once your PC is built and you have powered everything up, installed Windows and made sure everything is working correctly, it will be necessary to install Corsair Link if you wish to maximize the performance of your 295X2 card.
> 
> In Link, there are one of two methods that can be used. One is to check all of the OCP check boxes and turn the OCP up to the maximum of 40A.
> 
> The other method that can be used is to turn off the OCP altogether.


Hopefully the new PSU you have coming will run your 295x2 like a champ.


----------



## wermad




----------



## Mega Man

hey your avatar looks like you now


----------



## wermad

lol, Haruna? She's psycho tbh. Hehe, maybe I am tonight. Read all the warnings plastered over this beast. Since the V1000 is single-rail, no worries to plug her in. Fans are loud, especially the red one. It's blowing right at my feet. Good thing it gets chilly at night







.

Not much time tonight. Cleaning up old cell phones to put them on eBay takes priority. Don't wanna keep them any longer than I need to. I do love the 1080p screen on the Note 3. Maybe in a couple of years, a Note 4 w/ 1440







. Since all I really do is check email and ocn (and shop the pc parts sites, you'll know which ones), I don't really care for the latest and greatest cell.

Got Omega 12.14 installed. I'll test it tomorrow morning while its still cold (and before it heats up).

At one point, I was seriously thinking of sticking two stock cards in an mATX case (







).

Btw, boo at XFX for not including a mini-DisplayPort to DisplayPort adapter. I mistook the mDP-to-HDMI one for it (







). Luckily, i have a couple of spare ones







.

Edit: I do see two cores in GPU-Z, but AMD OD doesn't show two gpus and there's no xfire option? I'm thinking this is normal. I need to install AB again (hoping there's no more issues with it).


----------



## ReV2ReD

Quote:


> Originally Posted by *wermad*
> 
> Edit: do see two cores in gpu-z, amd od doesn't show two gpu and no xfire option? I'm thinking this is normal. I need to install AB again (hoping there's no more issues with it).


Yep, that's normal for a single 295x2. The Crossfire option will show up if you install another 295x2 or a 290X. With a single 295x2, you can go into CCC and turn Crossfire on and off through individual application settings. It's kind-of a pain, but not really a big deal.

Not sure of AB's past (I used EVGA Precision X when I had Nvidia cards), but I've been using it for about a month now and had no problems. It's actually much easier to use than Precision X IMO.


----------



## wermad

So far, in 3D11 Extreme, I pulled ~600-650W at the Kill-A-Watt. Factor in ~88% efficiency (jg.com has it close to platinum rating): ~525-575W DC (rounded).

Less ~100-120W for the system, and I'm at ~400-450W (under the AMD 500W figure). Keep in mind this is a synthetic bench. The V1000 held up well. Not sure why it was shutting down before, when the three 290s were pulling slightly more (~650-700W at the wall). I'm still gonna rma it. Maybe it doesn't like 290s.
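The estimate above is a simple chain: wall reading × PSU efficiency ≈ DC-side draw, minus the rest of the system ≈ card-only draw. A quick sketch, using the post's own numbers (the 88% efficiency and 100-120W system figures are the poster's estimates, not measured specs):

```python
# Wall-power estimate: Kill-A-Watt reading -> DC draw -> card-only draw.
def dc_draw(wall_watts, efficiency=0.88):
    """Approximate DC-side draw from a wall reading and PSU efficiency."""
    return wall_watts * efficiency

dc_low, dc_high = dc_draw(600), dc_draw(650)     # ~528W to ~572W DC
gpu_low, gpu_high = dc_low - 120, dc_high - 100  # roughly the ~400-450W ballpark
```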

AB recorded 99% usage on both during the demo and gpu tests. Both cores were under 70°C w/ the slider in OD set to 75°C.

If i have time today, I'll launch Crysis 3 which seems to be the standard yard-stick for power consumption.

One thing though, this thing puts out a lot of heat. I didn't think it was gonna be this much







. Poor little rad is working triple overtime w/ two cores. Fedex will be dropping off the blocks tomorrow. No eta on the 2nd card per tracking.

Quote:


> Originally Posted by *ReV2ReD*
> 
> Yep, that's normal for a single 295x2. The Crossfire option will show up if you install another 295x2 or a 290X. With a single 295x2, you can go into CCC and turn Crossfire on and off through individual application settings. It's kind-of a pain, but not really a big deal.
> 
> Not sure of AB's past (I used EVGA Precision X when I had Nvidia cards), but I've been using it for about a month now and had no problems. It's actually much easier to use than Precision X IMO.


Thanks


----------



## xer0h0ur

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nice!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes to the backplate.....they have on from the factory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im waiting for DX12 and Mantle to support the "memory pooling" idea, basically with the 295x2 as an example you'll have 8GB of usable vram and not 4GB mirrored


Mantle has had this feature since launch and no title has adopted it. The problem is that coding a game to use a varying amount of vRAM across multiple-GPU setups means it would have to handle anywhere from 1-2GB up to 32GB (4-way crossfired 8GB 290Xs) and everything in between. This has obviously been too challenging for developers, as no one has tackled it yet, and a DX12 release isn't going to change that either.


----------



## wermad

Aren't consoles doing that already?

Loving this beast! Pulled ~650W @ the kill-a-watt in the opening level of Crysis 3 (rainy ship scene). Factor in all the other stuff and I'm right at or under the 500W TDP.

My G1600 arrives next week (six rails). Do you guys recommend one rail per 8-pin, or one rail per card? I believe rails 3-6 are 30A.


----------



## kayan

What brand of block do you guys recommend? I'm thinking I may water cool this bad boy, once I get my new case.


----------



## xer0h0ur

Quote:


> Originally Posted by *wermad*
> 
> Aren't consoles doing that already?
> 
> Loving this beast! Pulled ~650w @ the kill-a-watt in the opening level of crysis 3 (raining ship scene). Factor all the other stuff and I'm right or under the 500w tdp.
> 
> My g1600 arrives next week (six rails). Do you guys recommend one rail per 8-pin or one rail per card? I believe rails 3-6 are 30amp.


If you're talking about what I was making reference to. Its not even remotely the same thing. A set amount of vRAM versus a widely varying amount of vRAM.


----------



## wermad

@ Kayan

All are way better than the stock cooler.

It comes down to preference and budget. I already had a Koolance cpu block and got my two gpu blocks on sale. For bling, HK and AC; for most accessories, EK and XSPC; best all-rounder, Koolance.


----------



## Mega Man

Quote:


> Originally Posted by *kayan*
> 
> What brand of block do you guys recommend? I'm thinking I may Waterloo this bad boy, once I get my new case.


I would stick to ek


----------



## AtomicFrost

Quote:


> Originally Posted by *wermad*
> 
> Aren't consoles doing that already?
> 
> Loving this beast! Pulled ~650w @ the kill-a-watt in the opening level of crysis 3 (raining ship scene). Factor all the other stuff and I'm right or under the 500w tdp.
> 
> My g1600 arrives next week (six rails). Do you guys recommend one rail per 8-pin or one rail per card? I believe rails 3-6 are 30amp.


I'd recommend two rails per card (one per 8pin socket). You're better off drawing about 50% capacity on two rails than pulling 100%+ on one rail.

During benchmarking, it sounds like the 295x2 can pull a bit more than 30A, which could trip the OCP on your PSU if you put it all on one rail.
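The rail advice boils down to current math at 12V. A sketch under assumed numbers (a ~500W card draw and a 30A per-rail OCP limit, neither confirmed in this thread):

```python
# 12V current math behind "one rail per 8-pin": splitting the card's
# draw across two rails keeps each rail well away from the OCP trip point.
OCP_LIMIT_A = 30.0  # assumed per-rail over-current limit

def amps_at_12v(watts):
    return watts / 12.0

card_amps = amps_at_12v(500)   # ~41.7A total: would exceed a single 30A rail
per_rail_amps = card_amps / 2  # ~20.8A per rail, comfortably under the limit
```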


----------



## RobzDragon

My new baby...soon to have a 2nd.


----------



## wermad

Quote:


> Originally Posted by *AtomicFrost*
> 
> I'd recommend two rails per card (one per 8pin socket). You're better off drawing about 50% capacity on two rails than pulling 100%+ on one rail.
> 
> During benchmarking it sounds like the 295x2 can pull a bit more than 30a, which could trip the OCP on your PSU if you put it on only one rail.


Thanks! I had a hunch this was the best approach. I totally forgot my kill-a-watt can read amperage. I pulled 650W at the wall with a stock 4690k and one stock 295x2. I'll test it again for kicks.


----------



## Orivaa

Quote:


> Originally Posted by *RobzDragon*
> 
> My new baby...soon to have a 2nd.


If you're getting a second, a new PSU would be useful.


----------



## RobzDragon

Agreed, picking up an EVGA 1600W G2, unless you guys think there is a better option? Price point is $269 shipped with a $30 MIR.


----------



## Orivaa

I've had the EVGA 1300w since I bought my own 295x2, and it has served me faithfully. But to be fair, I do not pull anywhere near 1300w with this setup.


----------



## RobzDragon

Well, I'd like some headroom to OC as well, and maybe look into other cooling solutions. Has anyone done push/pull on the stock rad? Can't imagine it would help too much, or if it's even worth the work of adding the extra fans. Also, any info on these pumps? Wonder if they could support some 240 rads? Sorry if I'm all over the place here, I'm cold and extra giddy typing on my cellphone lol


----------



## Orivaa

Most people who've done push/pull on the stock rad have reported a decent enough result. Nothing too amazing, but if you have some Corsair SP Quiets lying around (of which you can never have too many) and you don't intend to do custom cooling, you might as well slap those on.


----------



## Mega Man

Quote:


> Originally Posted by *RobzDragon*
> 
> Agreed picking up an evga 1600w g2. Unless you guys think there is a better option?. Price point is $269 shipped with a $30 mir.


No, Super Flower's Leadex is one of the best platforms out currently (the OEM platform EVGA is using for the G2/P2/T2s).


----------



## joeh4384

Quote:


> Originally Posted by *Orivaa*
> 
> Most people who's done push & pull on the stock has reported a decent enough result. Nothing too amazing, but if you have some Corsair SP Quiet lying around (Of which you can never get too many), and you don't intend to do custom cooling, you might as well slap those on.


Do you think the Quiets would be enough? My card runs in the mid 60s with the Performance Editions.


----------



## joeh4384

Quote:


> Originally Posted by *RobzDragon*
> 
> Agreed picking up an evga 1600w g2. Unless you guys think there is a better option?. Price point is $269 shipped with a $30 mir.


I have a 1300 watt g2 and it has worked flawlessly for me. I did run it for a week with a 290x/295x2 trifire setup and had no issues.


----------



## wermad

Rocking a v1000


----------



## Mega Man

In my 295x2 rig I'm rocking 2x Super Flower Leadex Platinum 1kW


----------



## wermad

Going to put in a 290 w/ the 295x2 and see what happens to the V1000. It's actually pretty rock solid right now. I don't know why it acted funky w/ the 290s.

Well, gonna put in some Z's, as I only got 10 minutes of sleep in the last 48 hours. My Z97 board went out and I'm dealing w/ a massive headache of Z87 (Sniper 5) and DC (Haswell refresh). Imho, these guys don't like each other. Spent six hours trying to install the drivers, fixing one issue after another. The Z97 was flawless with DC. Maybe this inconsistency with Z87 led to the psu shutting off?


----------



## RagingCain

While waiting for some new drivers, managed to rank 1st place in 3DMark - Firestrike:
AMD FX-8320 w/ R9 295x2, 2 GPUs
AMD FX-8350 w/ R9 295x2, 2 GPUs
AMD FX-8370 w/ R9 295x2, 2 GPUs
AMD FX-9370 w/ R9 295x2, 2 GPUs
AMD FX-9590 w/ R9 295x2, 2 GPUs

The Devil 13, I guess, would really shine under a hex-core i7 5xxx-gen CPU and at least H2O.

Air Cooled, PowerColor Devil 13 R9 290X "DualCore" GPU: 1110 MHz / 1530 MHz CPU: 5.039 GHz
http://www.3dmark.com/fs/3799617

The graphics score, though, is only 2000 points behind a highly overclocked i7 5960X. Good proof that the FX-8370 should be adequate for an R9 295x2.


----------



## Amlalsulami

Hello,

Are there any problems if I buy four R9 295X2 cards and run them together?

My rig:

Asus X99 Rampage V
i7-5960X
32GB DDR4
2x 1200W Seasonic Platinum series

Please let me know, guys, because I haven't bought any cards yet.


----------



## xer0h0ur

The most you can do is two 295X2 which is quadfire. You can't use 8 GPUs for gaming purposes.


----------



## Sploosh

Quote:


> Originally Posted by *Amlalsulami*
> 
> Hello, are there any problems if I buy four R9 295X2 cards and run them together?


Anything beyond 2x R9 295x2 is useless as it goes beyond Quadfire. Also, damn, that's an expensive rig.


----------



## Amlalsulami

Quote:


> Originally Posted by *xer0h0ur*
> 
> The most you can do is two 295X2 which is quadfire. You can't use 8 GPUs for gaming purposes.


Quote:


> Originally Posted by *Sploosh*
> 
> Anything beyond 2x R9 295x2 is useless as it goes beyond Quadfire. Also, damn, that's an expensive rig.


Okay, but the price is close to 4-way SLI GTX 980s, so I figured it would be better to get four R9 295X2s.

So can four R9 295X2s be run together or not?


----------



## axiumone

Quote:


> Originally Posted by *Amlalsulami*
> 
> Okay, but the price is close to 4-way SLI GTX 980s, so I figured it would be better to get four R9 295X2s.
> 
> So can four R9 295X2s be run together or not?


No, you cannot run four 295x2s in crossfire. The maximum you can do is two 295x2s, as that equals 4 GPUs.


----------



## wermad

If he's gonna mine, it should work. For everything else, crossfire is limited to four cores.

Dropped one of the Sapphire 290 Tri-X cards into the system, and 3D11 pulled ~850-950w. The first test pulled 900-950w and peaked at ~980w. The rest of the tests and demos (gpu) were ~850-900w. I don't bother w/ FS since I've noticed it pulls less power than 3D11. These were at the Kill-A-Watt.

Crysis 3 showed under 700w. I made sure I cooled all three cores down, but for some reason it's actually pulling less wattage with a 295x2 + 290 than with triple 290s.

Now I'm wondering if an RMA is even needed... Well, I didn't want to go this route, but it looks like I may need to get Furmark installed on my system again. Meh









Btw, those w/ blocks, how bad is the sag?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Amlalsulami*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> The most you can do is two 295X2 which is quadfire. You can't use 8 GPUs for gaming purposes.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sploosh*
> 
> Anything beyond 2x R9 295x2 is useless as it goes beyond Quadfire. Also, damn, that's an expensive rig.
> 
> Click to expand...
> 
> Okay, but the price is close to 4-way SLI GTX 980s, so I figured it would be better to get four R9 295X2s.
> 
> So can four R9 295X2s be run together or not?
Click to expand...

2 x 295x2 = quadfire.

No more than two cards.


----------



## Amlalsulami

Quote:


> Originally Posted by *axiumone*
> 
> No, you cannot run four 295x2s in crossfire. The maximum you can do is two 295x2s, as that equals 4 GPUs.


So is it better to get two 295x2s or 4-way GTX 980s?

I need some opinions.


----------



## xer0h0ur

Unless you plan on doing some sort of work that isn't 3D-gaming related, it's useless to have more than two R9 295X2s in your system. You can only crossfire up to 4 GPUs, and each 295X2 has 2 GPUs.


----------



## wermad

two 295x2 would be cheaper (well, depending on where you live tbh).


----------



## Orivaa

Quote:


> Originally Posted by *Amlalsulami*
> 
> Okay, but the price is close to 4-way SLI GTX 980s, so I figured it would be better to get four R9 295X2s.
> 
> So can four R9 295X2s be run together or not?


Nothing can run four 295x2's. One 295x2 is two 290x GPUs in a single card, but still two GPUs, and CrossFire doesn't allow for more than 4 GPUs, meaning two 295x2 max. 295x2 are for the people who want to do multi-GPU at high level, but want to save space on the motherboard. If you simply want performance, buy four 980's. Or wait for AMD's new GPUs and buy four of those.
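The quadfire arithmetic in the posts above is simple enough to sketch. Here's a tiny Python check (a hypothetical helper for illustration, not any AMD tool or API) that mirrors the 4-GPU CrossFire ceiling:

```python
# CrossFire tops out at 4 GPUs total for gaming; an R9 295X2 carries
# two GPUs per card, so two cards is the most a gaming rig can use.
CROSSFIRE_MAX_GPUS = 4

def max_cards(gpus_per_card: int) -> int:
    """Most cards of a given type that CrossFire can use for gaming."""
    return CROSSFIRE_MAX_GPUS // gpus_per_card

print(max_cards(2))  # R9 295X2 (dual GPU) -> 2 cards
print(max_cards(1))  # R9 290X (single GPU) -> 4 cards
```

Anything past that count is only useful for non-gaming workloads like mining, where the CrossFire limit doesn't apply.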


----------



## Amlalsulami

Quote:


> Originally Posted by *xer0h0ur*
> 
> Unless you plan on doing some sort of work that isn't 3D-gaming related, it's useless to have more than two R9 295X2s in your system. You can only crossfire up to 4 GPUs, and each 295X2 has 2 GPUs.


Quote:


> Originally Posted by *wermad*
> 
> two 295x2 would be cheaper (well, depending on where you live tbh).


Quote:


> Originally Posted by *Orivaa*
> 
> Nothing can run four 295x2's. One 295x2 is two 290x GPUs in a single card, but still two GPUs, and CrossFire doesn't allow for more than 4 GPUs, meaning two 295x2 max. 295x2 are for the people who want to do multi-GPU at high level, but want to save space on the motherboard. If you simply want performance, buy four 980's. Or wait for AMD's new GPUs and buy four of those.


So that means 4-way GTX 980 is better than two 295x2s, right?


----------



## wermad

Quote:


> Originally Posted by *Amlalsulami*
> 
> So that means 4-way GTX 980 is better than two 295x2s, right?


Not necessarily, unless you're all about benchmarks. 4-way SLI has never been any good in real-world situations.

*I think we need to know: what is it that you will be doing with this PC?*

Quote:


> Originally Posted by *Orivaa*
> 
> Nothing can run four 295x2's. One 295x2 is two 290x GPUs in a single card, but still two GPUs, and CrossFire doesn't allow for more than 4 GPUs, meaning two 295x2 max. 295x2 are for the people who want to do multi-GPU at high level, but want to save space on the motherboard. If you simply want performance, buy four 980's. Or wait for AMD's new GPUs and buy four of those.


If you're mining, most 4-way motherboards will work. These mining rigs can get crazy.



I think he's confusing the 295x2 with the 290x (since he's also contemplating quad 980s).


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> Nothing can run four 295x2's. One 295x2 is two 290x GPUs in a single card, but still two GPUs, and CrossFire doesn't allow for more than 4 GPUs, meaning two 295x2 max. 295x2 are for the people who want to do multi-GPU at high level, but want to save space on the motherboard. If you simply want performance, buy four 980's. Or wait for AMD's new GPUs and buy four of those.


Well honestly if he's going for 4K performance I would be looking at quad 8GB 290X's over 980's any day. The only reason to wait imo would be if he's interested in GM200 or Fiji XT.


----------



## xer0h0ur

Quote:


> Originally Posted by *Amlalsulami*
> 
> So that means 4-way GTX 980 is better than two 295x2s, right?


What is the purpose of this setup? Are you gaming? Are you doing scientific work? Are you trying to push 4K gaming? You build depending on your use really.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well honestly if he's going for 4K performance I would be looking at quad 8GB 290X's over 980's any day. The only reason to wait imo would be if he's interested in GM200 or Fiji XT.


Good point, forgot about those. If he wants to push 4K, four of the 8GB's would indeed be better than the 980's.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> What is the purpose of this setup? Are you gaming? Are you doing scientific work? Are you trying to push 4K gaming? You build depending on your use really.


Yes, this will help us help you decide.


----------



## Amlalsulami

Quote:


> Originally Posted by *xer0h0ur*
> 
> What is the purpose of this setup? Are you gaming? Are you doing scientific work? Are you trying to push 4K gaming? You build depending on your use really.


For gaming on three 2K displays.


----------



## wermad

Quote:


> Originally Posted by *Amlalsulami*
> 
> For gaming on three 2K displays.


2x 295x2, or 3x GTX 980, or 3x 290X 8GB. 4-way SLI is very poor compared to crossfire (which is crap in most games but still good in some). You may want to wait, as new GPUs will be coming out from both sides soon.


----------



## Amlalsulami

Quote:


> Originally Posted by *wermad*
> 
> 2x 295x2, or 3x GTX 980, or 3x 290X 8GB. 4-way SLI is very poor compared to crossfire (which is crap in most games but still good in some). You may want to wait, as new GPUs will be coming out from both sides soon.


When will they be released?


----------



## xer0h0ur

For a multi-display setup operating past 1080p I would say grab yourself some 8GB 290X's. The 1440p and 4K performance is very good and the 8GB of vRAM will serve you very well when multi-monitor gaming.

vRAM gets gobbled up fast with multi-monitor setups and particularly so when you're pushing higher resolutions.


----------



## wermad

Quote:


> Originally Posted by *Amlalsulami*
> 
> When will they be released?


GM200 pretty soon, probably after March when more and more high-end G-Sync monitors come out. Pirate Islands (i.e. the R9 390X) is speculated for the second half of the year, though there's still nothing specific. Right now, I wouldn't touch the GTX 980, as it may drop in price when GM200 shows. You don't want another GTX 780 -> GTX 780 Ti po'd crowd (google: hitler finds out gtx 780 ti)



http://www.pcgameshardware.de/Grafikkarten-Grafikkarte-97980/Specials/Roadmap-Grafikkarten-Liste-Radeon-Geforce-1128937/


----------



## wermad

Quote:


> Originally Posted by *wermad*
> 
> Btw, those w/ blocks, how bad is the sag?


Anyone?

Sorry, my question got lost in the current shuffle


----------



## xer0h0ur

Fiji XT, also known as the R9 390X, is on the horizon. Information points to it being weeks away from being ready to launch, and it's suspected it could launch next month (most rumors say April, though). GM200, which is Nvidia's top Maxwell-generation die, should follow fairly soon after.


----------



## xer0h0ur

Quote:


> Originally Posted by *wermad*
> 
> Anyone?
> 
> Sorry, my question got lost in the current shuffle


I don't know which block you're talking about but the EK block and backplate on my 295X2 does make it quite heavy. I never weighed it outright to know how heavy but it was enough weight that when I left it unsupported for about 6 months it ended up causing me problems with my motherboard giving me false 6-beep error codes signifying video card failure. Ever since I bought a powercolor power jack to support the weight I rarely if ever get those boot problems. The sag isn't really evident since the block and backplate stiffen the hell out of the PCB so all the strain ends up on the PCI-E slot.


----------



## fishingfanatic

I'm curious to find out how the D13 does with the 5960X myself. Looking at getting a pair of single universal AMD blocks to try it on water. Haven't heard back from PowerColor though.

Has anyone tried that per chance?









I'm still playing around with the 4960X atm, new TIM, and a KPE 980. I'll likely try the D13 with both the 4960X and the 5960X b4 I get my block for the KPE.

FF


----------



## fishingfanatic

I don't see my name in the owner's list...

Maybe I didn't submit the form...









FF


----------



## xer0h0ur

You're probably going to have to PM the OP since I doubt he keeps up with this thread anymore.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know which block you're talking about but the EK block and backplate on my 295X2 does make it quite heavy. I never weighed it outright to know how heavy but it was enough weight that when I left it unsupported for about 6 months it ended up causing me problems with my motherboard giving me false 6-beep error codes signifying video card failure. Ever since I bought a powercolor power jack to support the weight I rarely if ever get those boot problems. The sag isn't really evident since the block and backplate stiffen the hell out of the PCB so all the strain ends up on the PCI-E slot.


Thank you.

I had this concern in the back of my mind. I didn't expect to keep this mb and case with xfire 295x2s; I had envisioned a horizontal mATX setup. Though the TT X9 does give me a low-cost horizontal layout, and I can squeeze in my rads. Got my thinking hat on for this one.

Edit: forgot to mention they're Koolance blocks. I had dual-core cards before and sag was always an annoyance. Thinking about it these last few hours, I'm starting to lean toward switching cases.

The Koolance block (sans port pieces) came in @ 1.04kg (2lb 4.6oz)... heavy. My old Koolance 690s and EK 590s weren't that heavy. I'm starting to think I may need to ditch my lovely 900D since I'm keeping the stock backplate.


----------



## steezebe

Quote:


> Originally Posted by *steezebe*
> 
> whelp ordered the EK block and backplate... single-slot dual-gpu here i come!


As it turns out, I ordered it from FrozenCPU... which closed the day after I ordered. What a bummer. I'll just consider my $320 temporarily misplaced...

I can handle the throttling a little longer. Perhaps. Maybe. Guh.


----------



## wermad

File a PayPal claim.


----------



## xer0h0ur

I thought they weren't closed after all and just running a skeleton crew fulfilling orders.


----------



## steezebe

Quote:


> Originally Posted by *wermad*
> 
> File a PayPal claim.


Didn't use PayPal
Quote:


> Originally Posted by *xer0h0ur*
> 
> I thought they weren't closed after all and just running a skeleton crew fulfilling orders.


I've been trying to call them, but the phone number is still disconnected. I wouldn't expect anything from them for at least a month.


----------



## remnant

Sooo, when the 3xx series comes out, do all of you who own these 295x2s plan on buying the newer cards? The 390X, or even a 395X2 (if they plan on releasing one)?









I ask 'cause y'all know the only way my broke *** is getting a 295x2 is when people start selling used ones.


----------



## RagingCain

Quote:


> Originally Posted by *remnant*
> 
> Sooo, when the 3xx series comes out, do you all who own these 295x2 s plan on buying the newer cards? 390x or even the 395x2 (if they plan on releasing on)?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I ask cause ya'll know the only way my broke *** is getting a 295x2 is when people start selling used ones.


Devil 13 R9 390X II "Dual Core"


----------



## remnant

Quote:


> Originally Posted by *RagingCain*
> 
> Devil 13 R9 390X II "Dual Core"


If I had money I would love to do a Devil May Cry themed computer + case mod with one of these and a Devil's Canyon i7. etc


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> I thought they weren't closed after all and just running a skeleton crew fulfilling orders.


They are closed. Unfortunately, the thread w/ this info got closed because of the 98% crap content, but it looks like it's been re-opened:

http://www.overclock.net/t/1540656/official-frozencpu-shuts-its-doors

Quote:


> Originally Posted by *steezebe*
> 
> Didn't use PayPal
> I've
> been trying to call them, but the phone number is still disconnected. I wouldn't expect anything from them for at least a month.


At this point, there's no one there. The company is no longer operational. You need to get your money back. If you used a card, contact that bank and get your money back. In hindsight, PayPal is much easier to work with online than dealing directly w/ traditional banks (imho).


----------



## Bludge

Hey guys, what sort of power draw am I looking at with a 295X2 and a 290X? I am currently using a Corsair 1200, but I do have an Enermax MaxRevo 1500 available.

Sent from my LG-D802T using Tapatalk


----------



## RagingCain

Quote:


> Originally Posted by *Bludge*
> 
> Hey guys, what sort of power draw am I looking at with a 295X2 and a 290X? I am currently using a corsair 1200, but I do have a enemax maxrevo 1500 available
> 
> Sent from my LG-D802T using Tapatalk


I would guess 900~1000 Watts total.


----------



## steezebe

Quote:


> Originally Posted by *wermad*
> 
> They are closed. Unfortunately, the thread w/ this info got closed because of the 98% crap content, but it looks like it's been re-opened:
> 
> http://www.overclock.net/t/1540656/official-frozencpu-shuts-its-doors
> At this point, there's no one there. The company is no longer operational. You need to get your money back. If you used a card, contact that bank and get your money back. In hindsight, PayPal is much easier to work with online than dealing directly w/ traditional banks (imho).


Usually I go the PayPal route, but I trusted FCPU, so I didn't think it was necessary. Amex (3% cash back on online purchases) will take care of me; I'm not concerned. I'm just griping about having to deal with my VRMs thermal throttling for longer.


----------



## wermad

Quote:


> Originally Posted by *Bludge*
> 
> Hey guys, what sort of power draw am I looking at with a 295X2 and a 290X? I am currently using a corsair 1200, but I do have a enemax maxrevo 1500 available
> 
> Sent from my LG-D802T using Tapatalk


I pulled 950w in 3DMark11 @ the wall with the 295x2 + 290 (factory oc). Interestingly, in Crysis 3 it was pulling less than the triple 290s.

1kw if you're not going to oc much. 1200-1350w if you plan to bump up the clocks. 1500w if you're gonna add a second 295x2.

Quote:


> Originally Posted by *steezebe*
> 
> Usually I go the PayPal route, but I trusted FCPU, so I didn't think it was necessary. Amex (3% cash back on online purchases) will take care of me; I'm not concerned. I'm just griping about having to deal with my VRMs thermal throttling for longer.


Gotcha. They don't count the transaction via PayPal (Amex linked to PayPal)?


----------



## Mega Man

Quote:


> Originally Posted by *remnant*
> 
> Sooo, when the 3xx series comes out, do you all who own these 295x2 s plan on buying the newer cards? 390x or even the 395x2 (if they plan on releasing on)?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I ask cause ya'll know the only way my broke *** is getting a 295x2 is when people start selling used ones.


I do. But I am one of the few people who keep their old stuff
Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I thought they weren't closed after all and just running a skeleton crew fulfilling orders.
> 
> 
> 
> They are closed. Unfortunately, the thread w/ this info got closed because of the 98% crap content, but it looks like it's been re-opened:
> 
> http://www.overclock.net/t/1540656/official-frozencpu-shuts-its-doors
> 
> Quote:
> 
> 
> 
> Originally Posted by *steezebe*
> 
> Didn't use PayPal
> I've
> been trying to call them, but the phone number is still disconnected. I wouldn't expect anything from them for at least a month.
> 
> Click to expand...
> 
> At this point, there's no one there. The company is no longer operational. You need to get your money back. If you used a card, contact that bank and get your money back. In hindsight, PayPal is much easier to work with online than dealing directly w/ traditional banks (imho).
Click to expand...

According to what Mark just wrote on the website, FCPU will be reopening. If that is true, we shall see.


----------



## remnant

Quote:


> Originally Posted by *Mega Man*
> 
> I do. But I am one of the few people who keep their old stuff


but ... but ... I want


----------



## F4ze0ne

Quote:


> Originally Posted by *RagingCain*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bludge*
> 
> Hey guys, what sort of power draw am I looking at with a 295X2 and a 290X? I am currently using a corsair 1200, but I do have a enemax maxrevo 1500 available
> 
> Sent from my LG-D802T using Tapatalk
> 
> 
> 
> I would guess 900~1000 Watts total.
Click to expand...

My kill-a-watts showed between 950w-1000w while running Valley benchmark at *stock clocks*.


----------



## steezebe

Quote:


> Originally Posted by *wermad*
> 
> Gotcha
> 
> 
> 
> 
> 
> 
> 
> . They don't count the transaction via paypal (amex linked to paypal?)?


honestly that hasn't occurred to me; I'll look into it. good idea!

Quote:


> Originally Posted by *F4ze0ne*
> 
> My kill-a-watts showed between 950w-1000w while running Valley benchmark at *stock clocks*.


I'm running a Seasonic Platinum 1000, and even under full load with SolidWorks renders, Firestrike benches, and games, the PSU fan doesn't even kick on (no, it's not broken; the fan just stays off unless it gets warm, and it was designed that way).

Basically f(your results) = may vary.


----------



## F4ze0ne

Quote:


> Originally Posted by *steezebe*
> 
> I'm running a Seasonic Platinum 1000, and even under full loads with solidworks renders, firestrike benches and games, the PSU fan doesn't even kick on (no, it's not broken, the fan just stays off unless it gets warm - it was designed that way)
> 
> Basically f(your results) = may vary.


Results can vary of course. But, I'm offering perspective of what I saw when I tested my cards.


----------



## Bludge

Thanks for the feedback about the wattage required, guys. Contemplating adding a 295X2 rather than waiting for the 300 series... I hate being in the lull period between releases.


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> According to what Mark just wrote on the website, FCPU will be reopening. If that is true, we shall see.


Just saw the last post by Enterprise. Well, I'm hoping Mark can get his stuff straight and raise fcpu from the grave.

Quote:


> Originally Posted by *steezebe*
> 
> honestly that hasn't occurred to me; I'll look into it. good idea!
> Basically f(your results) = may vary.


I do get points using my Amazon/Chase card. Then I redeem those via Amazon (whose pc water-cooling parts selection is decreasing, though AIOs are increasing).
Quote:


> Originally Posted by *Bludge*
> 
> Thanks for your feedback about the wattage required guys, contemplating adding a 295X2 rather than wait for 300 series....I hate being in the lull period between releases


This card is a monster. I'm seriously thinking of switching cases once both are blocked and plumbed into the loop. Thinking of a Thermaltake X9, since it has a horizontal layout and it's not terribly expensive (vs a CaseLabs).


----------



## Bludge

Quote:


> Originally Posted by *wermad*
> 
> This card is a monster. I'm seriously thinking of switching cases once both are blocked and plumbed to the loop. Thinking of a Thermaltake X9 since its horizontal layout and its not terribly expensive (vs a CL).


You've been reading my mind; I'm thinking along the same route as well. I prefer horizontal, and I'm too cheap for CaseLabs. Currently running a 420+360 in an Enthoo Primo; the X9 will allow me to add another 420 and be tidier for separate CPU/GPU loops.


----------



## wermad

The Koolance block, sans hardware and ports, came in a shy over a kilo! Add the hardware, water, and the pcb, and this thing is gonna be a porker. One member already had issues w/ the strain the weight of a block (and aftermarket backplate) can put on a board. I'm just waiting for free shipping to be offered again and I'll pick up an X9. I have three Monsta 480s, which should be more than enough. I do hate getting rid of my 900D, but the cards are more expensive and take priority.


----------



## ramos29

I was away on duty and didn't visit this website for a while.
Have any of you tried Catalyst 15.2? It's designed for Windows 10, but many have tried it on Windows 8. There are crashes for many and CCC can't be opened, while for others everything is OK except there's no DSR.
There's a new feature: an fps cap.

There is a download link, but it is dead.


----------



## wermad

I'm still on Omega 14.12 since I was having trouble w/ my mb and cpu. After a clean slate (reformat), I've been riding 14.12 with no concerns.

#2 showed up, along with my Samsung 4K monitor.


----------



## remnant

Quote:


> Originally Posted by *wermad*
> 
> I'm still on Omega 14.12 since I was having trouble w/ my mb and cpu. After a clean slate (reformat), I've been riding 14.12 with no concerns.
> 
> #2 showed up, along with my Samsung 4K monitor.


So much hate and envy


----------



## xer0h0ur

Dear lord why didn't you wait for the Freesync version of the same monitor? I was mad at myself for not waiting for it when I ordered mine a couple months ago.


----------



## wermad

Nah, I'm tired of waiting for stuff to come out, then waiting for the bugs to get worked out, and only then enjoying it. I prefer tech that's been out for a while and sorted. I really don't care for fluff like FreeSync/G-Sync/Mantle/120Hz/3D etc. I'm just a plain ol' gamer.

Btw, I bought it slightly preowned and for much less than retail.


----------



## xer0h0ur

Freesync is plug and play. There is nothing to iron out about it. Either way I suppose I am about to do the same as your seller and get rid of mine for the Freesync equivalent.


----------



## axiumone

Quote:


> Originally Posted by *xer0h0ur*
> 
> Freesync is plug and play. There is nothing to iron out about it. Either way I suppose I am about to do the same as your seller and get rid of mine for the Freesync equivalent.


That may not be entirely true. I promise there will be some teething issues. Two things almost guaranteed to have issues with the first FreeSync drivers and monitors are multi-monitor and multi-GPU setups.

That's just my two cents. I sincerely hope I'm wrong, but judging by historical evidence, there's a high probability I'm right.


----------



## xer0h0ur

I didn't think about multi-monitor setups since I have yet to tread those waters. It would suck if there are issues with multi-gpu setups.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> I didn't think about multi-monitor setups since I have yet to tread those waters. It would suck if there are issues with multi-gpu setups.


It's fun, but it's gotta be done right or you'll run into issues. I loved my 5x1 6000x1920 setup (60Hz, IPS, DisplayPort).


----------



## Elmy

A little video I made over the weekend showing my build, called White Lightning, with 2 Club3D 295X2s playing BF4 @ 5400x1920, 120Hz, 1ms.

Please subscribe; it would be greatly appreciated.


----------



## Orivaa

Anyone else having Hearthstone BSOD'ing or completely freezing your PC?


----------



## Rx4speed

I realize some of the info I'm seeking is probably in this thread. I've read 192 pages, but still have a ways to go. I bought a used 295x2 and so far am loving it, although Assetto Corsa with it sucks, as the game has serious issues with multiple GPUs. I'm running stock speeds and so far no throttling, although in THIEF @ 5760x1200 or xxxx x 2600 (single screen), both GPUs stay near 100% utilization (which is good) and GPU #2 will get to 74°C.

I have a Define R4 with the 295x2 radiator mounted in the rear exhaust, with 2 140mm fans pulling air into the front of the case and one 140mm exhausting out the top. I could add another 140mm exhaust on top and a 120 on the bottom of the case, but so far, there is no need.

I had 2 Noctua NF-F12s lying around, so I installed one, in push, to replace the stock radiator fan, but the temps were worse. With the 2 NF-F12s in push/pull, the temps were about the same as the stock fan in push config. The Noctuas are 4-pin PWM and I used an adapter, but I could never get the 295x2's fan wire to even turn them on. Had to hook them up to my case's fan controller, and I want to avoid manual operation. FYI, the top HDD cage is removed from my case, so the top intake is blowing across the 295x2. I also run an i5-4670K at stock 3.4/3.8 turbo speeds, 8GB of 2133 G.Skill 9-11-11-28 (stock XMP), and an EVGA 850 G2 PSU, which seems to be handling everything just fine.

Problem is, that in the summer (northern KY), my AC struggles to keep my house at 70F, normally its around 72, and my office can get a little hot in the summer with the west sun hitting it. So, I'm worried about throttling when summer rolls around.

I figure I have two options.

1) Buy two Corsair SP120s (3-pin) and use them in push/pull, because from what I've read this seems to be about the best air-cooling solution. I'm sure some Scythe AP-45s would rock it out, but those are hard to find, expensive, and loud.

2) Waterblock. I see a Koolance and an EK nickel, both with EK backplates, on the Bay right now, but I've never watercooled anything and have no idea what else I would need to complete this (pump, reservoir, tubing, fittings), especially which pump and radiators to buy.

Money is not the issue, so I'm just looking for the best all-around setup that will run worry-free in the summer months. And if I watercool, should I go ahead and cool my CPU too? Right now it's on a Cooler Master EVO 212 (I think), with a Noctua fan.

Thanks in advance, fellas!


----------



## wermad

Go with water, unless you wanna throw some screaming 3k rpm fans on there.

Got my G1600; set up one card on rails 3 & 4 and the 2nd card on rails 5 & 6. Ran 3D11 and pulled ~1200-1250w @ the Kill-A-Watt. Factor in efficiency (~1050-1090w at the psu), less ~150w for the system, and that brings us to ~900-940w for both cards. AB showed the primary card at ~50°C on both cores and the bottom card in the high 40s on both cores. Both are set to 75°C in OD, open case, and the cpu is still on water. It's a great little cooler, but it's ~19°C ambient right now. I'm sure this will skyrocket during the hot SoCal summers.

The new case is going to be shipped from across the country. Sucks, I was hoping it would ship from SoCal. Oh well, guess no custom water till next week.
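The wall-to-GPU arithmetic above can be sketched in a few lines of Python. Note the assumptions: the ~87% efficiency figure is a rough guess for a Gold-class unit around that load, not a measured spec, and the ~150w system draw is the poster's own estimate:

```python
# Back-of-envelope estimate of GPU DC draw from a Kill-A-Watt reading.
# psu_efficiency (~87%) and system_w (~150w) are assumed figures, not
# measurements; swap in your own PSU's efficiency curve if you know it.
def gpu_draw_watts(wall_w: float, psu_efficiency: float = 0.87,
                   system_w: float = 150.0) -> float:
    """DC watts left for the graphics cards after the rest of the system."""
    return wall_w * psu_efficiency - system_w

for wall in (1200, 1250):
    print(f"{wall}w at the wall -> ~{gpu_draw_watts(wall):.0f}w for the cards")
```

Running it over the 1200-1250w wall readings lands in the same ~900-940w range quoted above.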


----------



## steezebe

Quote:


> Originally Posted by *Rx4speed*
> 
> ...
> 
> I had 2 Noctua NF-F12s lying around, so I installed one, in push, to replace the stock radiator fan, but the temps were worse. With the 2 NF-F12s in push/pull, the temps were about the same as the stock fan in push config. The Noctuas are 4-pin PWM and I used an adapter, but I could never get the 295x2's fan wire to even turn them on. Had to hook them up to my case's fan controller, and I want to avoid manual operation. FYI, the top HDD cage is removed from my case, so the top intake is blowing across the 295x2. I also run an i5-4670K at stock 3.4/3.8 turbo speeds, 8GB of 2133 G.Skill 9-11-11-28 (stock XMP), and an EVGA 850 G2 PSU, which seems to be handling everything just fine.
> 
> ...


You're not the first to experience that with push/pull using Noctuas; it's been mentioned in this thread almost exactly.

As my recommended alternative, I prefer my BitFenix Spectre PWMs, as they are much more effective at pushing and pulling air through a dense radiator. They're silent at 500rpm when idle, and at 1800rpm I can't hear them through my headphones, even though they throw off the radiator heat like crazy. It may help (e.g. save a few hundred $$$) if you don't want to go full water.


----------



## xer0h0ur

Yeah I had to go with at least the 2000 RPM iPPC NF-F12's in push/pull to get a decent temp drop while still using the original Asetek cooler but that didn't last long anyways before I said screw it and slapped the EK block on it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I had to go with at least the 2000 RPM iPPC NF-F12's in push/pull to get a decent temp drop while still using the original Asetek cooler but that didn't last long anyways before I said screw it and slapped the EK block on it.


Eh?

I'm using a single Industrial NF-F12 on mine and it's handling temps fine: 2000rpm for gaming and 2500 for hot days.

3000rpm is reserved for benching (the damn thing gets way too loud at that point).


----------



## xer0h0ur

Yeah, I didn't want to use the 3000 RPM iPPC NF-F12's either. The 2000 RPM version never even spins up to 2000 RPM anyway, and at 100% speed it's still a tolerable noise level. I usually never push the fans past 90% while gaming. At one point my fans were making odd noises at 100%, but that went away on its own. I was going to re-lube the fans with DuPont's Teflon multi-use lubricant. I love that stuff.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I didn't want to use the 3000 RPM iPPC NF-F12's either. The 2000 RPM version never even spins up to 2000 RPM anyways and at 100% speed its still a tolerable noise level. I usually never end up pushing the fans past 90% anyways while gaming. At one point my fans were making odd noises when being run at 100% but that went away on its own. I was going to re-lube the fans with dupont's teflon multi-use lubricant. I love that stuff.


I've noticed at 2k rpm they have a weird hum (I have 6 of them), but at 2200 or 1800rpm they are very tolerable.

I've done some push/pull testing and found I get the same temps with a single fan at 3k rpm in push as I do with push/pull at 2k rpm... interesting.


----------



## ryno9696

Hello all. I built a new system today and am having a problem I've never encountered before. Here are the relevant specs:
i7-5930k
ASrock x99 Extreme4
(2) XFX 295x2
1600w EVGA G2 PSU

One of the cards isn't working and has this strange error in Device Manager: "This device cannot find enough free resources that it can use. (Code 12) If you want to use this device, you will need to disable one of the other devices on this system."

I'm pretty positive I've got all the power connections correct: two 8-pin connectors per card, plus 4-pin power to the motherboard's PCIe connector. I've got the cards in slots 1 and 3 like the manual says. I tried disabling other onboard devices, but that didn't help.

Anyone have any ideas? What can I try to isolate the issue?


----------



## Rx4speed

So Performance PCs has the Koolance block for $108, and you can throw an EK backplate on it for another $40. Do you have a recommendation for a radiator and pump/res?


----------



## wermad

Quote:


> Originally Posted by *ryno9696*
> 
> Hello all. I built a new system today, and am having a problem I've never encountered before. Here's the relevant specs:
> i7-5930k
> ASrock x99 Extreme4
> (2) XFX 295x2
> 1600w EVGA G2 PSU
> 
> One of the cards isn't working and has this strange error in Device Manager: "This device cannot find enough free resources that it can use. (Code 12) If you want to use this device, you will need to disable one of the other devices on this system."
> 
> I'm pretty positive I've got all the power connections correct- 2 8pin connectors per card, 4 pin power to MB-PCIE connector. I've got the cards in slots 1 and 3 like the manual says. I tried disabling other onboard devices but that didn't help.
> 
> Anyone have any ideas? What can I try to isolate the issue?


Try running off just the card in question. Does GPU-Z see all four cores?
Quote:


> Originally Posted by *Rx4speed*
> 
> So Performance PCs has the Koolance for $108 and throw a EK backplate on it for another $40. Do you have a recommendation for a radiator and pump/res?


You'll just have to find longer screws to fit the thicker EK plate. I picked up two blocks at this price a couple of weeks ago and I've decided to keep the stock backplates. From the instructions, they do give you long enough screws for this. I think they're M2.6 screws, but I'm not sure on the length of the replacement screws for the EK backplate.


----------



## ryno9696

Quote:


> Originally Posted by *ryno9696*
> 
> Hello all. I built a new system today, and am having a problem I've never encountered before. Here's the relevant specs:
> i7-5930k
> ASrock x99 Extreme4
> (2) XFX 295x2
> 1600w EVGA G2 PSU
> 
> One of the cards isn't working and has this strange error in Device Manager: "This device cannot find enough free resources that it can use. (Code 12) If you want to use this device, you will need to disable one of the other devices on this system."


I managed to get it working by fiddling with BIOS settings. The BIOS on this board is frustrating: changing this setting means I cannot get back into the BIOS without clearing the CMOS; all I get is a black screen afterward. I can't remember the exact name of the setting, and it is conspicuously missing from the manual. It's under Advanced -> Chipset and relates to 64-bit addressing of PCIe devices. Enabling it allowed me to use both cards and nearly doubled my framerate in benchmark tests.
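For anyone curious why that setting matters: with four GPUs, the memory apertures (BARs) the cards request can exceed the MMIO window available below 4 GB, and 64-bit ("above 4G") decoding lets the BIOS place them higher. A purely illustrative sketch of the arithmetic (every size here is an assumption, not read from these cards):

```python
# Illustrative only: do four GPUs' BARs fit in a 32-bit MMIO window?
MMIO_WINDOW_MB = 1024      # assumed space left below 4 GB for devices
BAR_PER_GPU_MB = 256 + 16  # assumed framebuffer aperture + register BAR
OTHER_DEVICES_MB = 128     # assumed USB/NIC/audio/etc.
GPUS = 4                   # two 295X2s = four GPUs

needed_mb = GPUS * BAR_PER_GPU_MB + OTHER_DEVICES_MB
print(needed_mb, needed_mb <= MMIO_WINDOW_MB)  # doesn't fit -> "Code 12"
```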


----------



## Rx4speed

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I didn't want to use the 3000 RPM iPPC NF-F12's either. The 2000 RPM version never even spins up to 2000 RPM anyways and at 100% speed its still a tolerable noise level. I usually never end up pushing the fans past 90% anyways while gaming. At one point my fans were making odd noises when being run at 100% but that went away on its own. I was going to re-lube the fans with dupont's teflon multi-use lubricant. I love that stuff.


Thanks very much for your suggestions. When you said screw it and put a block on the card, which radiator, fans and pump/reservoir did you go with? And when you were still using the stock radiator with the 2000 RPM Noctua NF-F12's, did you use the 3-pin or 4-pin PWM versions, and did you run them off the card or off a fan controller? Thanks.


----------



## xer0h0ur

I actually still need to revise my water cooling loop, since it's being overpowered by the heat. In its current state it's barely enough if I limit the FPS in games, but not otherwise. I would say you want no less than a 360mm radiator per 295X2 (assuming you plan on overclocking; otherwise 240mm is the bare minimum). If you plan on adding the CPU to the loop, then you want at least a 480, dual 240s, or whatever combination gives you 480mm. I like Alphacool radiators myself: the 60mm-thick UT60 or the 80mm-thick Monsta. As for pump/reservoir, I will likely end up going with a combination EK pump/res like the single 5.25" drive bay EK-SBAY, or maybe the dual-pump version; either gives you PWM control of the pump(s). My case is small, so I am going to have to mount the extra radiator externally with a Koolance mounting bracket.

As for your question about the fans, I am currently using the 4-pin PWM fans with my system's built-in controllers. I have been avoiding buying a fan controller, but adding the 360mm radiator and another 6 fans means I will have to use the 3-pin version now, along with one of these suckers: http://www.amazon.com/Phanteks-PWM-Fan-Controller-PH-PWHUB_01/dp/B00M0R05WE/ref=sr_1_1?s=electronics&ie=UTF8&qid=1424970997&sr=1-1&keywords=Phanteks+PWM+Fan+Hub+Controller
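The radiator sizing advice above follows a common rule of thumb; here's a hedged sketch of the arithmetic (the ~100 W per 120 mm figure and both heat loads are assumptions, not measurements):

```python
# Rough radiator sizing: assume ~100 W dissipated comfortably per 120 mm
# of radiator at moderate fan speeds. All figures are rule-of-thumb guesses.
W_PER_120MM = 100
gpu_heat_w = 500   # one 295X2 under load, rough
cpu_heat_w = 150   # overclocked CPU, rough

needed_mm = (gpu_heat_w + cpu_heat_w) / W_PER_120MM * 120
print(needed_mm)  # 780.0 -> more than a 480 + 240, in line with the post
```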


----------



## Elmy

I can't remember where, but someone tested temps with the EK backplate versus the stock backplate, and stock ran cooler. I can probably find it if you're really interested.


----------



## xer0h0ur

Quote:


> Originally Posted by *Elmy*
> 
> I can't remember where. But someone tested temps with EK backplate and stock backplate and stock ran cooler. I can find it mb if your really interested.


Personally I use the EK backplate, partly because it matches the block but more so because it completely covers the back of the card. That backplate has already saved me twice from coolant leaks. So unless the temperature difference is drastic, I wouldn't change back to the original plate. Just my 2 cents.


----------



## hellojustinr

Hi I just got my 295X2, got a great deal, love the aesthetics.

One issue I'm having is with my triple monitor setup: two DVIs and one HDMI. This card's unconventional output layout has me wondering what route to take.

I thought it would be as simple as getting two DP adapters (one HDMI and one DVI) and calling it a day, but apparently, reading through some reviews online, these passive DP adapters will not work.

Does anyone know how to set up three monitors on this (without having to buy two expensive 'active' DP adapters)? Already low on budget as it is, and I worked really hard to get this.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hellojustinr*
> 
> Hi I just got my 295X2, got a great deal, love the aesthetics.
> 
> One issue I'm having is with the triple monitor set up I have, it's two DVIs and one HDMI. This card's unconventional setup has me wondering what route to take.
> 
> I thought it would just be as simple as getting two DP adapters (one HDMI and one DVI) and calling it a day but apparently, reading online through some reviews these 'passive DP adapters will not work'
> 
> Does anyone know how to set up three monitors on this? (without having to buy two expensive 'active' DP adapters). Already low on budget as it is and worked really hard to get this.


I'm running 3 monitors via DVI, MiniDP to HDMI, and MiniDP to DVI (both passive adaptors).

If you want more than three monitors then you need active adaptors, but 3 monitors + Eyefinity will work the way I have mine set up.


----------



## hellojustinr

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm running 3 monitors via DVI/MiniDP to HDMI and MiniDP to DVI (both passive adaptors)
> 
> If you want more than three monitors then you need active adaptors but 3 monitors + eyefinity will work the way i have mine setup


Wow, that's a really quick reply!!!

I'll buy two passive adapters tonight; that's exactly how I want mine set up!
Can't wait to see this baby in action!

Thank you so much


----------



## Sgt Bilko

Quote:


> Originally Posted by *hellojustinr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'm running 3 monitors via DVI/MiniDP to HDMI and MiniDP to DVI (both passive adaptors)
> 
> If you want more than three monitors then you need active adaptors but 3 monitors + eyefinity will work the way i have mine setup
> 
> 
> 
> Wow that's really quick reply!!!
> 
> I'll buy two pasive adapters tonight, that's exactly how I want to have mine set up!
> Can't wait to see this baby in action!
> 
> Thank you so much
> 
> 

You're welcome


----------



## DeathDealer592

Hi, I recently got my 295x2 and the "RADEON" LEDs seem to stay dim when the GPU is at idle. Every once in a while they'll go bright, flash about three times, then go dim again. I also noticed last night, when my monitor was off, that both the RADEON LEDs and the fan lights would turn off for a few seconds, then turn back on. Is this normal? Thanks!


----------



## xer0h0ur

What you're describing are the LEDs on the VRM fan. You may have an actual problem there, or you might just have a faulty fan; without replacing the fan it would be hard to tell.


----------



## DeathDealer592

Alright, thanks for the info! So basically it's not normal. It's funny, because the dimming actually seems like something it does on purpose: it'll go from bright to dim about once every second for three seconds, then remain dim. The shutting off only seems to occur when the monitor is off, so I figured it might have been some type of power-saving feature. But thank you for your response! I guess I should send it back to Amazon before my 30-day window is up :'(


----------



## Coppy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm running 3 monitors via DVI/MiniDP to HDMI and MiniDP to DVI (both passive adaptors)
> 
> If you want more than three monitors then you need active adaptors but 3 monitors + eyefinity will work the way i have mine setup


Hi!
Which drivers do you use? Omega 14.12?
I'm thinking about Eyefinity too, and some guys in this thread said it doesn't work that way (anymore) if you only use MiniDP adaptors.
I always thought it should work that way! But they say only 2 monitors work like this; with 3 monitors they use active adaptors, I think.

But you say it works the simple way. Now I'm confused...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Coppy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'm running 3 monitors via DVI/MiniDP to HDMI and MiniDP to DVI (both passive adaptors)
> 
> If you want more than three monitors then you need active adaptors but 3 monitors + eyefinity will work the way i have mine setup
> 
> 
> 
> Hi !
> which drivers do you use ? Omega 14.12. ?
> I´m thinking about eyefinity,too and some guys in this thread said it doesn´t work that way (anymore) if you only use MiniDP adaptors.
> I always thought it should work that way ! But they say only 2 Monitors are working this way. With 3 Monitors they use active adaptors, i think.
> 
> But you say it works the simple way. Now i´m confused...

I can enable/disable 5760x1080 Eyefinity with my monitors hooked up via DVI, MiniDP to HDMI, and MiniDP to DVI.

I normally don't run Eyefinity, but I do run three monitors and game on the centre one. I've been doing this since the day I got this card and it has worked fine.

Using the 14.12 Omega driver atm as well, yes.


----------



## Lurifaks

Another new member


----------



## Mega Man

Welcome

If that is the wifi card I have 2 of, the biggest complaint I have is that it looks single-slot but it isn't. Is that why you put the card so low?


----------



## hellojustinr

Quote:


> Originally Posted by *Mega Man*
> 
> welcome
> 
> if that is the wifi card i have 2 of, the biggest complaint i have is it looks like single slot but it isnt is that why you put the card so low ??


I have the same one; it's single slot and should fit in the slots below. He might have it so low because he has the rad mounted up front or something.


----------



## Lurifaks

Quote:


> Originally Posted by *Mega Man*
> 
> welcome
> 
> if that is the wifi card i have 2 of, the biggest complaint i have is it looks like single slot but it isnt is that why you put the card so low ??


Thanks!

It's because of the expansion slot layout: 1 x PCI Express x1 slot, where the wifi card is.

The second is PCI Express x16 running at x4.

And the third is PCI Express x16 running at x8, where I have the 295x2.


----------



## Mega Man

Quote:


> Originally Posted by *hellojustinr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> welcome
> 
> if that is the wifi card i have 2 of, the biggest complaint i have is it looks like single slot but it isnt is that why you put the card so low ??
> 
> 
> 
> I have the same one its single slot it should fit in the slots below he might have it so low because he has the rad mounted up front or something

I am on mobile and can't easily check his rig in Rig Builder, but I'm 99% sure it is the AC1900 from Asus; there are components on the back of that card that interfere with a card in the slot above (another GPU).
Quote:


> Originally Posted by *Lurifaks*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> welcome
> 
> if that is the wifi card i have 2 of, the biggest complaint i have is it looks like single slot but it isnt is that why you put the card so low ??
> 
> 
> 
> Thanks !
> 
> Its because of the Expansion Slots layot , 1 x PCI Express x1 slot , where the wifi card is
> 
> And the second PCI Express x16 running x4
> 
> And the third is PCI Express x16 running @ x8 , where i have the 295x2

You would get better performance (assuming I am guessing your mobo correctly: Gigabyte Z97 SOC; again, on mobile) if you put the GPU in the first slot and the wifi card in the last (x4 from the chipset, PCIe 2.0).


----------



## ryno9696

Is anyone able to get AC: Unity running at 4K and max settings? I can't tell if I'm still having system issues or if this game is just horribly optimized. If I turn off AA it is playable, but the framerate is crummy.

When I set this system up I was having trouble with poor performance and system crashes in all games. I finally narrowed it down to the motherboard, RMA'ed it, and got something else. Since then I've run several other games on full settings without a problem.

I just posted my specs back a couple pages, but here they are again:

Gigabyte GA-X99-UD4 Motherboard
i7-5930k
32gb ddr4 2400mhz
(2) XFX 295x2
Samsung evo 850
EVGA 1600w PSU


----------



## Mega Man

Have you attempted using only one 295x2?

Unfortunately, I refuse to support Ubisoft.

They lost my business due to how they act and how they release games that wouldn't pass for a beta.


----------



## ryno9696

Quote:


> Originally Posted by *Mega Man*
> 
> have you attempted using only 1 295x2?


Yeah I had the same problem on my old setup w/ 1 295x2 and i7-4790k
Quote:


> Originally Posted by *Mega Man*
> 
> unfortunately i refuse to support ubisoft.
> 
> they lost my business due to how they act and release crap for games that wouldnt pass for a beta


At this point I completely agree. I loved Black Flag, so I bought Unity but it was a waste of $60.


----------



## xer0h0ur

Black Flag was such a hit, in my opinion, that it was only possible to go down from there. I really had no idea how they could possibly have followed that game up with a better one unless it kept the pirate sailing theme.

If you create a profile for the game and force the AFR-friendly crossfire setting, you can get a good, smooth framerate in Unity with a single 295X2 @ 4K (obviously with zero GameWorks options in use and no AA); however, that may introduce flashing textures. Depending on the patch version of the game, it either worked perfectly fine for me or it didn't.


----------



## Ragingun

Hey y'all. I just returned my 970 SLI setup and am very seriously considering a 295x2. The other option is to wait for a 300x. Thoughts?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ragingun*
> 
> Hey y'all. I just returned my 970 SLI setup and am very seriously considering a 295x2. The other option is to wait for a 300x. Thoughts?


I'd wait and see what the 300 series brings tbh, shouldn't be much longer now


----------



## doctakedooty

Quote:


> Originally Posted by *Ragingun*
> 
> Hey y'all. I just returned my 970 SLI setup and am very seriously considering a 295x2. The other option is to wait for a 300x. Thoughts?


My opinion: I've got my 295x2 and SLI 980s, and at 4K the 295x2 does perform really well. But as you are aware, you need a beefy PSU to run the 295x2, and it gets super hot even with the water cooler on it.


----------



## Mega Man

I think you mean stock cooling. If it gets really hot with water cooling... you're doing it wrong.


----------



## remnant

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd wait and see what the 300 series brings tbh, shouldn't be much longer now


Must keep waiting, so hard to wait.

#thestruggleisreal


----------



## wermad

Can't wait to get mine under water. Right now, there's no need to turn on the heater since these guys are keeping my room nice and warm









----------



## Lurifaks

Quote:


> Originally Posted by *Mega Man*
> 
> I am on mobile and can not check his rog in rig builder (easily) but 99% sure it is the ac1900 from asus there is a components on back of the card that interfere with a card in the slot above ( another gpu )
> You would get better performance assuming I am guessing your mobo correct gigabyte z97 soc ( again on mobile ) if you put the gpu in the first slot and the wifi in the last ( x4 from chipset pcie2.0 )


Thank you! I was not aware that I could place the AC1900 in a full PCIe 3.0 x16 slot.


----------



## Mega Man

Yw.

The great thing about PCIe is the scaling: you can put an x16 card in any size slot, and the same goes for an x1.


----------



## Ragingun

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Hey y'all. I just returned my 970 SLI setup and am very seriously considering a 295x2. The other option is to wait for a 300x. Thoughts?
> 
> 
> 
> I'd wait and see what the 300 series brings tbh, shouldn't be much longer now

Quote:


> Originally Posted by *doctakedooty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Hey y'all. I just returned my 970 SLI setup and am very seriously considering a 295x2. The other option is to wait for a 300x. Thoughts?
> 
> 
> 
> My opinion is I got my 295x2 and sli 980s and at 4k the 295x2 does perform real good but as you are aware you need a beefy psu to run the 295x2 and it gets super hot even with the water cooler on it.

Thanks. I've been waiting for a month now since returning my 970's, but I'm running a single 660 temporarily and I can't run nearly any of my games; I'm going nuts lol. If it were just 2 weeks away that's one thing, but from the rumors it may be another 2-3 months.

Obviously the 390X will be a single GPU, but I doubt it would outperform the 295x2. Thoughts?


----------



## wermad

Going by past history, a single 390x won't surpass a 295x2 in terms of raw power; that's left to a 395x2. In mostly everything else, it should be a great improvement. I'm guessing the 390x won't be out until Titan II is out (nvidia will counter with a Ti mk2 later on).


----------



## BradleyW

It was my understanding that a 295X2 is the same as 290X Crossfire. Can someone explain these benchmarks to me? The 295X2 seems to win by 10fps min on most tests compared to 290X CFX.

http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-5.html

Thank you.


----------



## xer0h0ur

Well, the difference goes the other way once you decide to overclock: crossfired 290X's can overclock better than a 295X2 can. So in reality, generally speaking, I would say dual 290X's are a better way to go than a 295X2 if you can go that route. Run at stock, though, yes, the 295X2 should outperform dual 290X's. There is also a clock speed difference of 18MHz: 1000MHz versus 1018MHz on the 295X2.


----------



## doctakedooty

Quote:


> Originally Posted by *Ragingun*
> 
> Thanks. I've been waiting for a month now since returning my 970's but I'm running a single 660 temporarily and I can't run nearly any of my games, I'm going nuts lol. If it er just 2 weeks away that's one thing but from rumors it may be another 2-3 months.
> 
> Obviously the 390X will be a single gpu but I doubt it would out preform the 295x2, thoughts?


Sadly, if you're always waiting to grab the new thing, you will always be waiting: when the new 300 series comes out, nvidia will do something, and so on and so forth, so you end up waiting forever. Both the 300 series and 200 series are high-TDP cards, and high TDP equates to high temps. If you don't mind the temps, the cards are great; where nvidia does great is power efficiency, so less heat. On the other hand, the 200 series are good cards, and it's not a bad time to buy, as 200 series prices have plummeted. So if you need a card and want AMD, now is a pretty good time, and if you decide to crossfire the 295x2, then when the 300 series comes out you could pick up another pretty cheap. Personally, if I need a new card I just find one for a good price at the time, and if I decide to upgrade later I sell the card and consider my loss a rental fee. I did see the 295x2 on sale last week; I don't know if they are still on sale for $629.99.


----------



## Ragingun

Quote:


> Originally Posted by *doctakedooty*
> 
> Sadly if your just always waiting to grab the new thing you will always be waiting because when the new 300 series comes out nvidia will do something and so on and so forth so you end up keep waiting. Both the 300 series and 200 series are high tdp cards. So high tdp equates to high temps. If you don't mind the temps the cards are great. That's where nvidia makes great is power efficiency so less heat. Now on the other hand 200 series are good cards and not the best time to bad but not a bad time to buys as the 200 series prices have plummeted so if your in need of a card and want amd now is a pretty good time to buy and if you decide to crossfire the 295x2 then when the 300 series come out you could pick up another pretty cheap. Me personally if I need a new card I just find one for a good price at the time and if I decide to upgrade later sell the card and consider my loss I take a rental fee. I did see 295x2 on sale last week don't know if they are still on sale for $629.99


Thanks. Two more questions: 1) where did ya find the card for $629, and 2) how well do two of these cards scale as 4 GPUs?


----------



## wermad

Quote:


> Originally Posted by *BradleyW*
> 
> It was my understanding that a 295X2 is the same as 290X Crossfire. Can someone explain these benchmarks to me? The 295X2 seems to win by 10fps min on most tests compared to 290X CFX.
> 
> http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-5.html
> 
> Thank you.
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Well the difference goes the other way once you decide to overclock. Crossfired 290X's can overclock better than a 295X2 can. So in reality I would generally speaking say dual 290X's are a better way to go than a 295X2 if you can go that route. Otherwise being run at stock then yes the 295X2 should outperform dual 290X's. However there is a clock speed difference of 18MHz. 1000MHz versus 1018MHz on the 295X2.

Plus the fact that the stock turbine 290/290x cooler is not as good as the AIO on the 295x2. The crossfire setup will run into heat issues right away and throttle down, even though the 295x2 has a lower thermal threshold. In other words, the 295x2 can hold its boost a bit better thanks to the AIO cooling. Either way, Hawaii really shines with custom water imho.
Quote:


> Originally Posted by *Ragingun*
> 
> Thanks. 2 more questions. 1: where did ya find the card for $629 and 2: how well does 2 of these cards scale being 4 gpu's?


The lowest I've seen them was $649.99 (rebate required) with a $20 gift card (newegg), dropping it to $629, but it's been a while. I bought one new and one a day old for fair prices, considering newegg charges California tax (and it was a bit cheaper).


----------



## BradleyW

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well the difference goes the other way once you decide to overclock. Crossfired 290X's can overclock better than a 295X2 can. So in reality I would generally speaking say dual 290X's are a better way to go than a 295X2 if you can go that route. Otherwise being run at stock then yes the 295X2 should outperform dual 290X's. However there is a clock speed difference of 18MHz. 1000MHz versus 1018MHz on the 295X2.


Surely 18MHz cannot account for +10 min fps between the 290X's and a 295X2 in those benchmarks?

Edit: Sorry, I did not see the post above. I guess the temperatures on the 290X's may be throttling their speeds.
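To put the 18 MHz in perspective: even assuming performance scaled linearly with core clock (generous), the expected gain is only about 1 fps at 60 fps, so throttling is the far more plausible explanation. A quick sketch (the 60 fps baseline is hypothetical):

```python
# Sanity check: expected fps gain from 1000 -> 1018 MHz under linear scaling.
base_clock, boosted_clock = 1000, 1018  # MHz, 290X vs 295X2
base_fps = 60.0                         # hypothetical stock-clock framerate

expected_fps = base_fps * boosted_clock / base_clock
print(round(expected_fps - base_fps, 2))  # about 1 fps, nowhere near +10
```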


----------



## xer0h0ur

If you're using a card that dumps heat into the case, you're likely to run into temperature issues unless you have something else evacuating that heat. The 290X I am using is the bone-stock blower-style reference design and it never overheats; I am, however, using a custom fan curve in Afterburner, which would be why.
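For anyone unfamiliar with custom fan curves, the idea is just piecewise-linear interpolation between (temperature, duty) points; here's a minimal sketch with made-up points (not my actual Afterburner curve):

```python
# Piecewise-linear fan curve, the same idea Afterburner applies.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]  # (deg C, fan %)

def fan_percent(temp_c):
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])          # below first point: minimum duty
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:                   # interpolate within this segment
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return float(CURVE[-1][1])             # above last point: full duty

print(fan_percent(70))  # 70.0, between the 50% and 80% points
```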


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> It was my understanding that a 295X2 is the same as 290X Crossfire. Can someone explain these benchmarks to me? The 295X2 seems to win by 10fps min on most tests compared to 290X CFX.
> 
> http://www.tomshardware.com/reviews/radeon-r9-295x2-crossfire-performance,3808-5.html
> 
> Thank you.
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Well the difference goes the other way once you decide to overclock. Crossfired 290X's can overclock better than a 295X2 can. So in reality I would generally speaking say dual 290X's are a better way to go than a 295X2 if you can go that route. Otherwise being run at stock then yes the 295X2 should outperform dual 290X's. However there is a clock speed difference of 18MHz. 1000MHz versus 1018MHz on the 295X2.
> 
> 
> 
> 
> 
> 
> Plus the fact that the stock turbine 290/290x cooler is not as good as the aio of the 295x2. The crossfire setup will run into heat issues right away and throttle down even though the 295x2 has a lower thermal threshold. So in other words, the 295x2 can hold the boost a bit better due to the aio cooling. Either one, Hawaii really shines with custom water imho.
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Thanks. 2 more questions. 1: where did ya find the card for $629 and 2: how well does 2 of these cards scale being 4 gpu's?
> 
> 
> The lowest I've seen them was 649.99 (w/ rebate needed) and a $20 gift card (newegg) dropping it to $629, but its been a while though. I bought a new one and a day old one for fair prices considering newegg charges california tax (and it was a bit cheaper).

XFX branded 295x2 is $660 after rebate atm on Newegg


----------



## remnant

Maybe it's crazy but I don't want to buy one right now because I'm afraid the price will drop more when the 300 series comes out. :,(


----------



## wermad

Because the card is still in a class of its own, it tends to hold its price very well. It isn't until its true replacement comes out that prices will drop significantly. Keep in mind that stock has been getting low, so in the end they may list slightly cheaper refurbished ones (à la 7990). Hit the forums and you may be able to score one for $600 or less. The cheapest I've seen was a desperate seller letting one go for $500 (hardforum) a few months ago.


----------



## remnant

Quote:


> Originally Posted by *wermad*
> 
> Because the card will still be in a class of its own, it tends to hold prices very well. It isn't until its true replacement comes out that prices will drop significantly. Keep in mind that stock has been getting low, so in the end, they may list slightly cheaper refurbished ones (a'la 7990). Hit the forums and you may be able to score one for $600 or less. The cheapest I seen was a desperate seller selling one for $500 (hardforum) a few months ago.


I keep an eye out here; I trust the people here. Not sure where else to look.


----------



## xer0h0ur

Oh, it's practically guaranteed that the price will drop when the 3XX series comes out, but since a single 390X won't take the single-card performance crown from the 295X2, it may take until Bermuda XT, aka the 395X2, for the 295X2's price to drop again.


----------



## wermad

At this point, it's up to the retailers to drop prices; they will try to maximize margins as long as possible. I remember when 7970 Lightnings were still selling for $500 (TD) even when the reference card was well under $300. Exclusivity is another reason to milk products as long as possible.

Going to break down my rig tonight and get these cards on blocks. I'm going to rig up a simple loop for now on a box until the new case arrives.


----------



## Feyris

My friend is going to part with a 295x2. I have a 7990 and someone who's going to buy it.

Should I bite the bullet and get it?


----------



## remnant

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh, it's practically guaranteed that the price will drop when the 3XX series comes out, but since a single 390X won't take the single-card performance crown from the 295X2, it may take until Bermuda XT (aka the 395X2) for the 295X2's price to drop again.


Yeah, because of my mITX build the 295x2 has always interested me, but the price, man... the price.


----------



## joeh4384

Sure, why not. Nice upgrade, plus Crossfire is a lot better with the newer architecture.


----------



## wermad

Quote:


> Originally Posted by *Feyris*
> 
> My friend is going to part with a 295x2, i have a 7990 and someone whos going to buy it.
> 
> Should I bite the bullet and get it?
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Sure why not. Nice upgrade plus crossfire is a lot better with the newer architecture.
Click to expand...

Go for it


----------



## Ragingun

Quote:


> Originally Posted by *BradleyW*
> 
> Surely 18MHz cannot result in +10 min fps between the 290X's and a 295X2 in those benchmarks?
> 
> Edit: Sorry I did not see the post above. I guess the temperatures on the 290X's may be throttling their speeds.


It's also because the GPUs are on the same PCB, which allows faster communication than going over PCI-E.


----------



## remnant

Quote:


> Originally Posted by *wermad*
> 
> Plus the fact that the stock turbine 290/290x cooler is not as good as the aio of the 295x2. The crossfire setup will run into heat issues right away and throttle down even though the 295x2 has a lower thermal threshold. So in other words, the 295x2 can hold the boost a bit better due to the aio cooling. Either one, Hawaii really shines with custom water imho.
> *The lowest I've seen them was 649.99 (w/ rebate needed) and a $20 gift card (newegg) dropping it to $629,* but its been a while though. I bought a new one and a day old one for fair prices considering newegg charges california tax (and it was a bit cheaper).


that price cut is no more, sadly


----------



## BradleyW

Quote:


> Originally Posted by *Ragingun*
> 
> It's also because the GPUs are on the same PCB, which allows faster communication than going over PCI-E.


I've investigated this now. It seems the difference was simply due to temperature / throttling protocols as suggested in previous posts.


----------



## Ragingun

Quote:


> Originally Posted by *BradleyW*
> 
> I've investigated this now. It seems the difference was simply due to temperature / throttling protocols as suggested in previous posts.


Interesting. I can't remember where, possibly a Tom's Hardware review, but it was stated that the physical location of the GPUs caused the 295x2 to perform better, although that may change as indicated by your point.


----------



## BradleyW

Quote:


> Originally Posted by *Ragingun*
> 
> Interesting. I can't remember where, possibly a Tom's Hardware review, but it was stated that the physical location of the GPUs caused the 295x2 to perform better, although that may change as indicated by your point.


I believe the physical location of the cores just helped with frame times. PCI-E 3.0 is more than enough for 290X's to communicate without any hindrance, and it especially doesn't account for a 10 fps difference in minimums.


----------



## wermad

Got my blocks on, and I have to say it was a real challenge just to remove the three cables from the PCB. Usually it's just one cable, but I had no idea I'd have to deal with three. The Koolance blocks went on without a hitch, and I love that they give you the right screws to keep the stock backplate. Just finished the second one and I need to test that one as well.


----------



## Feyris

So pretty, but it looks like another PCI-E slot heavyweight. How is it weight-wise? My Razor HD block makes the GPU so heavy it sags in the slot without a prop.


----------



## wermad

I weighed the block alone at a little over a kilo. This is why I bought this case:


----------



## Ragingun

What case is that?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ragingun*
> 
> What case is that?


I know it's a Thermaltake......X9 maybe?


----------



## wermad

Thermaltake Core X9


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Thermaltake Core X9


Derp... I only just saw it in your sig.

Nice case though, quite liking the new ones.


----------



## Ragingun

I couldn't see the case on my phone. Good to know though. I like it.


----------



## Xzow

Is it true that the 75c thermal limit of this card can't be changed?

If so, how badly does it impact full-load performance?


----------



## Feyris

Is there a block that still allows use of the OEM shroud? I want it powdercoated pink; a pink 295x2 would be lovely and match my rig.


----------



## wermad

Get two uniblocks.


----------



## Orivaa

Quote:


> Originally Posted by *Xzow*
> 
> Is it true the 75c thermal limit of this card is not changeable?
> 
> If so, how badly does it impact it during full load performance?


You can't change it, but unless your PC has absolutely no airflow and an insanely hot environment, it won't go past 70c. At least not in my experience.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Get two uniblocks.


But... but then the poor VRMs! Right?


----------



## wermad

Quote:


> Originally Posted by *Feyris*
> 
> But..but then the poor vrms! right?


Keep the stock cooler's central fan. For the heatsinks, not sure if you can reuse the stock ones or if you'd have to go aftermarket.


----------



## wermad

Quote:


> Originally Posted by *Xzow*
> 
> Is it true the 75c thermal limit of this card is not changeable?
> 
> If so, how badly does it impact it during full load performance?


Mine got close with just some brief benching, though I'm running two. With good airflow and a second fan you may be able to stay under. But summer weather may be a bigger concern imho.

Edit: crap, my phone double posted.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Xzow*
> 
> Is it true the 75c thermal limit of this card is not changeable?
> 
> If so, how badly does it impact it during full load performance?


I did some testing here

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here's some results with my Noctua Industrials. I have some Corsair AF120's, XSPC Xinruilian's and some NZXT FZ-120mm's to try out another day (I know the Corsairs and NZXTs aren't made for static pressure, but worth a shot). No fancy graphs, sorry.
> 
> All runs: ambient 27c, 10 minutes of the AIDA64 GPU Stability Stress Test, GPU at bone stock (1018/1250). Fan speed was controlled using a Bitfenix Recon fan controller; the radiator was mounted as top rear exhaust (cannot mount as front intake due to a 240mm rad being in the way).
> 
> | Fan configuration | GPU1 | GPU2 |
> | --- | --- | --- |
> | NF-F12 Industrial PPC 2000rpm x 1 (push) | *69c* | *74c* (no throttle) |
> | NF-F12 Industrial PPC 2500rpm x 1 (push) | *67c* | *71c* |
> | NF-F12 Industrial PPC 3000rpm x 1 (push) | *64c* | *68c* |
> | NF-F12 Industrial PPC 2000rpm x 2 (push/pull) | *64c* | *68c* |
> | NF-F12 Industrial PPC 2500rpm x 2 (push/pull) | *62c* | *66c* |
> | NF-F12 Industrial PPC 3000rpm x 2 (push/pull) | *59c* | *63c* |
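
As a quick read on those figures, here's a small sketch computing how much margin each configuration left before the card's fixed 75c throttle point (temps copied from the results above; the setup labels are just shorthand):

```python
# Margin below the 295X2's 75 C throttle point for each fan setup,
# using the hotter of the two GPUs (GPU2 in every run above).
THROTTLE_C = 75

results = {
    "2000rpm x1 push":      (69, 74),
    "2500rpm x1 push":      (67, 71),
    "3000rpm x1 push":      (64, 68),
    "2000rpm x2 push/pull": (64, 68),
    "2500rpm x2 push/pull": (62, 66),
    "3000rpm x2 push/pull": (59, 63),
}

margins = {setup: THROTTLE_C - max(temps) for setup, temps in results.items()}

for setup, margin in margins.items():
    print(f"{setup:>22}: {margin} C of headroom")
```

A single 2000rpm fan leaves only 1c of headroom at 27c ambient, which lines up with the throttling complaints elsewhere in the thread.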


----------



## remnant

Spoiler: prices officially dropping?


----------



## PROBN4LYFE

I would but I've never been a fan of either of those manufacturers...or XFX for that matter. If you can provide me with some "unchallenged" proof...I may reconsider getting back into the game


----------



## Feyris

Quote:


> Originally Posted by *PROBN4LYFE*
> 
> I would but I've never been a fan of either of those manufacturers...or XFX for that matter. If you can provide me with some "unchallenged" proof...I may reconsider getting back into the game


Why no XFX? Their support is top notch: a 3-day turnaround to RMA my spazzing 7990s. The first time they couldn't even find an issue and failed it for "coil whine" so I could receive another, and I've gotten a brand-new, sealed-in-original-box replacement every single time, each with its own warranty. It's a small group of people, so relying on email support alone takes ages, but after a ticket and then a phone call they always make it right.

Unless the locked vcore is why, but there is always BIOS flashing.

Then there's PowerColor, a brand that either hits really hard or misses the mark. The Devil series is amazing, but the failure rate on some series like the 7870 Myst was higher than desired and the coolers are not all that great. Sapphire, however, would nearly take the crown for best AMD GPU brand.

ASUS - DCU is horrid
MSI - besides Lightnings, meh
Gigabyte - never liked them


----------



## wermad

Quote:


> Originally Posted by *remnant*
> 
> 
> 
> Spoiler: prices officially dropping?


Wait for the sales. St. Patty's is coming up, then you have Easter, etc.

Forgot to post this:


----------



## xarot

Wow, that looks nice.

I decided to sell my 295X2. I still have my ARES III in its box, waiting for spare time to put it back under water.

I was a bit disappointed by the 295X2's very low throttling temp and the difficulty of installing the radiator and hoses in many cases. They just don't look very nice, which catches my eye every time I open the case. On the other hand, I could have put it under water again, and that's another matter. Getting these to throttle even with good airflow takes no more than Sleeping Dogs with no Vsync.

The 295X2 would be a perfect card with a throttling temp of at least 85 degrees C and perhaps a 140 or 240 radiator, though I understand why the latter isn't a very good option. A 120 radiator is just not enough for it; every watercooler knows this, and I would not use anything less than a 360 rad with custom water cooling. In its current state it's more like a kettle.

Also, what's going on with AMD's drivers? The last driver released is from 9th Dec. If our games don't support CF properly, we'll have to wait for months? :O Maybe I was just too used to Nvidia's frequent updates, although for some games the situation isn't really any better for them.


----------



## Feyris

Quote:


> Originally Posted by *xarot*
> 
> Wow, that looks nice.
> 
> I decided to sell my 295X2. I still have my ARES III in its box, waiting for spare time to put it back under water.
> 
> I was a bit disappointed by the 295X2's very low throttling temp and the difficulty of installing the radiator and hoses in many cases. They just don't look very nice, which catches my eye every time I open the case. On the other hand, I could have put it under water again, and that's another matter. Getting these to throttle even with good airflow takes no more than Sleeping Dogs with no Vsync.
> 
> The 295X2 would be a perfect card with a throttling temp of at least 85 degrees C and perhaps a 140 or 240 radiator, though I understand why the latter isn't a very good option. A 120 radiator is just not enough for it; every watercooler knows this, and I would not use anything less than a 360 rad with custom water cooling. In its current state it's more like a kettle.
> 
> Also, what's going on with AMD's drivers? The last driver released is from 9th Dec. If our games don't support CF properly, we'll have to wait for months? :O Maybe I was just too used to Nvidia's frequent updates, although for some games the situation isn't really any better for them.


If you still haven't sold it, send me a PM. I come with holy offerings.

The AMD driver is getting updated next month, though.


----------



## Orivaa

Quote:


> Originally Posted by *Feyris*
> 
> The AMD driver is getting updated next month, though.


Cool, I might finally be able to play Hearthstone without getting a BSOD.


----------



## xarot

Quote:


> Originally Posted by *Feyris*
> 
> If you still havent sold me send me a PM. i come with holy offerings
> 
> AMD Driver is getting updated in next month though


Sold already. Btw, our distance seems to be 5166 miles...hope you'll find one closer to you


----------



## kayan

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150710&utm_medium=Email&utm_source=IGNEFL030315&nm_mc=EMC-IGNEFL030315&cm_mmc=EMC-IGNEFL030315-_-EMC-030315-Index-_-DesktopGraphicsCards-_-14150710-S1A1C

$599 after rebate for an XFX. Darn good price!


----------



## wermad

There you go! Finally a great price. I should have waited, in hindsight, but oh well. Wait... with Cali tax I still paid less... score! Grab them if you don't get hit by sales tax!


----------



## Sheyster

Quote:


> Originally Posted by *wermad*
> 
> There you go! Finally a great price. Should have waited in hindsight but oh well. Wait....with cali tax I still paid less...score! Get em for those who don't get hit by sales tax!


Very tempted. Worried about drivers though...

I've owned several AMD cards in the past, including a 5970.

Hmmm...


----------



## Orivaa

Quote:


> Originally Posted by *xarot*
> 
> Sold already. Btw, our distance seems to be 5166 miles...hope you'll find one closer to you


Quote:


> Originally Posted by *Sheyster*
> 
> Very tempted. Worried about drivers though...
> 
> I've owned several AMD cards in the past, including a 5970.
> 
> Hmmm...


>Implying AMD drivers are worse than Nvidia's.


----------



## Feyris

To be honest, drivers suck on both sides for anything CF/SLI-related. Devs are too lazy to code properly from the start, they talk to AMD/Nvidia, a mess happens, and months later a fix "kind of fixes it" on both sides. Single-card generation, here we come.


----------



## remnant

Well, I just realized that the power supply I wanted to buy is too large to fit in my case...

So now I'm figuring out what to do, maybe a new case. How heavy are these cards; are you ever worried about having them in a 'normal'/vertical orientation?

(Also, let me confirm: how big a PSU would you suggest for a single 295x2? How about 2 x 295x2?)


----------



## wermad

The Lepa G1600 and Enermax 1500 are around 180mm in length. These are six-rail units, though, so make sure you plug the cards in properly. I'm using rails 3, 4, 5, & 6 (one rail per 8-pin).


----------



## Alex132

Quote:


> Originally Posted by *remnant*
> 
> Well just realized that the power supply I wanted to buy is to large to fit in my case. . .
> 
> 
> 
> 
> 
> 
> 
> 
> So now I'm having to figure out what to do, maybe a new case. how heavy are these cards; are you ever worried about having them in a 'normal'/vertical orientation?
> 
> (Also let me confirm. how big of psu would you suggest for a single 295x2? how about 2 x 295x2?)


850w for 1, 1200w for 2 (I'd recommend EVGA G2/P2).
If you want to HEAVILY overclock, then 1600w for 2 and 1200w for 1.

If it were me, I'd get the 1200w because the price difference is like $30-40.

Quote:


> Originally Posted by *Feyris*
> 
> to be honest drivers suck on both spectrums for anything CF/SLI Related. devs2lazy to code properly from start, talk to amdvidia, mess happens, months later a fix "kind of fixes it" on both sides. Single card generation here we come


Can confirm. I have run dual SLI/Crossfire setups since I can remember, and although they have gotten BETTER, you still have issues on both sides. Would I say that AMD has better dual-GPU drivers than Nvidia? Eh. Nah. Tbh I have had more crashes/errors on my 690 than I did on my 5870 Crossfire.


----------



## remnant

Quote:


> Originally Posted by *Alex132*
> 
> *850w for 1, 1200w for 2 (EVGA G2/P2 I would recommend).*
> If you want to HEAVILY overclock then 1600w for 2 and 1200w for 1.
> 
> If it were me, I'd get the 1200w because the price difference is like $30-40


Yup the G2 is the PSU I was going to go with.


----------



## Alex132

Quote:


> Originally Posted by *remnant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> *850w for 1, 1200w for 2 (EVGA G2/P2 I would recommend).*
> If you want to HEAVILY overclock then 1600w for 2 and 1200w for 1.
> 
> If it were me, I'd get the 1200w because the price difference is like $30-40
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yup the G2 is the PSU I was going to go with.
Click to expand...

Yeah, 850w is fine for 1 with mild OCs really. 1200w for crazy OC.

These things can pull a lot more than you think


----------



## wermad

I pulled 1200w at the wall benching all stock. I'm sure games will pull more. I would recommend 1350w for crossfire, or a unit with at least 100w of overhead. 1500w+ for mild to medium OC'ing.
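
For a rough illustration of that kind of sizing, here's a sketch; the per-card and system wattages, the 20% headroom factor, and the list of PSU sizes are assumptions for illustration, not measured figures:

```python
# Hedged PSU-sizing sketch for 295X2 builds. All numbers here are
# illustrative assumptions: ~500 W per card, ~250 W for the rest of
# the system, and 20% headroom on top of the estimated load.

def recommend_psu_watts(num_cards, system_watts=250, card_watts=500,
                        headroom=1.2):
    """Round the estimated load (plus headroom) up to a common PSU size."""
    target = (num_cards * card_watts + system_watts) * headroom
    for size in (750, 850, 1000, 1200, 1350, 1500, 1600):
        if size >= target:
            return size
    return 1600  # largest common size; beyond this, rethink the build

print(recommend_psu_watts(1))  # -> 1000
print(recommend_psu_watts(2))  # -> 1500
```

With these assumptions, one card lands in the 1000W class and crossfire in the 1500W class, broadly in line with the 1350w/1500w+ advice above.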


----------



## AtomicFrost

Quote:


> Originally Posted by *kayan*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150710&utm_medium=Email&utm_source=IGNEFL030315&nm_mc=EMC-IGNEFL030315&cm_mmc=EMC-IGNEFL030315-_-EMC-030315-Index-_-DesktopGraphicsCards-_-14150710-S1A1C
> 
> $599 after rebate for an XFX. Darn good price!


That's a good price, but I wonder how much lower it'll drop when the 3xx series is officially announced?

I have that card, and I can confirm it's a great GPU; I'm really tempted to grab a second one.

If I were to grab a second one, does anyone have a good idea of where to mount the second radiator in a 900D? The top of my case has three 140mm fans pulling air in, and the single back exhaust is where my current 295x2's radiator is mounted.

Have any of you mounted the radiator below the GPU? I know the manual recommends mounting it above, but since it's a closed loop I don't see why it would be an issue.

However, my AX1500i might not like a 295x2 exhausting directly into its fan intake.


----------



## remnant

Quote:


> Originally Posted by *Alex132*
> 
> Yeah, 850w is fine for 1 with mild OCs really. 1200w for crazy OC.
> 
> These things can pull a lot more than you think


My thinking was I would go with the 800W for my ITX build, but the G2 just won't fit.

My next idea was to buy the 1200W for an ATX build and be able to put two in without OC.
Quote:


> Originally Posted by *wermad*
> 
> I pulled 1200w @ at the wall in benching all stock . I'm sure in games will pull more. I would recommend 1350w for crossfire or a 100w unit with 100w overhead. +1500w for mild to medium oc'ing.


Welp that's something to think about


----------



## xer0h0ur

Personally, I do not like running my power supply near capacity. So if I were ever going to go from my tri-fire setup to quadfire, I would upgrade to the EVGA SuperNOVA G2 1600W PSU.


----------



## Feyris

And on this day a NVIDIA faithful has been blessed by the holy power and radiance of the 295x2 and hath converted.

Welcome to best team alex


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Me personally, I do not like running my power supply at near capacity. So if I were ever going to go from my tri-fired setup to quadfire then I would upgrade to the EVGA Supernova G2 1600W PSU.


Yep, would do the same.

Like if I get the R9 295X2 (PLEASE, I WANT IT) I would be very careful about running it on my 850w until I slap in a bigger, better PSU.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Me personally, I do not like running my power supply at near capacity. So if I were ever going to go from my tri-fired setup to quadfire then I would upgrade to the EVGA Supernova G2 1600W PSU.


Really no reason not to.

I would quote but I am on mobile. Maybe when I get home.

There is a reason to buy high-end PSUs: so you can push them.

I'll post the quote when I get home, but it comes from one of the PSU gurus.


----------



## Phaedrus2129

I somehow forgot to sign up for this club.

I have a Diamond model



It's a bit messy in this pic, I've since replaced the PSU with a fully modular one and tidied up the cable management a bit.


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> And on this day a NVIDIA faithful has been blessed by the holy power and radiance of the 295x2 and hath converted.
> 
> Welcome to best team alex












I can't say anything biased and not look bad, but I am very glad.

I think this completes my Nvidia -> AMD switch:
FX5200 -> X1900XT -> 8600M GT -> 5870 Crossfire -> 690 -> R9 295X2.


----------



## Sheyster

Really tough choice here... Get this card and have to buy a new power supply to support it, or keep my current PSU and get two very energy-efficient ASUS 980 Strix? The difference is about $350 for the green-team cards, factoring in the price of the PSU with the AMD card.

EDIT - Is there any chance the 295x2 will work with a 750W PSU? There isn't much else in the system except the mobo, CPU, RAM, a couple of fans and 2 SSDs. I know overclocking is out of the question, but how about stock?


----------



## Feyris

Quote:


> Originally Posted by *Sheyster*
> 
> Really tough choice here... Get this card and have to buy a new power supply to support it, or keep my current PS and get two very energy efficient ASUS 980 Strix? The difference is about $350 for the green team cards factoring in the price of the PS with the AMD card.


I'd save your money and get a 295x2 and an 850w EVGA G2, or a used TT TP-1350 or something decent from the OCN market. Sure, 980 SLI may be slightly faster, but it's not worth investing so much when the 300 series and Pascal are eventually going to eclipse these.


----------



## Phaedrus2129

Quote:


> Originally Posted by *Sheyster*
> 
> Really tough choice here... Get this card and have to buy a new power supply to support it, or keep my current PS and get two very energy efficient ASUS 980 Strix? The difference is about $350 for the green team cards factoring in the price of the PS with the AMD card.
> 
> 
> EDIT - Is there any chance the 295x2 will work with a 750W PS? Not much else in the system except for the mobo, CPU, RAM, couple of fans and 2 SSD's. I know overclocking is out of the question, but how about stock?


The 295x2 in my picture above was running on an Antec HCG-750, Bronze 750W power supply. As long as it's a quality unit and you aren't overclocking, 750W is fine.


----------



## ReV2ReD

Quote:


> Originally Posted by *Feyris*
> 
> ide save your money and get 295x2 and a 850 Evga G2 or a used TT TP-1350 or something decent from OCN Market. Sure, 980 GTX SLI may be slightly faster but not worth investing so much when eventually 300 and Pascal are going to eclipse these.


So, I actually just returned a pair of GTX 980s and went back to my 295x2. From my testing, I found that for my use, the 295x2 was actually better overall (I run an Eyefinity 6048x1080 triple screen setup). While the 980s did get higher frame rate spikes, the 295x2 kept the average frame rate and minimum frame rate much higher (~20 fps difference!). I am firmly on Team Red at this point. While my older Z77 mobo isn't quite up to snuff for a Quadfire setup (I had to get rid of my second 295x2 because I'm just not ready to go for an X99 build yet), I am now running Trifire with my 295x2 and a 290X with minimal issues.

I only wish AMD would update their drivers a bit more often. 14.12 came out way back in early December, and I think it's about time we got an update. I'm still seeing some texture flicker on some maps on Battlefield 4 (still, I get almost double the frames in Mantle vs. DX11!).


----------



## F4ze0ne

We should have a beta soon according to Warsam.

http://www.overclock.net/t/1543706/amd-catalyst-15-2-beta-driver-update


----------



## ryno9696

Quote:


> Originally Posted by *AtomicFrost*
> 
> If I was to grab a second one, does anyone have a good idea of where to mount the second radiator in a 900D? The top of my case has three 140mm pulling air in, and the single back exhaust is where I have my current 295x2's radiator mounted.
> 
> Have any of you mounted the radiator below the GPU? I know that the manual recommends mounting it above, but since it's a closed loop I don't see why it would be an issue?


I'm certainly not an expert on these things, but why wouldn't you put the two radiators at the top and pull air in at the back? Heat rises, so the natural draft is going to move upward anyway. I'd even take the extra fan and use it to exhaust at the extra top slot.


----------



## Ragingun

Quote:


> Originally Posted by *kayan*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150710&utm_medium=Email&utm_source=IGNEFL030315&nm_mc=EMC-IGNEFL030315&cm_mmc=EMC-IGNEFL030315-_-EMC-030315-Index-_-DesktopGraphicsCards-_-14150710-S1A1C
> 
> $599 after rebate for an XFX. Darn good price!


Just bought mine last night! Should be here in 2 days! My PSU is only 850w, but I remedied that. The last rig I built I sold to a friend, along with my favorite PSU I've ever owned, the Enermax Revolution 1050. I've regretted that since I sold it well over a year ago. So I called him up, said "can I trade ya," and that was that! I'm pumped, probably more so than ever about a video card. The last dual-GPU card I owned was a GTX 295, and that card got HOT. So this solution has me stoked.


----------



## remnant

Quote:


> Originally Posted by *ryno9696*
> 
> I'm certainly not an expert on these things, but why wouldn't you put the two radiators at the top and pull air in at the back? Heat rises, so the natural draft is going to move upward anyway. I'd even take the extra fan and use it to exhaust at the extra top slot.


This should probably be left to those over at the cooling forum; they know a lot more than us non-experts. I thought about throwing mine in as well, but I can't settle on an idea that I think would be best.


----------



## Ragingun

Quote:


> Originally Posted by *Sheyster*
> 
> Really tough choice here... Get this card and have to buy a new power supply to support it, or keep my current PS and get two very energy efficient ASUS 980 Strix? The difference is about $350 for the green team cards factoring in the price of the PS with the AMD card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT - Is there any chance the 295x2 will work with a 750W PS? Not much else in the system except for the mobo, CPU, RAM, couple of fans and 2 SSD's. I know overclocking is out of the question, but how about stock?


In some games the 295x2 beats 980 SLI. 750w is cutting it close, though, and a good PSU is ALWAYS worth the investment, even to reuse in future builds. Plus, if you're looking at 1440p or 2160p, the 295x2 takes the cake there.


----------



## AtomicFrost

Quote:


> Originally Posted by *AtomicFrost*
> 
> That's a good price, but I wonder how much lower it'll drop when the 3xx series is officially announced?
> 
> I have that card, and I can confirm that it's a great GPU, and I'm really tempted to grab a second one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I was to grab a second one, does anyone have a good idea of where to mount the second radiator in a 900D? The top of my case has three 140mm pulling air in, and the single back exhaust is where I have my current 295x2's radiator mounted.
> 
> Have any of you mounted the radiator below the GPU? I know that the manual recommends mounting it above, but since it's a closed loop I don't see why it would be an issue?
> 
> However, my AX1500i might not like a 295x2 exhausting directly into its fan intake.


Quote:


> Originally Posted by *ryno9696*
> 
> I'm certainly not an expert on these things, but why wouldn't you put the two radiators at the top and pull air in at the back? Heat rises, so the natural draft is going to move upward anyway. I'd even take the extra fan and use it to exhaust at the extra top slot.


I was thinking about that. However, currently I have my 900D setup for positive air pressure. My H110i GT's radiator is mounted on top along with the 140mm fan that comes with the case. The front has three 120mm pulling in air, and I have two more 120mm pulling in air on the bottom left panel. Right now the 295x2's radiator fan is my only exhaust, and there is a decent amount of passive exhaust out of the mesh back.

I could reverse the top to exhaust, but I'm afraid that might raise my CPU temperatures too much. An H110i GT struggles a bit with a 5960X at ~1.3v core.


----------



## Rx4speed

I have one 295x2, an i5-4670K @ 3.8, one SSD, one WD Black, a Blu-Ray drive, 3 140mm case fans, and 2 Noctua NF-F12 Industrial 2000RPM push/pull fans on my 295x2's radiator, plus 3 Dell 2412Ms and my router and modem, all on my UPS, and the most I've ever seen it pull is 770W. Most of the time it's at 729W running the Thief benchmark with both GPUs near 100%. Subtract the 3 monitors (25W each) and the modem and router (20W total) and I'm in the mid-600-watt area; then factor in PSU efficiency and I'm around 600W. I use an EVGA 850 SuperNOVA G2 and it's plenty. It's about a year old, and until a month ago it only drove one R9 290. It's for sale for the right price, as I'd like to buy a 1000 P2 or 1200 P2. Probably overkill, as there is no way I can hear the PSU fan over the Noctuas if I have them at full 12v. Christ, I cannot imagine how loud the 3000RPM versions are. I should just keep the 850W and buy a 1600W if and when I buy another 295x2.
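
That back-of-the-envelope estimate can be sketched as follows; the 90% efficiency figure is an assumption (roughly what a Gold unit manages at this load), not a measured number:

```python
# Wall draw minus peripherals, scaled by assumed PSU efficiency,
# gives an estimate of the DC load the PSU actually delivers.
wall_watts = 729          # typical reading during the Thief benchmark
monitors = 3 * 25         # three Dell 2412Ms on the same UPS
network = 20              # modem + router

efficiency = 0.90         # assumed PSU efficiency at this load

pc_wall = wall_watts - monitors - network
dc_load = pc_wall * efficiency

print(f"PC at the wall: {pc_wall} W")      # 634 W
print(f"DC load on PSU: {dc_load:.0f} W")  # 571 W
```

That puts the real load around 570-600W, comfortably within an 850W unit's rating, which matches the conclusion above.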


----------



## xer0h0ur

I actually don't believe the 2000 RPM iPPC NF-F12's are loud. I guess its just me though since I have had some severely loud systems back in the day.


----------



## ryno9696

Quote:


> Originally Posted by *AtomicFrost*
> 
> I was thinking about that. However, currently I have my 900D setup for positive air pressure. My H110i GT's radiator is mounted on top along with the 140mm fan that comes with the case. The front has three 120mm pulling in air, and I have two more 120mm pulling in air on the bottom left panel. Right now the 295x2's radiator fan is my only exhaust, and there is a decent amount of passive exhaust out of the mesh back.
> 
> I could reverse the top to exhaust, but I'm afraid that might raise my CPU temperatures too much. An H110i GT struggles a bit with a 5960X at ~1.3v core.


Again, you should ask in the other forum, but I think you'd do best exhausting all the radiators out the top. With those fans, plus the direct exhaust on the graphics cards you will lose positive pressure, but you'll be pulling cool air through the radiators at the highest rate possible. The only real gain you get with positive pressure is less dust, but that's nothing a little compressed air can't fix.

You could do an experiment to confirm.


----------



## Rx4speed

Quote:


> Originally Posted by *xer0h0ur*
> 
> I actually don't believe the 2000 RPM iPPC NF-F12's are loud. I guess its just me though since I have had some severely loud systems back in the day.


True, back in the day I had some loud systems too, like 8 80mm fans running wide open, lol. The iPPC NF-F12 are awesome fans. I have one of the nf-f12 non-industrial PWM's on my CPU heatsink too. I'm considering buying a Corsair H90 to mount in the top right exhaust in my case, and if I do, I'll certainly buy a NFD-A14 iPPC Ind. 2000RPM PWM fan for that. Because of the rear mounted push/pull 295x2 radiator, I cannot mount a 2x120 radiator and fans in the top(nor anything other that a single exhaust fan in that top/back fan slot), and I want to keep the two front fans slots pulling in clean, filtered air. The 140mm single (only 25-27mm thick) radiator on the H90 is my best option right now anything thicker, or two fans will interfere with my systme RAM.

The fan controller I use for the two 295x2 fans is just the one built into my Define R4 case and only has 3, 7, and 12V settings. The fans are significantly louder than anything else in my case when running at full 12V, but I run them at 3V when surfing etc. and they are nearly silent. I can run them at 7V for a lot of games and they do just fine and are very tolerable at that power.

I wish I could figure out a way to tie them to the 295x2's temperature and set a fan curve. My MB has two CPU 4-pin and three system 3-pin headers. Maybe I could 'Y' them into one of the sys-fan headers and use Corsair Link or SpeedFan to accomplish this? Right now, running off the case fan controller, which feeds off a Molex, neither of these programs 'sees' the Noctuas.

Right now I have two NF-F12 PWM fans sitting here gathering dust if anyone needs them.


----------



## xer0h0ur

Yeah when I expand my loop I am going to have to swap out my PWM NF-F12's for the 3-pin version and I will be controlling all of the fans using one of these http://www.amazon.com/Phanteks-PWM-Fan-Controller-PH-PWHUB_01/dp/B00M0R05WE/ref=sr_1_1?s=electronics&ie=UTF8&qid=1425430754&sr=1-1&keywords=Phanteks+PWM+Fan+Hub+Controller connected to a PWM header.


----------



## Mega Man

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Me personally, I do not like running my power supply at near capacity. So if I were ever going to go from my tri-fired setup to quadfire then I would upgrade to the EVGA Supernova G2 1600W PSU.
> 
> 
> 
> really no reason not to.
> 
> I would quote but I am on mobile. Maybe when I get home.
> 
> There is a reason to buy high-end PSUs: so you can push them.
> 
> I'll post the quote when I get home but it comes from one of the PSU gurus

Quote:


> Originally Posted by *Phaedrus2129*
> 
> I somehow forgot to sign up for this club.
> 
> I have a Diamond model
> 
> It's a bit messy in this pic, I've since replaced the PSU with a fully modular one and tidied up the cable management a bit.


welcome

speak of the devil

Quote:


> Originally Posted by *Phaedrus2129*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> These enthusiast power supplies we all have (high-end Corsair, Antec, SeaSonic, Enermax, Silverstone, etc.) can take a lot more abuse than you think. I have tell people this a lot. These are heavy duty, precision engineered electron pushers, not tinker toys.
> 
> While you can't trust your average cheap or OEM power supply so much, and you can't trust a generic as far as you can throw it... A high-end enthusiast grade power supply is engineered with massive safety margins. Take the Corsair VX550. That's a CWT PSH, and Rocketfish/Best Buy took that same PSU (w/ minor modifications, nothing important) and rated it as a 700W _and it can hold that rating within ATX specs_.
> 
> When you buy a high-end PSU you aren't just buying reliability and performance, you're buying your headroom right there. Some people say, "Well the TX750 can push 900W, so when I buy it I'm getting a 900W PSU". That defeats the purpose. The purpose is that you can treat it as a 750W PSU and draw that amount from it long-term, for extended periods, while a cheap 750W might eventually break. That's part of the reason for buying a high-end PSU, instead of something just adequate, like an OCZ ModXStream or a Rosewill RV2 or a CM Silent Pro.
> 
> 
> 
> When you buy a high end enthusiast power supply, especially one that I can vouch for, you should know that you're buying into more than just a name. You're buying a machine, and one that's a lot tougher than the typical dreck you might have used before. So don't be afraid to use it for what it's intended for. Forget about "extra headroom", forget about babying your PSU or keeping some massive unnecessary safety margin. Go ahead, get that second Fermi and go wild. Most of you have already paid for that ability, so make the most of it.


Quote:


> Originally Posted by *ryno9696*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AtomicFrost*
> 
> If I was to grab a second one, does anyone have a good idea of where to mount the second radiator in a 900D? The top of my case has three 140mm pulling air in, and the single back exhaust is where I have my current 295x2's radiator mounted.
> 
> Have any of you mounted the radiator below the GPU? I know that the manual recommends mounting it above, but since it's a closed loop I don't see why it would be an issue?
> 
> 
> 
> I'm certainly not an expert on these things, but why wouldn't you put the two radiators at the top and pull air in at the back? Heat rises, so the natural draft is going to move upward anyway. I'd even take the extra fan and use it to exhaust at the extra top slot.

first of all the underlined

Unless you are running a 100% fanless (COMPLETELY passive) system, heat does not rise (unless there is a fan pointing up).

Heat/air goes where the fan wants it to. Period.

If anywhere in your case heat is rising and you don't have a fan pushing it up, there is a problem.

Secondly, he shouldn't have to go to another forum when asking about these rads; that's no use, no need, and not helpful. You have an entire thread of people with these rads right here.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *ryno9696*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AtomicFrost*
> 
> I was thinking about that. However, currently I have my 900D setup for positive air pressure. My H110i GT's radiator is mounted on top along with the 140mm fan that comes with the case. The front has three 120mm pulling in air, and I have two more 120mm pulling in air on the bottom left panel. Right now the 295x2's radiator fan is my only exhaust, and there is a decent amount of passive exhaust out of the mesh back.
> 
> I could reverse the top to exhaust, but I'm afraid that might raise my CPU temperatures too much. An H110i GT struggles a bit with a 5960X at ~1.3v core.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Again, you should ask in the other forum, but I think you'd do best exhausting all the radiators out the top. With those fans, plus the direct exhaust on the graphics cards you will lose positive pressure, but you'll be pulling cool air through the radiators at the highest rate possible. The only real gain you get with positive pressure is less dust, but that's nothing a little compressed air can't fix.
> 
> You could do an experiment to confirm.




@AtomicFrost
Although no one can really tell you what's best (it will differ based on several variables), it will basically be unique to you.

Basic ideas with watercooling/airflow:

You want most/all rads as intake; this will provide the lowest temps possible.

You may want positive pressure, but you don't need it!

You want to create a "path" for the airflow, not random fans strewn about.

Again, unless you're running a 100% passive system (which do exist), HEAT DOES NOT RISE; it goes where the fan tells it to.
Quote:


> Originally Posted by *Rx4speed*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I actually don't believe the 2000 RPM iPPC NF-F12's are loud. I guess it's just me though, since I have had some severely loud systems back in the day.
> 
> 
> 
> True, back in the day I had some loud systems too, like 8 80mm fans running wide open, lol. The iPPC NF-F12s are awesome fans. I have one of the NF-F12 non-industrial PWMs on my CPU heatsink too. I'm considering buying a Corsair H90 to mount in the top right exhaust in my case, and if I do, I'll certainly buy an NF-A14 iPPC 2000RPM PWM fan for that. Because of the rear-mounted push/pull 295x2 radiator, I cannot mount a 2x120 radiator and fans in the top (nor anything other than a single exhaust fan in that top/back fan slot), and I want to keep the two front fan slots pulling in clean, filtered air. The 140mm single radiator (only 25-27mm thick) on the H90 is my best option right now; anything thicker, or two fans, will interfere with my system RAM.
> 
> The fan controller I use for the two 295x2 fans is just the one built into my Define R4 case and only has 3, 7, and 12V settings. The fans are significantly louder than anything else in my case when running at full 12V, but I run them at 3V when surfing etc. and they are nearly silent. I can run them at 7V for a lot of games and they do just fine and are very tolerable at that power.
> 
> I wish I could figure out a way to tie them to the 295x2's temperature and set a fan curve. My MB has two CPU 4-pin and three system 3-pin headers. Maybe I could 'Y' them into one of the sys-fan headers and use Corsair Link or SpeedFan to accomplish this? Right now, running off the case fan controller, which feeds off a Molex, neither of these programs 'sees' the Noctuas.
> 
> Right now I have two NF-F12 PWM fans sitting here gathering dust if anyone needs them.

An Aquacomputer Aquaero will work for you, or even a PowerAdjust 3 (not PWM, however; the Aquaero 5 has 1 PWM channel and the Aquaero 6 has 4).


Spoiler: Warning: Spoiler!


----------



## xer0h0ur

I have zero reason to spend that type of money on a fan controller when cheaper solutions will work perfectly fine for me.


----------



## Mega Man

So you will spend $600+ on a GPU, but $80 is too much for a fan controller?









OK, it's your money.

I will add that everyone has said that; now most of that "everyone" who bought one, including me, won't build a PC without them.

And I would recommend an Aquaero 6 Pro.

Also, to add: if you read the quote, that was not directed at you at all.


----------



## xer0h0ur

Dude, I am not criticizing your choices. More power to you if you want to spend that kind of money on a fan controller. My case has built-in PWM headers and the software works perfectly fine in conjunction with it, so if there is a simple solution like using a fan hub to sync all fans with a single PWM header, then I am going to take that route instead of spending another couple hundred bucks on an Aquaero 6. If I ever decided to move everything out of this case into a larger case, then that Aquaero 6 would be at the top of the list of things to buy.

Edit: I actually bought in when it was $1500, and on top of that spent the money to waterblock it. That is kind of the point. I have already put so much money into the system that at this point I don't want to pump more money than I already will have to into the loop expansion.


----------



## Mega Man

Last time I'll post this, but you DON'T have to buy one for several hundred; you can buy an Aquaero 5 LT for $80, or a PowerAdjust 3 for ~$50-60 at MOST.

http://www.performance-pcs.com/fan-control/aquacomputer-poweradjust-3-usb-ultra-series.html
currently OOS

http://www.performance-pcs.com/fan-control/aquacomputer-poweradjust-3-usb-standard-series.html

Unfortunately I don't know enough about them to know if you need the Ultra or the Standard series.

Either way, it's your money, and either way it was not directed at you; it was for the person I quoted, who WANTED to control the fans based on core temp.


----------



## Rx4speed

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah when I expand my loop I am going to have to swap out my PWM NF-F12's for the 3-pin version and I will be controlling all of the fans using one of these http://www.amazon.com/Phanteks-PWM-Fan-Controller-PH-PWHUB_01/dp/B00M0R05WE/ref=sr_1_1?s=electronics&ie=UTF8&qid=1425430754&sr=1-1&keywords=Phanteks+PWM+Fan+Hub+Controller connected to a PWM header.


LMAO, I've had one of those in my cart on Amazon for like a month







My only worry is, does my 2nd CPU MB header supply enough current to run both iPPC fans?
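For what it's worth, this is easy to sanity-check on paper. A minimal sketch, with assumed figures (Noctua lists the NF-F12 industrialPPC-2000 PWM at roughly 0.1 A, and most motherboard fan headers are rated around 1 A, but verify both against the fan's spec sheet and your motherboard manual):

```python
# Rough current-headroom check for running two fans off one motherboard
# header. The figures below are assumptions; verify the fans' rated
# current on Noctua's spec sheet and the header rating in your manual.
HEADER_RATING_A = 1.0   # typical motherboard fan-header rating (assumed)
FAN_CURRENT_A = 0.1     # NF-F12 industrialPPC-2000 PWM rated current (approx.)
NUM_FANS = 2

total_draw = NUM_FANS * FAN_CURRENT_A
print(f"{total_draw:.1f} A of {HEADER_RATING_A:.1f} A -> "
      f"{'OK' if total_draw <= HEADER_RATING_A else 'over budget'}")
# prints: 0.2 A of 1.0 A -> OK
```

If the total rated draw is well under the header rating, a Y-splitter on one header is fine; fans can also spike slightly above rated current at spin-up, so leave some margin.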


----------



## xer0h0ur

Well, the only reason I considered that unit is because it can be powered through a SATA power cable, so you're not drawing any power from the PWM header controlling the hub.


----------



## Alex132

Uh, guys. So the R9 295X2 is ~31cm long right?

I only have ~29cm of space in-between my case and reservoir









Any ideas what to do with this hunk of glass? (Other than buy a smaller res)....


----------



## wermad




----------



## washburn

Hi guys, good day. I have an R9 295x2 and would like to ask for opinions regarding push/pull exhaust. I bought this splitter http://www.amazon.com/Phanteks-3-Pin-Y-Splitter-Cable-PH-CB-Y3P/dp/B00M0R9BV0
and 2 of these http://www.amazon.com/gp/aw/d/B00KEST8PQ/ref=mp_s_a_1_fkmr2_2?qid=1425483905&sr=8-2-fkmr2&pi=AC_SX110_SY165_QL70&keywords=Noctua+industrial+3+pin
Is it safe to use the fan header of the graphics card? Thank you!


----------



## Feyris

Quote:


> Originally Posted by *washburn*
> 
> Hi guys, good day. I have an R9 295x2 and would like to ask for opinions regarding push/pull exhaust. I bought this splitter http://www.amazon.com/Phanteks-3-Pin-Y-Splitter-Cable-PH-CB-Y3P/dp/B00M0R9BV0
> and 2 of these http://www.amazon.com/gp/aw/d/B00KEST8PQ/ref=mp_s_a_1_fkmr2_2?qid=1425483905&sr=8-2-fkmr2&pi=AC_SX110_SY165_QL70&keywords=Noctua+industrial+3+pin
> Is it safe to use the fan header of the graphics card? Thank you!


At that point I would use a splitter on a motherboard header.


----------



## washburn

Quote:


> Originally Posted by *Feyris*
> 
> At that point i would use splitter on a header on motherboard.


Thanks for replying! What would you suggest to automatically control the fan speed? Thanks again.


----------



## Feyris

Quote:


> Originally Posted by *washburn*
> 
> Thanks for replying! What would you suggest to automatically control the fan speed? Thanks again.


You could get an internal fan controller to be extra cool, or just set up a fan curve in the BIOS.


----------



## Alex132

A fan curve in the BIOS will only go off CPU/mobo/other built-in temp sensors, as far as I know. You'll have to use SpeedFan (PWM-controlling software) and set up a fan curve there tied to GPU temp.
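The fan-curve idea is just interpolation between temperature/duty points. A rough Python sketch of what SpeedFan-style software does internally (the curve points are made up for illustration, not defaults from any tool):

```python
# Sketch of a fan curve: map a GPU temperature reading to a PWM duty
# cycle by linear interpolation between user-defined points.
# These points are illustrative only, not AMD or SpeedFan defaults.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]  # (temp in C, duty in %)

def fan_duty(temp_c, curve=CURVE):
    """Return fan duty (%) for a temperature, clamped to the curve ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the neighboring curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(35))    # -> 30 (below the curve, clamped to minimum)
print(fan_duty(67.5))  # -> 65.0 (halfway between the 60C and 75C points)
print(fan_duty(90))    # -> 100 (above the curve, clamped to maximum)
```

Software like SpeedFan or Afterburner effectively re-evaluates something like this every second or two against the live GPU temperature and writes the result to the PWM controller.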


----------



## malik22

Hey guys, I have a question: I would like to run two Corsair SP PWM fans on the 295x2 rad in push/pull. Can I remove the stock fan completely and disconnect its 3-pin from the GPU? Will this cause issues?


----------



## Alex132

Quote:


> Originally Posted by *malik22*
> 
> Hey Guys I have a question I would like to run two corsair sp pwm on the 295x2 rad in push pull can I remove the stock fan completely and disconnect its 3 pin from the gpu? will this cause issues?


Nope, it won't cause issues; otherwise watercooling wouldn't work nicely on any GPU.

(Also, Quiet Edition SP120s or normal? I have 2 Quiet SP120s lying around and I wonder if they can do the job in push/pull.)


----------



## malik22

Sorry for being stupid, but do you mean yes, I can remove it? And I have the Quiet editions.


----------



## Alex132

Quote:


> Originally Posted by *malik22*
> 
> sorry for being stupid but d you mean yes I can remov it? and I have the quiet editions


I have the Quiet editions too, let me know how they perform









And yes, you can remove it just fine. What I was saying is that on almost every card, to put a waterblock on it you remove the stock cooler (and fan) - and those cards work just fine. So, yeah, you can remove it just fine.


----------



## Sheyster

After very careful consideration I've decided to pass on the Newegg deal. Several reasons: I don't want to have to get a new PSU to OC the card, my case is not really suitable for two AIO coolers, and there's no G-Sync support.

I see a 980 Strix in the near future.







Once I upgrade my monitor (hopefully with an Acer Predator 34"), I'll add a second one.


----------



## Feyris

Quote:


> Originally Posted by *Sheyster*
> 
> After very careful consideration I've decided to pass on the Newegg deal. Several reasons: Don't want to have to get a new PS to OC the card, my case is not really suitable for two AIO coolers, and no G-Sync support.
> 
> I see a 980 Strix in the near future.
> 
> 
> 
> 
> 
> 
> 
> Once I upgrade my monitor (hopefully with an Acer Predator 34"), I'll add a second one.


Look up FreeSync; it's coming out very soon. You could do the Devil 13.


----------



## Sheyster

Quote:


> Originally Posted by *Feyris*
> 
> Lookup Free sync. Its coming out fast very soon. You could do the devil 13


I actually looked at the D13 card also; the price is right on that one as well. But it's a bit too toasty for my taste, and it has the same issue with the PSU.









I'm familiar with the AMD FreeSync initiative. I specifically want that Acer monitor (even if it turns out to be a TN panel, gasp!), mainly for the size of the display. Not sure if they'll actually produce a FreeSync version of that monitor or not, but G-Sync is for sure.


----------



## Feyris

Quote:


> Originally Posted by *Sheyster*
> 
> I actually looked at the D13 card also; the price is right on that one as well. A bit too toasty for my taste, and same issue with the PS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm familiar with the AMD Freesync initiative. I specifically want that Acer monitor (even if turns out to be a TN panel, gasp!), mainly for the size of the display. Not sure if they'll actually produce a Freesync version of that monitor or not, but G-Sync is for sure.


Displays with DisplayPort 1.2a or later (which added Adaptive-Sync) can support FreeSync.

Wait for the sexy IPS 4K FreeSync/G-Sync monitors.









A PSU upgrade isn't bad. I paid $100 for my 1350W; you really only need 850W-1kW for OC.

Then sell the old one here or on Amazon to offset the cost.

The 295X2 is much faster at higher res; the 980 bottlenecks itself on memory bandwidth.


----------



## xer0h0ur

I wonder if Nvidia is forever going to keep their cards locked out of any version of displayport that offers adaptive sync. I wouldn't be surprised.


----------



## Feyris

THE 295X2 DEVIL 13 IS NOW $580 AFTER REBATE!!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584

If you have/hate/don't need the mouse, that makes it even cheaper after you sell it~<3


----------



## wermad

FYI: keep the mouse in case you have to RMA with Newegg. If you lose or sell the mouse, Newegg will not help with the RMA (it has to go through the manufacturer), even within the return window.

Wow, jellies now, I should have waited. Edit: didn't click the link at first and found it's the Devil.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> FYI: keep the mouse in case you have to RMA with Newegg. If you lose or sell the mouse, Newegg will not help with the RMA (it has to go through the manufacturer), even within the return window.
> 
> Wow, jellies now, I should have waited


After the Egg RMA period, sell the Ouroboros for $120ish.

295 under $500 = profit??


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> FYI: keep the mouse in case you have to RMA with Newegg. If you lose or sell the mouse, Newegg will not help with the RMA (it has to go through the manufacturer), even within the return window.
> 
> Wow, jellies now I should have waited


no water block though :/


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> no water block though :/










Didn't see it was the Devil... never mind then.







The only AIB custom dual-GPU cards that maintain their value are the ROG Ares and Mars cards. Next is the AMD reference design.

So tired and a ways to go...


----------



## Mega Man

I like the single slot; I dislike the inputs on the ROG.


----------



## Feyris

Someone who has the shroud off, please tell me if there's any plastic besides the fan, and if it can easily be removed to leave the metal-only piece intact.

I plan on powdercoating the metal pink for a soon-to-be pinkified build log.

Thanks <3. Mine are coming Saturday, but I must plan.


----------



## Mega Man

I must ask: why do you love pink so much?

Fav color?

Mother's fav color?


----------



## Feyris

Quote:


> Originally Posted by *Mega Man*
> 
> i must ask why you love pink so much ?
> 
> fav color ?
> 
> mothers fav color ?


It matches my room, mostly. And it's a cute color to me? Basically, asking why a girl likes pink is like asking why a guy likes blue xP


----------



## Mega Man

It's fine; as I said, I just wondered. I started to notice a pattern of pink.

I do think it would be awesome, but I don't think it (the powder coating) will stick well to the surface/material. I could be wrong, though.


----------



## wermad

New rig














Monsta rad in close proximity to the Koolance blocks.


----------



## xer0h0ur

Dang that is cutting it close. Those Monsta radiators certainly do take up a lot of real estate.


----------



## remnant

Quote:


> Originally Posted by *wermad*
> 
> New rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Monsta in close proximity of the Koolance blocks.


----------



## Roaches

Not sure if I belong here, since it's not really a 295X2... or maybe it is. This is probably the heaviest card I have ever held. It feels so special I'm even afraid to hold it with one hand. I won't be able to install it until Saturday morning due to my work schedule.

Took some comparison photos of it and its predecessor. Them quad 8-pin power inputs are giving me the heebie-jeebies. The backplate really makes it stand out, even though I don't like the SMD components being exposed, contradicting the typical graphics card backplate design intention of protecting the PCB. And nope, it doesn't fit in my FT02 case, as it's just as long as the 7990 version.

Lighting is kinda dim in my room; sorry if my photos don't seem to show much.

It weighs about 5 pounds, much heavier than the 7990 version and the 680 SOC, which I'm about to retire.

Can't wait to test this beast of a card. If all goes well I might get a second one to Quadfire.

#AMDHousefires420BlazeIt.


----------



## wermad

Quote:


> Originally Posted by *remnant*


Thank you









a few more shots:




Quote:


> Originally Posted by *Roaches*
> 
> Not sure if I belong here since its really kinda not a 295X2 though or maybe. This is probably the most heaviest card I ever held on to. It feels so special I'm even afraid of hold it with one hand. Won't be able to install it until Saturday morning due to work schedule.
> 
> Took some comparison photos of it and its predecessor. Them quad 8 pin power input is giving me the heebie jeebies. Backplate really makes its stand out, even though I don't like the SMD components exposed, contradicting the typical graphics card backplate design intentions of protecting the PCB. And nope, it doesn't fit in my FT02 case as its just as long as the 7990 version.
> 
> Lighting is kinda dim in my room, sorry if my photos don't seem to show much.
> 
> *snip*
> 
> Weighs about 5 pounds, much more heavier than the 7990 version and 680 SOC which I'm about to retire.
> 
> Can't wait to test this beast of a card. If all goes well might get a second to Quadfire.
> 
> #AMDHousefires420BlazeIt.


Devil 290Xx2? welcome! Sweet looking devilish (







) beasty there


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> Thank you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> a few more shots:
> Devil 290Xx2? welcome! Sweet looking devilish (
> 
> 
> 
> 
> 
> 
> 
> ) beasty there


Thanks!







I'm actually more worried that it's going to rip the PCI-E slot out of my motherboard. Hopefully it doesn't put too much stress on the slot when mounted 90 degrees.


----------



## washburn

Quote:


> Originally Posted by *Alex132*
> 
> Fan curve in BIOS will only go off of CPU/mobo/other built in temps sensors as far as I know. you'll have to use speedfan (pwm controlling software) and set up a fan curve in there relating to GPU temp


Hi, thanks for the reply. Could you guys help me out with SpeedFan? I can't control the fans with it; it doesn't seem to affect any of my fans. On the other hand, I could set it in the BIOS, but the BIOS only monitors CPU and system temps, not GPU temps. My mobo is an MSI Z97 Gaming 5. Thanks!

Migrated to new home...


----------



## Roaches

Have you tried MSI Afterburner? It has a fan curve control option which should work. SpeedFan is ancient; I haven't used it in years.


----------



## p4inkill3r

Quote:


> Originally Posted by *Roaches*
> 
> #AMDHousefires420BlazeIt.


Totally rad. I want to see some benches on that.


----------



## wermad

Quote:


> Originally Posted by *Roaches*
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> I'm actually more worried if its going to rip out my PCI-E slots of the motherboard. Hopefully it doesn't put too much stress when mounted 90 degrees.


I had this concern with my pair blocked. I ended up switching to a horizontal layout, the TT Core X9. The case is pretty decent for the price and it can hold three 480/420 rads. Lots of potential if you go custom water.


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> I had this concern w/ my pair blocked. I ended up switching to a horizon (aka horizontal) layout, TT X9. Case is pretty decent for the price and it can hold three 480/420 rads. Lots of potential if you go custom water.


Yeah! Thermaltake did a fantastic job; it's probably their best enthusiast-oriented case in years. I'm fully aware that going under water is a must to get the full potential out of these heaters, though I've yet to see how hot the Devil 13 gets on air, given my case is designed to maximize airflow. I've got 2 AP182s running at full blast in it.


----------



## washburn

Quote:


> Originally Posted by *Roaches*
> 
> Have you tried MSI Afterburner. It has a fan curve control option which should work. Speedfan is ancient, haven't used it in years.


Feyris suggested connecting it to a motherboard fan header. I'm doing push/pull with Noctua industrial NF-F12 3-pin fans on a splitter.


----------



## Mega Man

There is no full-cover block for it :/


----------



## xer0h0ur

I do not suggest leaving a 5 pound video card unsupported in a PCI-E slot if you're letting gravity do its job.


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> I do not suggest leaving a 5 pound video card unsupported in a PCI-E slot if you're letting gravity do its job.


Thumbscrews for life bro.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> I do not suggest leaving a 5 pound video card unsupported in a PCI-E slot if you're letting gravity do its job.


it is fine unless you are actively moving the rig


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> there is no full cover for it :/


Have NatemanDoo make one







. If not...

Uni-blocks to the rescue!


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> its fine, as i said i just wondered, i started to notice a pattern of pink
> 
> i do think it would be awesome, but i dont think it ( powder coating ) will stick well to the surface/material, but i could be wrong


You'd be surprised how good her rig looks with pink so far. Mayhems X1 pink looks really good.
I'm excited to see the end product, at least


----------



## Feyris

Quote:


> Originally Posted by *Mega Man*
> 
> its fine, as i said i just wondered, i started to notice a pattern of pink
> 
> i do think it would be awesome, but i dont think it ( powder coating ) will stick well to the surface/material, but i could be wrong


It can bond to any surface that can withstand a 450°F oven. The metal gets sandblasted back down to bare, then coated.

That's why it's critical: any plastic you cannot detach from the shroud would melt.


----------



## Alex132

Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...

I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...
> 
> I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.


Here ya go: http://www.overclock.net/t/1540259/r9-295x2-rad-fan-shootout/0_50

The SP120's do quite well; I'm running a couple of Noctua iPPC NF-F12's on mine, and the Cougars should perform well too.


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...
> 
> I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.
> 
> 
> 
> Here ya go: http://www.overclock.net/t/1540259/r9-295x2-rad-fan-shootout/0_50
> 
> The SP120's do quite well, i'm running a couple of Noctua iPPC NF-F12's on mine and the Cougar's should perform well too

Thanks









Oh, also is there a way to disable the red LED?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...
> 
> I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.
> 
> 
> 
> Here ya go: http://www.overclock.net/t/1540259/r9-295x2-rad-fan-shootout/0_50
> 
> The SP120's do quite well, i'm running a couple of Noctua iPPC NF-F12's on mine and the Cougar's should perform well too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, also is there a way to disable the red LED?

Yes!

Under the shroud you can unplug the LED's power cable; taking off the shroud is quite easy and will not void your warranty.


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...
> 
> I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.
> 
> 
> 
> Here ya go: http://www.overclock.net/t/1540259/r9-295x2-rad-fan-shootout/0_50
> 
> The SP120's do quite well, i'm running a couple of Noctua iPPC NF-F12's on mine and the Cougar's should perform well too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, also is there a way to disable the red LED?
> 
> 
> Yes!
> 
> Under the shroud you can unplug the LED's power cable, taking off the shroud is quite easy and will not void your warranty

The LED has its own power cable? +1 AMD.
It really annoys me having to install the bloatware known as GeForce Experience JUST to turn off the LED on my GTX 690.

Coming from Nvidia's "best" dual-GPU card to AMD's best, I really am going to be interested to see the difference








One thing I love that almost no reviews seem to mention is the customizability of the cooling - you can simply swap the fans out for different ones.

I heard/read that the 295X2 has a thermal throttle of 75°C, whereas the 290X has a higher throttle (85°C?). Seeing as the R9 295X2 should have binned 290X cores, surely the thermal throttle is just for the AIO cooler / heat / power draw. Is it safe going beyond it (with a BIOS flash on the one BIOS to disable it)?

Purely for benchmarking; I will be running this thing stock 24/7 otherwise


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> The LED has its own power cable? +1 AMD.
> 
> I heard/read that the 295X2 has a thermal throttle of 75'c - whereas the 290Xs have a higher throttle (85'c?). Seeing as how the R9 295X2 should have binned 290X cores - surely the thermal throttle is just for the AIO cooler / heat / power draw. Is it safe going beyond it (with a BIOS flash on the one BIOS to disable it)?
> 
> Purely for benchmarking, I will be running this thing stock 24/7 otherwise

Here is the cable:



And yeah, the 295X2 throttles at 75°C due to the amount of heat the pump can withstand AFAIK; the 290X has a throttle point of 95°C.

I have the Sapphire OC BIOS flashed onto my card and it runs quite well. I haven't tried any hard benching, but with the stock BIOS I managed 1150/1625 with +100mV in Afterburner.

With the Sapphire BIOS I can go up to +300mV in Sapphire Trixx... waiting for cooler weather though.
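For reference, 1150/1625 is a sizeable bump over stock. A quick sketch of the percentage overclock, assuming the reference 295X2 clocks of 1018MHz core / 1250MHz memory (those reference values are my assumption here - worth double-checking against your own card's stock BIOS):

```python
# Percentage overclock over the (assumed) reference R9 295X2 clocks.
# OC values (1150/1625) are from the post; reference clocks are assumed.

REF_CORE_MHZ = 1018   # assumed reference boost clock
REF_MEM_MHZ = 1250    # assumed reference memory clock

def oc_percent(ref_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a percentage above the reference clock."""
    return (oc_mhz - ref_mhz) / ref_mhz * 100

print(f"Core:   +{oc_percent(REF_CORE_MHZ, 1150):.1f}%")  # ~ +13.0%
print(f"Memory: +{oc_percent(REF_MEM_MHZ, 1625):.1f}%")   # +30.0%
```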


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> here is the cable:
> 
> 
> 
> And yeah, the 295x2 throttles at 75c due to the amount of heat the pump can withstand afaik, the 290x has a throttle point of 95c.
> 
> i have the Sapphire OC Bios flashed onto my card and it runs quite well, haven't tried any hard benching but with the stock Bios i managed 1150/1625 with +100mV in Afterburner.
> 
> With the Sapphire Bios i can go up to +300mV in Sapphire Trixx.....waiting for cooler weather though


Thanks.

Well, surely water temp =/= GPU temp. I know it takes a good long while to heat up the water in my loop, so I'm sure it should be fine hitting ~90°C for short runs (i.e. benching).


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Well surely water temp =/= GPU temp. I know it takes a good long while to heat up my water in my loop. So I'm sure it should be fine hitting ~90'c for a minimal time usage (ie benching).

Should be, yeah. If you manage to get past 75°C without it throttling, let me know; I'd be interested myself.


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Should be yeah, if you manage to get past 75c without it throttling let me know, would be interested myself

Well, sadly it's gonna be a few weeks 'til I get my R9 295X2.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Well sadly it's gonna be a few weeks til I get my R9 295X2

Long wait....

btw, nice music, having a listen atm.


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> long wait....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw, nice music, having a listen atm


Thanks

Haven't really been making much lately, although if you look at my SoundCloud you can hear my 690's fan noise lol!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Haven't really been making much lately, although if you look at my soundcloud you can hear my 690's fan noise lol!

Hehe,

A friend used to have a 690... he hated it, ended up selling it and grabbing a couple of R9 290 Vapor-Xs instead. I was kinda jealous till I got my 295X2.


----------



## Sheyster

Quote:


> Originally Posted by *wermad*
> 
> New rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Monsta in close proximity of the Koolance blocks.


Looking good pal!









I used to have a huge watercooled mega rig also, but eventually sold it to some guy on CL. I keep it pretty simple now.


----------



## remnant

Quote:


> Originally Posted by *Alex132*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haven't really been making much lately, although if you look at my soundcloud you can hear my 690's fan noise lol!


Taking a listen myself. Any personal favorites/what are you most proud of?


----------



## Alex132

Quote:


> Originally Posted by *remnant*
> 
> Taking a listen myself. Any personal favorites/what are you most proud of?

Thanks!
Not really, to be honest; I do think my latest work (past ~6 months) has been some of my favourite though.

My favourites from the past 6 months or so would have to be: Arm1n Remix / See Ya Remix / "2" / Jaded Remix.

Sidenote: tried 4K on my GTX 690, couldn't even break 1000 points.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Sidenote; tried 4k on my GTX 690 couldn't even break 1000 points.

Eeek, looks like your VRAM capped out.


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Eeekk, looks like your vram capped out


Yeah, it says "minimum 3GB required"; surprised it let me run the benchmark.
Just doing these now so I have a comparison for when I get the R9 295X2.

Do you guys think my 2500K will be limiting the 295X2? It's at 4.9GHz now, but still kinda old.


----------



## remnant

Quote:


> Originally Posted by *Alex132*
> 
> Yeah it says "minimum 3GB required", surprised it let me run the benchmark.
> Just doing these now so I can get a comparison of when I get the R9 295X2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you guys think my 2500K will be limiting the 295X2? It's at 4.9GHz now, but still kinda old.


Wondering this myself. And to add: if it won't limit a single 295X2, what about dual?


----------



## wermad

Quote:


> Originally Posted by *Sheyster*
> 
> Looking good pal!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used to have a huge watercooled mega rig also, but eventually sold it to some guy on CL. I keep it pretty simple now.


Thanks! I tried to keep it simple, but when you consider the great prices I got my WC gear at, it just didn't make sense to downsize the gear for the same, if not more, money than my initial investment.

Here's some more pics:


----------



## rdr09

Quote:


> Originally Posted by *wermad*
> 
> Thanks! I tried to keep it simple but when you consider the great prices I got my wc gear, it just didn't make sense to reduce the gear for the same if not more money then my initial investment in this gear.
> 
> Here's some more pics:
> 
> 
> Spoiler: Warning: Spoiler!


Wow, must have taken a lot of planning. Very nice build.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Here's some more pics:
> 
> 
> Spoiler: Warning: Spoiler!


Is that a Monsta rad? It's huge...

edit - It is. Damn, and I thought my RX360 was thick.


----------



## F4ze0ne

Quote:


> Originally Posted by *Alex132*
> 
> Do you guys think my 2500K will be limiting the 295X2? It's at 4.9GHz now, but still kinda old.


No, it'll be fine. I ran mine on a 2500K and it performed the same as on the 5820K I upgraded to.


----------



## wermad

SB is still pretty good; the only downside is that it doesn't do PCIe 3.0.

My old 2700K went toe to toe with a hex-core SB-E and scored less than one frame behind it (3240x1920, @ 5.0GHz, tri 780s).

Quote:


> Originally Posted by *Alex132*
> 
> Is that a Monsta rad? It's huge...
> 
> edit - It is, damn and I thought my RX360 was thick


At 86mm thick, it is a monster. There are three 480s in the case. Total overkill, but might as well take up the challenge to fit them.


----------



## Sheyster

Quote:


> Originally Posted by *wermad*
> 
> Thanks! I tried to keep it simple but when you consider the great prices I got my wc gear, it just didn't make sense to reduce the gear for the same if not more money then my initial investment in this gear.
> Here's some more pics:


Those rads are just crazy huge!


----------



## remnant

Quote:


> Originally Posted by *Alex132*
> 
> My favourite for the past 6 months or so would have to be; Arm1n Remix / See Ya Remix / "2" / Jaded Remix


FYI, been jamming out to your music all day at work.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Thanks! I tried to keep it simple but when you consider the great prices I got my wc gear, it just didn't make sense to reduce the gear for the same if not more money then my initial investment in this gear.
> 
> Here's some more pics:

lol, you change cases and setups like other people change the oil on their autos.


----------



## Alex132

Quote:


> Originally Posted by *remnant*
> 
> fyi been jamming out to your music all day at work


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> lol, you change cases and setups like other people change the oil on their autos.


There are far worse offenders than me. As mentioned before, the new case was a must, or I'd run the risk of damaging the mobo and/or cards. No biggie tbh.


----------



## Alex132

I really like the X9 tbh.

Although it's like 80% perfect, I think a V2 or something could really improve it.


----------



## remnant

Quote:


> Originally Posted by *wermad*
> 
> There are far more worse offenders then me. As mentioned before, new case was a must or run the risk of damaging mb and/or cards. No biggie tbh


Were you concerned by the weight of the stock card, or only after adding the water blocks?


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> I really like the X9 tbh.
> 
> Although it's like 80% perfect, I think a V2 or something could really improve it.


It does have a bit of room for improvement to make it epic; the motherboard tray system is my biggest gripe.
Quote:


> Originally Posted by *remnant*
> 
> Were you concerned by the weight of the stock card? or after adding the water blocks?


With blocks. The blocks alone (sans hardware, port pieces, etc.) came in at 1kg. Once you start adding the PCB, backplate, hardware, and bridges (I'm using a Koolance triple bridge), and then multiply this by two cards, it's one heavy piggy. I decided to bail on my lovely 900D and go with the new TT X9.


----------



## Roaches

Managed to fit this thing in my case; it's so tall I sorta had to force my front panel in, since the cables are sticking out a bit. Cable management might become a problem with a second one.


GTX 680 retirement photo, showcasing the collection; can't get enough of muh triple-slot goodness.


I can almost feel the heat: it idles around 43 degrees Celsius, and ambient temp is at 80 degrees Fahrenheit.
It's pretty much ready for testing tomorrow; the Omega drivers installation was flawless.
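Since those two readings are in different units, here's a quick sketch putting the 80°F ambient alongside the 43°C idle temp (values taken from the post; the conversion is the standard formula):

```python
# Convert the quoted 80 °F ambient to Celsius and compute the idle delta.
# Temperature values are from the post above.

def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

ambient_c = f_to_c(80)  # ~26.7 °C
idle_c = 43             # idle GPU temp

print(f"Ambient: {ambient_c:.1f} C")                           # ~26.7 C
print(f"Idle delta over ambient: {idle_c - ambient_c:.1f} C")  # ~16.3 C
```

So the card idles roughly 16°C over ambient, which is reasonable for an AIO at idle.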


----------



## wermad

Mine were idling in the low 50s!









Placed the case on top of my desk next to a window and bam! temps dropped to low 30s


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> Mine were idling in the low 50s!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Placed the case on top of my desk next to a window and bam! temps dropped to low 30s


It's been slightly warm throughout this week around the LA County coastal area, though my system isn't far from the window.

I'm tempted to re-TIM the GPU with GC-Extreme, though it might risk the warranty.

I sorta have a few redundant backup GPUs I need to sell here, which were bought from other OCN members. The closet is almost full of old hardware.


----------



## wermad

Thought about selling them and getting a reference card or an Ares?


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> Thought about selling them and getting reference or an ares?


Thinking about it. I even have a CM mini-ITX case, a Lian Li test bench, and some old AMD mobos and CPUs that could make me some quick cash.

If it's something from Asus, it's the Mars II I'd love to get; there's nothing I like out of the Ares series other than it being the true forerunner of the 295X2's cooling system, and that briefcase. Just love collecting huge cards though.


----------



## wermad

Doesn't the Ares III come with an EK block fitted? I think the Ares II came with an AIO (dual Tahiti).

Grab some 295X2s and slap some blocks on them. Ppcs.com has the Koolance block for ~$100 (I got two).


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Doesn't ares 3 come with an EK block fitted??? I think #2 come with an aio (dual tahiti).
> 
> Grab some 295x2's and slap some blocks on them. Ppcs.com has the Koolance block for ~$100 (I got two).


Yes, the ARES III comes with a rebadged EK block.

Postpone the SAKURA project's paintwork, or get a Koolance block... UGH, CHOICES


----------



## wermad

Looks like a custom block from EK for a custom PCB. The block and PCB look phatty compared to the reference.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Looks like a custom block from ek for a custom pcb. Block and pcb look phatty compared to the reference.


Yeah, I think it's wider too...

someone fed it cake


----------



## wermad

Quote:


> Originally Posted by *Feyris*
> 
> YAH I think its wider too...
> 
> someone fed it cake










lol


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lol


Asus exec: what can we do to make the Ares 3 unique?

Rest of board: ph...phases... MOAR PHASES!!!

That is wow. Did they forget XDMA existed...?


----------



## Sheyster

Quote:


> Originally Posted by *Alex132*
> 
> I really like the X9 tbh.
> 
> Although it's like *80% perfect*, I think a V2 or something could really improve it.


Where I come from, 80% perfect is a B product.







The only video card I'm aware of that has gotten an A+ (100% perfect) rating recently is the ASUS 980 Strix.


----------



## Sheyster

Quote:


> Originally Posted by *Feyris*
> 
> Asus exec: what can we do to make ares 3 unique?
> 
> Rest of board: ph...phases....MOAR PHASES!!!
> 
> That is wow. Did they forget xdma exsisted.....


Quick 'n Dirty way to "justify" the insane price it carries.


----------



## Feyris

Quote:


> Originally Posted by *Sheyster*
> 
> Quick 'n Dirty way to "justify" the insane price it carries.


What's insane is the price of the e-peen editions of the 980s, which make no sense with the 295 so low :3


----------



## Sheyster

Quote:


> Originally Posted by *Feyris*
> 
> Whats insane is price of the epeen editions of 980s which make no sense with 295 so low :3


I seriously considered a 295 after seeing the XFX on sale at Newegg. In the end, the cons outweighed the pros (for me at least), plus we're just too close to the next gen cards release for me to justify buying anything new right now. FWIW, I hope the 390X kicks ass.


----------



## Feyris

Quote:


> Originally Posted by *Sheyster*
> 
> I seriously considered a 295 after seeing the XFX on sale at Newegg. In the end, the cons outweighed the pros (for me at least), plus we're just too close to the next gen cards release for me to justify buying anything new right now. FWIW, I hope the 390X kicks ass.


If it's anything like the ES I've seen, then it shall~


----------



## wermad

Quote:


> Originally Posted by *Sheyster*
> 
> Where I come from, 80% perfect is a B product.
> 
> 
> 
> 
> 
> 
> 
> The only video card I'm aware of that has gotten an A+ (100% perfect) rating recently is the ASUS 980 Strix.


X9 =


----------



## Alex132

Quote:


> Originally Posted by *Sheyster*
> 
> Where I come from, 80% perfect is a B product.
> 
> The only video card I'm aware of that has gotten an A+ (100% perfect) rating recently is the ASUS 980 Strix.

The most impressed I've ever been with a GPU was my 5870s. Too bad my old PSU killed one, and ESD probably killed another (yes, I actually killed a GPU with ESD... I think... it won't display ANY video out but spins up just fine, etc. It was working before I put it in storage on my shelf for a year or so).

Quote:


> Originally Posted by *wermad*
> 
> X9 =
> 
> 
> Spoiler: Warning: Spoiler!

I'd much rather have it than my 800D, but it's just missing some styling IMO. Also a dozen or so small things I would love changed on it.

I mean, if someone wanted to trade my 800D for that I wouldn't say no - but I wouldn't spend money on it right now.


----------



## wermad

wermad puts function over form (hence why he's had a couple of CLs in the past).

btw, I think some of you have confused what Alex said. He's referring to the Thermaltake X9 case (and not a GPU) as being "80%".


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> X9 =

I am seriously considering one (or two) of these for my next build over the 900D tbh... all that space.


----------



## remnant

Quote:


> Originally Posted by *Sheyster*
> 
> I seriously considered a 295 after seeing the XFX on sale at Newegg. In the end, the cons outweighed the pros (for me at least), plus we're just too close to the next gen cards release for me to justify buying anything new right now. FWIW, I hope the 390X kicks ass.


Rumors have popped up saying AMD won't be releasing a new card until Computex.


----------



## xer0h0ur

Quote:


> Originally Posted by *remnant*
> 
> Rumors have popped up saying AMD won't be releasing a new card until computex


The rumors also said GM200 wouldn't be presented until months later, and that got pushed up. In other words, rumors are just rumors.


----------



## Feyris

Quote:


> Originally Posted by *remnant*
> 
> Rumors have popped up saying AMD won't be releasing a new card until computex


Truth, actually. But they are being prepped for release right after; no month-plus delays.


----------



## Sheyster

Quote:


> Originally Posted by *wermad*
> 
> btw, i think some of you have confused what alex said. He's referring to the thermaltake X9 case (and not a gpu) as being "80%".


DUH! I did indeed miss that. I thought he meant that about the video card. Sorry Alex!


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I am seriously considering one (or two) of these for my next build over the 900D tbh......all that space


Not perfect but very functional


----------



## BradleyW

Hello,

Has anyone found a mGPU profile that works for MGS-V Ground Zeroes on the 295X2?

Thank you.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Not perfect but very functional

Well, I was planning on a couple of 480mm UT45s to cool everything (CPU, maybe motherboard, and at least 2 GPUs), so this seemed like a good way to go... add to that the fact it's half the price of the 900D here, and it's got me very interested.


----------



## Outlaw4lf

So I just OC'ed my 295 a bit; these were the results I got. I'm going to try for a bit more.









http://www.3dmark.com/3dm/6154132?


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well i was planning on a couple of 480mm UT45's to cool everything (CPU, maybe motherboard and at least 2 GPU's) so this seemed like a good way to go......add to the fact it's half the price of the 900D here and it's got me very interested


Should be great for that. Have you checked out the new BI radiators?


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Should be great for that. Have checked out the new BI radiators?

I have not as yet; I've got some time before this all goes in. Building it around Zen + Fiji or Bermuda.

Getting gear shipped to Aus is the hard part now... especially with Frozen going down.


----------



## fishingfanatic

Careful, OCing can become addictive... LOL









FF


----------



## Feyris

I will not be getting to play with 295X2 CF for a while now. *FedEx has lost my package*

...lost......


----------



## fishingfanatic

That is a sweet-looking build, sir!!! That thing is going to kick butt!









I LOVE the look of that build with those rads...

Another who can hardly wait to see some results!

FF









Oh man, a lost package NNNNOOOOOOOOO !!!!!

Hope it gets sorted quickly.









FF


----------



## wermad

Quote:


> Originally Posted by *Feyris*
> 
> I will not be getting to play with 295X2 CF for awhile now. *Fedex has lost my package*
> 
> ...lost......


That sucks. Was it coming from the north-east? Lots of weather delays from up there.


----------



## Roaches

Quote:


> Originally Posted by *Feyris*
> 
> I will not be getting to play with 295X2 CF for awhile now. *Fedex has lost my package*
> 
> ...lost......


That really sucks. I hope it wasn't some FedEx guy that opened the box and stole it while marking the package as lost/delayed.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> that sucks
> 
> 
> 
> 
> 
> 
> 
> . was it coming from the north-east? Lots of weather delays from up there.


It already came to Florida, which is when tracking went AWOL: NJ > VA > FL (gone).
Quote:


> Originally Posted by *Roaches*
> 
> That really sucks, I hope it wasn't some FedEx guy that opened the box and stole it while marking the package as lost/delayed.


I don't really know. It went from having a delivery date to:
"No estimated delivery date available at this time."

And even if it does come out of being AWOL, they don't deliver the next two days anyway.

It's not officially missing, but the city it left was two hours away from me. Unless there's some sort of timewarp going on, there should have been a scan sometime between 2pm yesterday and now between hubs, and usually they'd have delivered already; I've had items leave Yulee at 3am and make same-day delivery. So this is insane ;;

Which makes me think something very funky is going on.


----------



## wermad

Weird things can happen. Had a FedEx package go from SoCal to NorCal and back to SoCal, and eventually to the San Diego hub. As long as there's some tracking info, it should be on its way. More than likely you'll get an update tomorrow night or Monday morning (~5-6am).


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> weird things can happen. Had a fedex package go from socal to norcal and back to socal and eventually the san diego hub. as long as there's some tracking info, it should be on its way. More then likely, you'll get an update tomorrow night or monday morning (~5-6am).


Oh god, that reminds me of the time a company entered the wrong zipcode on an RMA-issued USPS label. It went from FL to New York (the wrong zipcode) and then made its way back to Texas, which took 2+ weeks for a 2-day label.

Luckily they realized this and agreed to send me a replacement even though they had not gotten it back yet.


----------



## Roaches

Quote:


> Originally Posted by *Feyris*
> 
> Its not officially missing but the city it left was two hours away from me. [...] Which makes me think something very funky is going on.


I once had a strange experience with FedEx and Newegg. Bought some fans and accessories that shipped from the Tennessee warehouse; it took over a week to reach my city of Long Beach. The next day it was in Los Angeles, and tracking went off the radar. Waited 2 weeks and persistently contacted Newegg CS, which in the end refunded my money after I threatened a bank chargeback.


----------



## Gabe324

Getting my R9 295X2 in 2 days; just bought it from Newegg for $599 and sold my GTX 980. I recently got a Lepa G1600-MA power supply as well; anyone tried that PSU? Also, my i7 4790K at 5GHz should be more than enough for this card, right? I'm gaming at 1440p at 120Hz.


----------



## wermad

Did you get a PowerColor Devil 13 (dual 290X) or a reference 295X2?

I'm using that PSU, and was recommended to use one rail for each 8-pin. I currently have rails 3 & 4 for card #1 and 5 & 6 for card #2. I took the extra cable off each harness on the Lepa to make it cleaner.


----------



## Mega Man

@Feyris

I found no plastic on the shroud that couldn't be removed; the Radeon LED is barely adhered on.

Which, if anyone wanted to know, runs at 12VDC (my multimeter read 11.61VDC).


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> @Feyris
> 
> i found no plastic on the shroud that couldnt be removed, the radeon led is barely adhesive on
> 
> which if anyone wanted to know is 12vdc ( my multimeter read 11.61vdc )


I think she's going for the Koolance block now









Out of interest, is there only one LED in the 295X2?


----------



## Feyris

Quote:


> Originally Posted by *Alex132*
> 
> I think she's going for the Koolance block now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Out of interest, is there only one LED in the 295X2?


I
CANT
DECIDE
IF
I
SHOULD


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> I
> CANT
> DECIDE
> IF
> I
> SHOULD

DOOOOOOOOOOOIIIIIIIIIIIIITTTTTTTTTTTTTTTTT


----------



## wermad

I'm sure the XSPC and EK blocks have, or can be made with, LED holes.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> I'm sure the xspc and the ek blocks have or can be made with led holes.


XSPC WB has led holes I'm sure. I meant in the actual stock cooler itself


----------



## wermad

mea culpa

Doesn't the Radeon name get its illumination from the fan?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> mea culpa
> 
> Doesn't the Radeon name get its illumination from the fan?


I'd tell you if mine wasn't lost in the post by Fedex....


----------



## wermad

mine are in storage, buried in my closet full of tools and spare pc parts/boxes.

edit: wait... I've heard the Radeon logo can glow sometimes, so there might be a separate set of LED(s) for it.


----------



## Feyris

Quote:


> Originally Posted by *Alex132*
> 
> DOOOOOOOOOOOIIIIIIIIIIIIITTTTTTTTTTTTTTTTT






I found this, I missed this HOW

ALSO, I thought RAZOR blocks weigh 100 tons more than, say, Koolance's, which is why I am wary.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> mea culpa
> 
> Doesn't the Radeon name get its illumination from the fan?


It's a separate LED


----------



## Alex132

They are. Just get Koolance blocks for cheap now?


----------



## Alex132

How to turn off fan LED then?


----------



## wermad

I would


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> They are. Just get Koolance blocks for cheap now?


Might as well, i mean $100 for the block is quite nice
Quote:


> Originally Posted by *Alex132*
> 
> How to turn off fan LED then?


Unplug the cable


----------



## Roaches

Heres my 3D Mark FireStrike scores on the Devil 13:

http://www.3dmark.com/fs/4262010 FireStrike

http://www.3dmark.com/fs/4262067 FireStrike Extreme

http://www.3dmark.com/fs/4262334 FireStrike Ultra

Not sure what's up with 3DMark and my DRAM clocks, though; it reads 1866MHz in the BIOS


----------



## wermad

$100 a piece....dooooooooooooooooooooo..........iiiiiiiiiiiiiiiiiiiitttttttttttttttttttttttttttttttttttttttttt!


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> $100 a piece....dooooooooooooooooooooo..........iiiiiiiiiiiiiiiiiiiitttttttttttttttttttttttttttttttttttttttttt!


Stop now I want one ;-;

I cannot possibly cool it on just an RX360 with my CPU if my CPU hits the high 60s already


----------



## wermad

The rule of thumb is one 120mm rad for each core (CPU & GPU). A thick rad like the RX should handle your loop unless you plan to cool crossfired Vesuvius.

My old 2600K and 2700K would hit the low to mid sixties in large loops (several thick rads) @ 4.8-5.0. The RX is still a better leap than what an AIO will provide you, especially a single-rad AIO working overtime on the 295x2.

You can always mod your 800D with two RX480s











Got me into motm a few years ago but came in fifth or sixth. You can always do the 240 mod below.


----------



## Alex132

I'd do the mods firstly if I could do a money-mod on life


----------



## Feyris

Ordered the koolance. goodbye $130







damn you tax


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> Ordered the koolance. goodbye $130
> 
> 
> 
> 
> 
> 
> 
> damn you tax


Yushhhhhhhhhhhh


----------



## wermad

At least you don't pay cali tax on a 295x2


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> At least you don't pay cali tax on a 295x2


PPCS tax + shipping for a 70-mile trip basically IS Cali tax









i.e. should I worry about the slot sagging? I really wish I had a PowerJack from the Devil series


----------



## wermad

You should be able to buy the PC support bracket nah?

Or go horizon case!


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> You should be able to buy the PC support bracket nah?
> 
> Or go horizon case!


PowerColor stopped selling them a long, long time ago, and I can't find any cheap ones that are decent for support


----------



## wermad

Even the HAF-X vga bracket is no longer listed on CM's store









edit: nm:

http://www.cmstore-usa.com/haf-x-accessories-kit-oem-package/

http://www.cmstore-usa.com/cm-690-ii-accessory-kit-oem-package/



Well, go custom or diy!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roaches*
> 
> Heres my 3D Mark FireStrike scores on the Devil 13:
> 
> http://www.3dmark.com/fs/4262010 FireStrike
> 
> http://www.3dmark.com/fs/4262067 FireStrike Extreme
> 
> http://www.3dmark.com/fs/4262334 FireStrike Ultra
> 
> Not sure what up with 3D Mark on about my DRAM clocks, though its reads 1866mhz on the BIOS


3DMark rarely reads RAM clocks right..... I can run it at 2400MHz and it will tell me 667MHz









nice scores though, what temps do you get with that beast under load?


----------



## Roaches

Quote:


> Originally Posted by *Feyris*
> 
> PPCS tax + Shipping for a 70 mile trip basically IS cali tax
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ie should I worry about these sagging slot? I really wish I had a powerjack from devil series


I have 2 Powerjacks that I'll probably never use and a third one coming from the second card once it arrives. I can ship it to you if you want it at the cost of shipping.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3DMark rarely reads Ram clocks right.....i can run it at 2400Mhz and it will tell me 667Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nice scores though, what temps do you get with that beast under load?




It can go as high as the 70s on both cores with 100% fan speed, but in long gaming sessions I've seen it edge around 80-85 degrees Celsius on the hottest core. Somehow the first core is about 10 degrees cooler than the second.


----------



## Feyris

Quote:


> Originally Posted by *Roaches*
> 
> I have 2 Powerjacks that I'll probably never use and a third one coming from the second card once it arrives. I can ship it to you if you want it at the cost of shipping.
> 
> 
> It can go as high around 70s on both cores with 100% fan speed, but in long gaming sessions I've seen it edge around 80-85 degrees celsius on the hottest core. Somehow the first core is about 10 degrees cooler than the second.


Quote:


> Originally Posted by *wermad*
> 
> Even the HAF-X vga bracket is no longer listed on CM's store
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, go custom or diy!


Thank you, sag has always been an issue even w/ the 7990.
Omg, I'd love u forever, sure!!!!


----------



## wermad

Yup, had it with 4870x2s, 590s, and 690s, though the Keplers ended up in a huge custom MM:


Spoiler: Warning: Spoiler!


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> mea culpa
> 
> Doesn't the Radeon name get its illumination from the fan?


No and it is easily removable
Quote:


> Originally Posted by *Feyris*
> 
> Ordered the koolance. goodbye $130
> 
> 
> 
> 
> 
> 
> 
> damn you tax


Should've got the EKs. I just blocked mine tonight...

God, it has been a horrible weekend


----------



## wermad

Koolance was a breeze to install







Removing the stock cooler was a major pita.


----------



## Feyris

Quote:


> Originally Posted by *Mega Man*
> 
> No and it is easily removable
> Should a got the eks. I just blocked mine tonight. ..
> 
> God it had been a horrible weekend


My sugardaddy isnt sugary enough for that. Infact he owes me lol


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> My sugardaddy isnt sugary enough for that. Infact he owes me lol




Paddle = Feyris' statement
Blonde girl = me
Red/pink-haired girl = Feyris


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Koolance was a breeze to install
> 
> 
> 
> 
> 
> 
> 
> Removing the stock cooler was a major pita.


How are temps (GPU and VRM) with that Koolance block? Also, what about a backplate, or is the stock one used?


----------



## wermad

I haven't checked VRM temps, but the cores idle @30c. I'll check the vrm's tonight.

I'm using the stock xfx backplates.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> I haven't checked VRM temps, but the cores idle @30c. I'll check the vrm's tonight.
> 
> I'm using the stock xfx backplates.


I'd appreciate it. I'm still awaiting my caselabs case, so still no PC, but if temps are good I'd like to jump on the sale price for this block.


----------



## Mega Man

The blocks were so easy.

Last week I installed my 4x290xs and my Rivbe top pcie died.

Bought a new board. Was supposed to be here fri.

( it was guaranteed shipping by Fri ) they decided to deliver mon. I had to go pick it up (3 hours of my life )

I found it was the board and not the chip.

Installed quadfire.

While verifying the fittings were snug, I found one of my blocks stripped.

(Ty to Swiftech for being awesome and taking care of me)

Just been one thing after another. .. continuing today. ..

The EK blocks look good... my only complaint is I wish they had black I/O plates


----------



## Roaches

Anyone getting black screens (monitors not picking up a signal) on wake from sleep? It's been happening so often that I'm forced to do a system reset every time. I've tried removing the DVI/DP cords and re-plugging them into the GPU, still no display. Other than that it's been solid when gaming.

I usually just shut down my system from now on.


----------



## Ragingun

I'm returning my 2-day-old 295. The thing randomly restarts my PC in the middle of gaming. I was happy as a peach with my last AMD purchase, the 7950, but not so sure about this one. Fairly disappointed, to say the least.


----------



## Roaches

Quote:


> Originally Posted by *Ragingun*
> 
> I'm returning my 2 day old 295. The thing randomly restarts my PC in the middle of gaming. I was happy as a peach with my last AMD purchase, the 7950 but not so sure now about this purchase. Fairly disappointed to say the least.


Sounds to me like a power supply that's been passing out on you. Post your system specs.


----------



## steezebe

Quote:


> Originally Posted by *wermad*
> 
> The rule of thumb is one 120mm rad for each core (cpu & gpu). A thick rad like the rx should handle your loop


I must inquire as to where this rule of thumb came from?


----------



## wermad

Quote:


> Originally Posted by *steezebe*
> 
> I must inquire as to where this rule of thumb came from?


Wc thread/club. Six years of hanging out there and about 20 water cooled builds under my belt









Sorry guys, no benchies







I'm running into the dreaded Windows Update failures on a fresh install of Win7 (3rd reformat). I've gone through it extensively and I'm about to throw in the towel and pick up Win8.1.


----------



## tsm106

Quote:


> Originally Posted by *steezebe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> The rule of thumb is one 120mm rad for each core (cpu & gpu). A thick rad like the rx should handle your loop
> 
> 
> 
> I must inquire as to where this rule of thumb came from?
Click to expand...

It came about a long freaking time ago, although imo it is not really applicable anymore because we have some pretty high-TDP devices, especially with overclocking. If you're at stock you can feasibly stick to that rule of thumb plus some. If you overclock, double or triple it. If you don't like NOISE, double or triple it.

I use a different method for my sanity and overclocking, and that is to go by power draw: for each watt of power draw at maximum, allow 1mm of radiator length. This gives you silent cooling at around 1k rpm fan speed without melting your rig into the ground, and plenty of reserve to handle some serious benching.

If I had stuck to the 120mm-per-block rule, lmao, I would be limited to 6 blocks + 120mm = 840mm. Haha, I would have to facepalm all the time if I ran that little rad surface. In reality I run 3120mm of rad when benching, but for 24/7 I shut down the two GTX 480s, leaving 2160mm. You're probably thinking he's freaking nuts lol. But look below and think: how would you cool just two GPUs and a CPU clocked very high with only 120mm per block + 120mm?



http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33400_40#post_23296223
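For the curious, the watt-based sizing described above boils down to a quick back-of-the-envelope calculation. This is a rough sketch of the heuristic only; the component wattages are illustrative guesses, not measurements:

```python
# Rough radiator-sizing sketch based on the rule of thumb discussed above:
# ~1 mm of (120mm-wide) radiator length per watt of heat at quiet (~1000 rpm)
# fan speeds. The wattages below are illustrative guesses, not measured figures.

def rad_length_mm(component_watts, mm_per_watt=1.0):
    """Return suggested total radiator length in mm (120mm-fan units)."""
    return sum(component_watts) * mm_per_watt

loop = {"cpu": 150, "gpu_core_1": 250, "gpu_core_2": 250}  # e.g. one 295X2 + CPU
total_mm = rad_length_mm(loop.values())
print(total_mm)                   # 650.0 -> e.g. a 480mm plus a 240mm radiator
print(-(-int(total_mm) // 120))   # 120mm sections needed, rounded up -> 6
```

Compare that 6 sections against the older "one 120 per block plus one spare" rule, which would suggest only 4 for the same loop; the gap is exactly the noise/overclocking headroom being argued about here.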


----------



## wermad

Quote:


> Originally Posted by *wermad*
> 
> Wc thread/club. Six years of hanging out there and about 20 water cooled builds under my belt


Just to clarify, I didn't come up with this rule. It's been passed along by experienced watercoolers.

Think of this as a starting point; a minimum recommended if you will.

There are lots of variables that can come into play in each build that may require more. If you're unsure, ask (best in the WC thread) and/or find a similar build/setup to use as a reference (ask the owner for help as well).


----------



## Mega Man

yep 120 + 1 rad ( 120 ) for your cpu and gpus, basic knowledge


----------



## Feyris

Rule of thumb is a liar. A waterblocked 7990 with two 120mm rads on its own loop (Thermaltake Bigwater Pro) with four fans still got to 75c under load








thats what made me go full custom loop in first place!


----------



## tsm106

Quote:


> Originally Posted by *Feyris*
> 
> Rule of thumb is a liar. Waterblocked 7990 with 2 120MM rads on its own loop (thermaltake bigwater pro) with four fans still got to 75c load
> 
> 
> 
> 
> 
> 
> 
> thats what made me go full custom loop in first place!


Exactly. If I ran rule of thumb, there's no way I could clock my Lightnings to 1300/1700 times four, ahha crazy. At 840mm I could barely run over stock lol.

Btw, if you asked Martin, he'd tell you he runs 360mm per block.


----------



## Mega Man

besides that , you didnt follow the rule of thumb, you needed 3x120

( 120 ) +(120x number of cpu/gpus )


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> besides that , you didnt follow the rule of thumb, you needed 3x120
> 
> ( 120 ) +(120x number of cpu/gpus )


^^That was confusing.

Quote:


> Rule of thumb is to use at least one 120mm radiator (section) per each water cooled component plus one 'spare'.
> 
> For example, if one is liquid cooling a *CPU and a single high-performance graphics card* it is recommended at least one 240mm (2x 120mm) radiator for good performance. *Ideally, one would get a 360mm (3x 120mm) radiator for best performance*. Motherboard- and memory water blocks usually have lower power output therefore they are not included in this equation.


http://www.ekwb.com/support/index.php?act=article&code=view&id=27


----------



## Feyris

Quote:


> Originally Posted by *tsm106*
> 
> ^^That was confusing.
> http://www.ekwb.com/support/index.php?act=article&code=view&id=27


Same goes for Swiftech... they recommended a 360mm rad on top of the H220X with the 7990 (and a 3770K at the time) for my loop, based on "what they experienced internally". You could use less rad space with Nvidia cards, I suppose; the 500W+ draw is the issue.


----------



## tsm106

That seems more realistic compared to the rubbish ek printed on their site.


----------



## Feyris

Quote:


> Originally Posted by *tsm106*
> 
> That seems more realistic compared to the rubbish ek printed on their site.


Honestly, I thought they were joking at first.


----------



## wermad

TT Big Water kits are rubbish. Might as well hack up an AIO.

TSM is pushing for Mach10 on his gpu, his "needs" don't apply to 99% of the market

120mm usually implies a 120x120x35, and again, it's a starting point. Once you've done WC for a while, then start moving up.

I used a Big water kit, it sucked, leaked, and was just a crappy experience.

I've had to cool quad gtx 480s oc'd to 965mhz on an x79 setup, so I'm no stranger to large heat-generating setups.

You have to consider that this rule is a general guideline for everyone, not the 5-10% *we* fall into. In other words, it may not apply to all; there are exceptions, both cases and users.

Manufacturers will always embellish their recommendations. Didn't they recommend a 1200w PSU for a single GTX 690?

So, in conclusion: if you're new to WC, don't have record-breaking GPUs to cool, and aren't looking for every single last degree of cooling, *start* with this guideline.

Also, this rule/guideline is meant for custom WC, not AIOs or pieced-together AIOs. It's a great stepping stone, as custom WC gives you the freedom to upgrade your loop easily.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> TT big water kits are rubish. might as well hack up an aio.
> 
> manufacturers will always embellish both recommendations. Didn't they recommend a 1200w for a single GTX 690 (
> 
> 
> 
> 
> 
> 
> 
> ).


They...they..seriously...did... (brb on floor laughing) that?!


----------



## wermad

Lol, I know. I've fallen for such nonsense way back, but then learned the actual truth. It's all marketing gimmicks to get consumers to buy big!

So the 390X is coming out with an AIO. I'm hoping AMD learns from the 295x2 and bundles the 395x2 with a 240. Maybe PC will offer an EK-blocked version w/ full warranty. A 600w 395x2


----------



## electro2u

Ran a 295x2 + 290x + 4790k on a 360 + 280. It wasn't enough.


----------



## wermad

Check your air flow. I went from 50c idle on each core down to 30c just by moving my case from the floor under my desk to the top of my desk. Even three Monsta 480mm 86mm-thick rads can be choked by improper air flow.


----------



## electro2u

Yah, I've made some changes and it would probably go better now that I'm not completely clueless, but I gave up and went for 980s instead. I was cleaning my 295x2 block today and it spit at me twice.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Lol, I know. I've fallen for such nonsense in the way past but then learned the actual truth. Its all marketing gimmicks for consumers to buy big!
> 
> So the 390x is coming out with an aio. I'm hoping Amd learns from the 295x2 and bundles the 395x2 with a 240. Maybe PC will offer an EK blocked version w/ full warranty. 600w 395x2


I wish I could tell you guys what Ive witnessed of the ES







but yes, there will be more AIO WC Radeons through Project Hydra. As for a 240mm, probably not, but a double-thick rad or a P/P setup bundled in... perhaps


----------



## wermad

Quote:


> Originally Posted by *electro2u*
> 
> Yah, I've made some changes and it would probably go better now that I'm not completely clueless but I gave up and went for 980s instead. Was cleaning my 295x2 block today and it spit at me twice.


AMD has really been pushing power draw, so these cards do generate a lot of heat. Nvidia, besides GK and probably GM200, has always been thrifty with power and has had low heat output as well. My old tri 780s were a bit toastier vs the quad 690s I had previously.

Quote:


> Originally Posted by *Feyris*
> 
> I wish I could tell you guys what Ive witnessed of the ES
> 
> 
> 
> 
> 
> 
> 
> but yes there will be more AIO WC Radeons through project hydra, as for a 240mm probably not but a double-thick or P/P bundled in...perhaps


Kewl









Well, if it's like a 45 or a 55/60, that would at least be adequate, I guess. But thicker rads also fall under the law of diminishing returns (I know). If that's the case, it's gonna be a challenge, as the 120 on the 295x2 is not enough imho. 5k rpm fans + 120x60mm rad + 2x 300w cores....


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Amd has really been pushing for more power draw, so these guys do generate a lot of heat. Nvidia, besides GK and probably GM200, has always been thrifty with power and has had low heat outputs as well. My old tri 780s were a bit toastier vs the quad 690s I had previously.
> Kewl
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, if its like a 45 or a 55/60 that would at least do for adequate I guess. But thicker rads also fall under the law of diminishing-returns realm (I know
> 
> 
> 
> 
> 
> 
> 
> ). If that's the case, its gonna be a challenge as the 120 in the 295x2 is not enough imho. 5k rpm fans + 120x60mm rad + 2x 300w cores....


Part of the offset is using a more efficient AIO than Asetek can provide (fine for the 390X, not the 395 though), and a higher max temp limit too


----------



## wermad

If Asetek owns the patent, it's pretty much the same design. From reading the H100 reviews, bigger rads (and not necessarily thicker ones) are the way to go. I'm sure those who pony up $1500+ have a case with at least a 240mm mount, methinks


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> If asetek owns the patent, its pretty much the same design. from reading the h100 reviews, bigger rads (and not necessirly thicker) is the way to go. I'm sure those who pony up $1500+ do have a case with at least a 240mm mount me thanks


Better internals hopefully owo


----------



## Ragingun

Quote:


> Originally Posted by *Roaches*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> I'm returning my 2 day old 295. The thing randomly restarts my PC in the middle of gaming. I was happy as a peach with my last AMD purchase, the 7950 but not so sure now about this purchase. Fairly disappointed to say the least.
> 
> 
> 
> Sounds to me a power supply has been passing out on you. Post your system specs.
Click to expand...

X99 with a 5820K, an Enermax 1050W, and the 295x2. When my Enermax gets overdrawn or overheated it flashes a red LED when the shutdown occurs; it's not doing that, it's just going to the standby color. I've got a dedicated 30-amp rail going to the card.


----------



## electro2u

Quote:


> Originally Posted by *Ragingun*
> 
> X 99 with 5820k, Enermax 1050w and the 295x2. When my Enermax gets over drawn or over heated it will flash a red led when shut down occurs, it's not. It's just going to the stand by color. I've got a dedicated rail of 30 amps going to the card.


It needs 50 amps.
You have to use two separate 30-amp rails, one for each 8-pin connector on a 295x2. My Seasonic X1250 did the same when I was using one 30-amp rail on my 295x2: shutdown, no warning.


----------



## wermad

Yup, my XFX came with a warning card. For multi-rail PSUs, you need at least 25-28 amps per 8-pin connector. I'm using four 30-amp rails on my six-rail Lepa.
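As a rough sanity check on those numbers: assuming the ~500W single-card draw mentioned elsewhere in this thread and the 75W a PCIe slot can supply, the per-connector math sketches out like this (an illustrative estimate, not a spec):

```python
# Rough sketch of the 12V amperage a 295X2 pulls through its two 8-pins.
# The ~500W board-power figure is an assumption from reviews, not a spec value;
# AMD's per-connector recommendation quoted above adds headroom on top of this.

BOARD_POWER_W = 500   # approximate full-load draw of one 295X2
PCIE_SLOT_W = 75      # max the PCIe slot itself can supply
V12 = 12.0

cable_watts = BOARD_POWER_W - PCIE_SLOT_W   # power carried by the two 8-pins
amps_total = cable_watts / V12              # combined 12V amps over the cables
amps_per_8pin = amps_total / 2              # split across the two connectors

print(round(amps_total, 1))     # ~35.4 A combined at stock
print(round(amps_per_8pin, 1))  # ~17.7 A per 8-pin at stock
```

Stock draw alone already overwhelms a single ~30A rail once transients and overclocking headroom are counted, which is why the guidance is one beefy rail per 8-pin (or ~50A combined on a single-rail unit).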


----------



## malik22

hey guys, I'm forced to run my 295x2 in the 4th slot of my Asus X99 board, so I have 3.0 at 8x. Will this affect performance on the card?


----------



## wermad

Nope; Hard did a review of 295x2 crossfire on an MVE with each card running 8x 3.0.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Yup, my xfx came with a warning card. For multi rail psu's, you need at least 25amps per 8-pin connector. I'm using four 30 amp rails on my six rail Lepa.


does that mean plugging one cable into every 2 plugs?

On thermaltake 1350 tp i plugged the plugs for tmr on two closest gpu slots on top and bottom....


----------



## Ragingun

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> X 99 with 5820k, Enermax 1050w and the 295x2. When my Enermax gets over drawn or over heated it will flash a red led when shut down occurs, it's not. It's just going to the stand by color. I've got a dedicated rail of 30 amps going to the card.
> 
> 
> 
> It needs 50 amps.
> You have to use two separate 30amp rails for each 8 pin connector on a 295x2. My seasonic x1250 does the same when I was using one 30amp rail on my 295x2. Shutdown no warning.
Click to expand...

Why is AMD recommending 28 amps in that case?


----------



## wermad

Lepa G1600:



yellow: 295x2 #2
blue: 295x2 #1

Mine is a six-rail system with 20 amps on the first two rails (mainly for the mb/cpu) and four 30-amp rails (VGA and peripherals). Each VGA harness has two 6+2s. I removed one cable from each harness to make it look cleaner.



Quote:


> Originally Posted by *Feyris*
> 
> does that mean plugging one cable into every 2 plugs?
> 
> On thermaltake 1350 tp i plugged the plugs for tmr on two closest gpu slots on top and bottom....


Should be fine as your dual rail has the first rail (60amps) set for all vga connections:
Quote:


> 12V Rails Distribution
> 24 PIN Main Connector 12V2
> 4+4PIN +12V CPU Connector 12V2
> 8 PIN +12V CPU Connector 12V2
> Peripheral & Floppy Connector 12V2
> S-ATA Connector 12V2
> 
> 6pin Modular PCI-E Connector
> 
> 12V1
> 
> 6pin Modular PCI-E Connector
> 
> 12V1
> 
> 6pin Modular PCI-E Connector
> 
> 12V1
> 6+2pin Modular PCI-E Connector 12V1
> 6+2pin Modular PCI-E Connector 12V1
> 6+2pin Modular PCI-E Connector 12V1


http://www.thermaltakeusa.com/Power_Supply/Toughpower_Series_/Toughpower/C_00001660/Toughpower_1350W/design.htm


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Yup, my xfx came with a warning card. For multi rail psu's, you need at least 25amps per 8-pin connector. I'm using four 30 amp rails on my six rail Lepa.
> 
> 
> 
> does that mean plugging one cable into every 2 plugs?
> 
> On thermaltake 1350 tp i plugged the plugs for tmr on two closest gpu slots on top and bottom....
Click to expand...

With the TT 1350w you have 2 12v rails each capable of 750w. 12v1 and 12v2 respectively.
Problem is that the 12v1 runs everything BUT the PCIE power, the 12v2 runs the PCIE power entirely.

I mean, seeing as how it's just a temporary setup for benchmarks and stuff it's not the end of the world, but I'd be very cautious about running 2 off of a 750w rail.

















http://www.hardocp.com/article/2011/08/15/thermaltake_toughpower_1350w_power_supply_review/2#.VP64HeH3TyQ

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Lepa G1600:
> 
> 
> 
> yellow: 295x2 #2
> blue: 295x2 #1
> 
> Mine is a six rail system with 20 amps for the first two rails (mainly for the mb/cpu) and four 30 amp rails (vga and peripherals). Each vga harness has two 6+2. I removed one cable from each harness to make it loo cleaner.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Feyris*
> 
> does that mean plugging one cable into every 2 plugs?
> 
> On thermaltake 1350 tp i plugged the plugs for tmr on two closest gpu slots on top and bottom....
> 
> 
> 
> Should be fine as your dual rail has the second rail (60amps) set for all vga connections:
> Quote:
> 
> 
> 
> 12V Rails Distribution
> 24 PIN Main Connector 12V2
> 4+4PIN +12V CPU Connector 12V2
> 8 PIN +12V CPU Connector 12V2
> Peripheral & Floppy Connector 12V2
> S-ATA Connector 12V2
> 
> 6pin Modular PCI-E Connector
> 
> 12V1
> 
> 6pin Modular PCI-E Connector
> 
> 12V1
> 
> 6pin Modular PCI-E Connector
> 
> 12V1
> 6+2pin Modular PCI-E Connector 12V1
> 6+2pin Modular PCI-E Connector 12V1
> 6+2pin Modular PCI-E Connector 12V1
> 
> Click to expand...
> 
> http://www.thermaltakeusa.com/Power_Supply/Toughpower_Series_/Toughpower/C_00001660/Toughpower_1350W/design.htm
Click to expand...

Read my post please.


----------



## wermad

Quote:


> Originally Posted by *Ragingun*
> 
> Why is AMD recommending 28 amps in that case?


Probably in case you OC or something.


----------



## electro2u

Quote:


> Originally Posted by *Ragingun*
> 
> Why is AMD recommending 28 amps in that case?


That's exactly *half* correct. =)


----------



## wermad

It says "up to"; maybe a little headroom the lawyers asked for?

edit:





http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review#.VP66beGWz9Q


----------



## Ragingun

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> X 99 with 5820k, Enermax 1050w and the 295x2. When my Enermax gets over drawn or over heated it will flash a red led when shut down occurs, it's not. It's just going to the stand by color. I've got a dedicated rail of 30 amps going to the card.
> 
> 
> 
> It needs 50 amps.
> You have to use two separate 30amp rails for each 8 pin connector on a 295x2. My seasonic x1250 does the same when I was using one 30amp rail on my 295x2. Shutdown no warning.
Click to expand...

Never mind. Got it lol! Separate 28 amp rails, not combined.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> It says "up to"; maybe a little headroom the lawyers asked for, ???


Was that about my post on the PCIe 12V rail only delivering 750W? Either way it won't run 2x 295X2s properly. At least one of them is mine, so it's not like she'll be playing with it for _that_ long... hopefully


----------



## wermad

Quote:


> Originally Posted by *Ragingun*
> 
> Never mind. Got it lol! Separate 28 amp rails, not combined.


Let us know what transpires after you've made the changes








Quote:


> Originally Posted by *Alex132*
> 
> Was that at my post about PCIE 12v only delivering 750w? Either way it won't run 2x 295X2s properly. At least one of them is mine so it's not like she'll be playing with it for _that_ long... hopefully


It was for Feyris and electro2u. Lol, we all posted at the same time.

Just to simplify:

-28 amps per rail for each 8-pin on multi-rail units, or 50 amps combined on a single rail feeding both 8-pins.
-Make sure the rail(s) is not shared with the cpu/mb.

(AMD guidelines)

My recommendations:

-750w min for a single card with no OC on the gpu (check the rails!)
-850-1000w for a single card with OC
-1000-1200w for a single card with heavy OC

-1350w for crossfire with no OC (using Hard's recommendations)
-1500-1600w with OC
-1500w+ using dual PSUs in tandem for heavy OC (I would recommend putting each card on a separate power supply).
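The rail guidelines above can be condensed into a toy checker. This is only a sketch using the thresholds quoted in this thread (28A per rail on multi-rail units, 50A combined on a single rail), not an official AMD tool:

```python
# Toy checker for the single-card PSU guidelines summarized above.
# "multi-rail" here means each 8-pin gets its own dedicated rail, per the
# thread's reading of AMD's guidance; thresholds are from the post, not a spec.

def psu_ok_for_295x2(single_rail, amps):
    """amps = combined 12V amps (single-rail PSU) or per-rail amps (multi-rail)."""
    return amps >= 50 if single_rail else amps >= 28

print(psu_ok_for_295x2(single_rail=True, amps=70))    # True  (e.g. a 70A single rail)
print(psu_ok_for_295x2(single_rail=False, amps=30))   # True  (30A per dedicated rail)
print(psu_ok_for_295x2(single_rail=False, amps=25))   # False (under the 28A guideline)
```

Note this only covers amperage per connector; total wattage, sharing with the CPU rail, and overclocking headroom still follow the tiers listed above.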


----------



## Alex132

Sadly that 1350w unit only has 750w on the PCIe 12V rail, and the other 12V rail has no PCIe power cables attached. So it'd be 750w for 2x 295X2s....


----------



## wermad

TT Feyris is using or the Enermax Hard used?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> TT Feyris is using or the Enermax Hard used?


TT.

Also acquired an 8600 GT from my lecturer for in-between use while I sell my 690 (I feel kinda sorry for that guy) and get my 295X2.

Any idea why 3DMark won't let me get the achievement for using 3 GPUs in a system?

Come to think of it, it didn't unlock the achievement for a 50% overclocked CPU either


----------



## wermad

for crossfire? yeah, it won't work out on the thermaltake 1350w. I believe Feyris is running one, so that should be fine.

So you planning on two 295x2?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> for crossfire, yeah, it won't work out on the thermaltake 1350w. I believe Feyris is running one, so that should be fine.
> 
> So you planning on two 295x2?


Feyris bought 2, one for me and one for her.

She's gonna play with quadfire (ie; both of them) for funsies before shipping me my 295X2. That's why I said it's not that important, as it'll just be a temporary thing.

Speaking of temporary things, do you think my HX850 (844w on the 12v rail, recorded up to 1050w for ~1 hour) is fine for a stock 295x2?


----------



## wermad

Damn, I wouldn't risk it on the TT tbh. The Enermax has six 30-amp rails (interesting that it peaks @1600w while the Lepa peaks @1700w, even though the Lepa has two 20-amp and four 30-amp rails).

Oh yeah, I pulled under 500w on a single 295x2, so you should be fine on the 850w. The HX850 is still a bawhs psu imho. You've got 70 amps on that single rail, and I wouldn't worry about it if you're gonna run stock.


----------



## fishingfanatic

Each system is different. There are many factors to consider: size of fittings/lines, pump, rads, rad fans...

If I try a higher-cfm fan it cools better, though not necessarily quieter, so keep an eye on the dBs. Eight of those on two 480 rads is over 800 cfm.

The Gelid 75cfm fans take twice as long to cool things down. You wouldn't think it'd be twice as long, but I tried it a couple of times to make sure.

Those are some pretty quiet fans, though.

FF


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> TSM is pushing for Mach10 on his gpu, his "needs" don't apply to 99% of the market


lol you keep mentioning this as if my opinion or how I use my rig has no bearing. FYI, *I don't buy and build a Ferrari to only run Cooper tires on it*, no offense to anyone running Cooper tires.









Your cooling will dictate how you can run your rig, how loud or annoying it is, and how much you can overclock. If your cooling limits your temps, that means you will be limited in those other aspects. Personally I choose to compromise because the price difference between a few rads is worth it to me. If others are ok with compromises then why even bother switching from the stock AIO?


----------



## Feyris

TT claims the 1350W TP is quadfire ready; with a 750W 12v rail, that goes out the window. Maybe if you wanted to quadfire potatoes







I want to run this so bad just for FS. If 750W is fine for quad 290X FS, so be it; otherwise I'm taking a little trip to tigerdirect


----------



## MOSER91

Are there any compatible waterblocks for the Powercolor r9 290x Devil 13? Thinking about buying this card since Newegg has a good deal on it at the moment.


----------



## electro2u

Quote:


> Originally Posted by *MOSER91*
> 
> Are there any compatible waterblocks for the Powercolor r9 290x Devil 13? Thinking about buying this card since Newegg has a good deal on it at the moment.


Unfortunately no. =(


----------



## BradleyW

Hello,

Has anyone found a mGPU profile that works for MGS-V Ground Zeroes on the 295X2?

Thank you.


----------



## Feyris

Can quadfire cause OS install issues? Redid RAID0, trying now with one gpu, but I get this


Never got that before, just now with the 295s in.


----------



## Roaches

Why not just install Windows with 1 card and add the second after installing drivers?

I don't do RAID, though; hopefully I don't have to go through a Windows re-install when my second Devil 13 arrives tomorrow.


----------



## Alex132

Quote:


> Originally Posted by *Roaches*
> 
> Why not just install windows with 1 card and add the second after installing drivers.
> 
> I don't do raid though hopefully I don't have to go through a windows re-install when my second Devil 13 arrives tomorrow.


I didn't with Nvidia drivers; let's hope it's the same for AMD









You think I can just driver sweeper + ccleaner + etc. and install AMD drivers? Or should I reformat :/ ?


----------



## Roaches

Quote:


> Originally Posted by *Alex132*
> 
> I didn't with Nvidia drivers, let's hope its the same for AMD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You think I can just driver sweeper + ccleaner + etc. and install AMD drivers? Or should I reformat :/ ?


I've done the good ol' manual method. Never gave me problems when switching camps without formatting my system. I haven't formatted my system since moving to X79 from Z77 and it still runs like new to this day.

1) Uninstall drivers from Add/Remove Programs

2) Reboot and delete driver registry folders with Regedit

3) Hunt down and delete all folders and files relating to graphics drivers (AMD or Nvidia folders in Program Files, Program Files (x86), AppData and ProgramData on the OS drive)

4) Reboot and double-check the registry

5) Shut down and switch cards

6) Power on and install drivers

Takes about 15-20 minutes to do. Worth it IMO.
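Step 3 above is the easy one to script. Here's a rough, read-only sketch of that scan in Python; the folder names are assumptions based on default Windows installs, so double-check before deleting anything by hand:

```python
import os

# Typical vendor folder names left behind by graphics drivers
# (assumed defaults; adjust for your own install).
VENDOR_DIRS = ("AMD", "ATI", "NVIDIA", "NVIDIA Corporation")

def find_leftover_driver_dirs(roots):
    """Return vendor driver folders still present under the given roots
    (e.g. Program Files, Program Files (x86), ProgramData, AppData)."""
    leftovers = []
    for root in roots:
        for name in VENDOR_DIRS:
            path = os.path.join(root, name)
            if os.path.isdir(path):
                leftovers.append(path)
    return leftovers
```

Run it first to see what's left over, then delete manually as in the steps above.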


----------



## Alex132

I'll try that first then, maybe reformat afterwards. I just HATE setting up a new drive (music production software and blah blah







)


----------



## Roaches

Quote:


> Originally Posted by *Alex132*
> 
> I'll try that first then, maybe reformat afterwards. I just HATE setting up a new drive (music production software and blah blah
> 
> 
> 
> 
> 
> 
> 
> )


I feel ya. Hate having to go through the re-licensing process with Autodesk and Dassault customer service when reinstalling their expensive AutoCAD and Solidworks programs.

Good luck, hope it works out for you. The method I posted also works in the worst-case scenario: when my 270X Devils and 7970 Lightning had black screens on the other rigs, a manual driver wipe solved it.


----------



## littledebbie

So I just put in my XFX 295x2 and the vrm fan is really f'ing loud. Is there any reason why, or are they all this loud? I did install Noctua 2000rpms and the temps are not too bad


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> lol you keep mentioning this as if my opinion or how I use my rig has no bearing. FYI, *I don't buy and build a Ferrari to only run Cooper tires on it*, no offense to anyone running Cooper tires.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your cooling will dictate how you can run your rig, how loud or annoying it is, and how much you can overclock. If your cooling limits your temps, that means you will be limited in those other aspects. Personally I choose to compromise because the price difference between a few rads is worth it to me. If others are ok with compromises then why even bother switching from the stock AIO?


What I'm observing is that you tend to suggest extreme scenarios (because they apply to you) when sometimes it's best to just give sensible advice. If the member says "I want a TSM rig", then I know it's not for me to answer, and I wait for you or someone similar to jump in and go at it.

On the wc suggestion, I've been sitting on these threads for years. As soon as someone starts saying "you need a GTX 360 for each core", a lot of folks will say no you don't. I don't know exactly what a person needs or wants. If they want the absolute best, and some say so right off the bat, then you give them the best. If they don't specify, we usually ask for a budget or just give a reasonable suggestion.

-wermad










Quote:


> Originally Posted by *Feyris*
> 
> 
> 
> 
> 
> 
> 
> 
> TT Claims 1350WTP is quadfire ready, at 750W rail....out the window. Maybe if you wanted to quadfire potatos
> 
> 
> 
> 
> 
> 
> 
> I want to run this so bad just for FS. if 750W is fine for 290X Quad FS. Sobeit otherwise im taking a little trip to tigerdirect


Just get a single-rail 750w unit and jump it w/ your TT. Run the second card w/ the second psu. I ran two V1000s split between quad 7970 Lightnings. One of the V1000s ran both cards at idle and low usage while my G1600 arrived.
Quote:


> Originally Posted by *littledebbie*
> 
> So I just put in my XFX 295x2 and the vrm fan is really f'ing loud. Is there any reason why, or are they all this loud? I did install Noctua 2000rpms and the temps are not too bad


What's your setup like? Check gpuz for the vrm temps. Mine on both stock coolers were a bit noisy even at idle. It wasn't annoying, but I could definitely hear the fans working in the open case. Did you buy the card new or preowned?

edit: Did you turn off ulps in AB (or Trixx)?


----------



## littledebbie

I am using GPU-Z and it isn't showing the VRM, and neither is MSI Afterburner; I am scratching my head. I had 2 290Xs installed and I saw everything, but now just the VRM fan is running at full tilt


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> If asetek owns the patent, it's pretty much the same design. From reading the h100 reviews, bigger rads (and not necessarily thicker) is the way to go. I'm sure those who pony up $1500+ do have a case with at least a 240mm mount, methinks


They "do"; last I heard CM is fighting them.

They shouldn't. It is easy to prove that Swiftech had a pump/block combo first; however, as with patent trolls, it will take time in court.

Another company I will not support if possible.

Have you ever looked into their patents? They basically own pc cooling.

Quote:


> Originally Posted by *malik22*
> 
> hey guys Im forced to run my 295x2 in the 4 slot of my asus x99 so I have 3.0 at 8x will this effect performance on the card?


x8 is not recommended. It can work, sure.

But you are effectively running both cards at x4 (@ pcie2 x8), which I would not recommend if you can help it. What's the point of X99 if you don't take full advantage? Moreover, why are you doing this?
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> TSM is pushing for Mach10 on his gpu, his "needs" don't apply to 99% of the market
> 
> 
> 
> lol you keep mentioning this as if my opinion or how I use my rig has no bearing. FYI, *I don't buy and build a Ferrari to only run Cooper tires on it*, no offense to anyone running Cooper tires.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your cooling will dictate how you can run your rig, how loud or annoying it is, and how much you can overclock. If your cooling limits your temps, that means you will be limited in those other aspects. Personally I choose to compromise because the price difference between a few rads is worth it to me. If others are ok with compromises then why even bother switching from the stock AIO?
Click to expand...

I agree. People always complain about the price. However, the rads can move to my next rig np, which is why I don't mind.
One has 5x480 (1x45, 1x60, 3x80); my other had 5-7 3.0s (not done rebuilding and still deciding).
Either way: is it needed? No. Would I recommend it if they can? Yes.


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> They "do"; last I heard CM is fighting them.
> 
> They shouldn't. It is easy to prove that Swiftech had a pump/block combo first; however, as with patent trolls, it will take time in court.
> 
> Another company I will not support if possible.
> 
> Have you ever looked into their patents? They basically own pc cooling.
> 
> x8 is not recommended. It can work, sure.
> 
> But you are effectively running both cards at x4 (@ pcie2 x8), which I would not recommend if you can help it. What's the point of X99 if you don't take full advantage? Moreover, why are you doing this?
> 
> I agree. People always complain about the price. However, the rads can move to my next rig np, which is why I don't mind.
> One has 5x480 (1x45, 1x60, 3x80); my other had 5-7 3.0s (not done rebuilding and still deciding).
> Either way: is it needed? No. Would I recommend it if they can? Yes.


-That's what concerns me. They're sitting on their butts because they own the AIO patent and don't have to budge (Apple, anyone?) if they're feeding off the royalties and settlements. Well, it's up to AMD to decide, I guess.

-The onboard plx takes that 8x 3.0 and makes it into two 8x 3.0 links. I asked this right after the 295x2 launched:

Quote:


> Originally Posted by *Joa3d43*
> 
> ...the card's PLEX chip handles the PCIe lane traffic onboard the 295x2 PCB, so if the mobo slot is PCIe3 x16, then you will get 2x PCIe3 x16, courtesy of PLEX; if the slot of PCIe 3 x8, then you will get 2x PCIe 3 x 8 etc


Also, hard ocp did their quadfire test with an Asus Maximus V Extreme, which runs dual cards @ 8x 3.0. This was also pointed out to me.
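For anyone wanting to sanity-check the lane talk above, here's a back-of-envelope sketch. The per-lane throughput figures are my own rough approximations of usable PCIe bandwidth after encoding overhead, not numbers from this thread:

```python
# Approximate usable bandwidth per lane in GB/s, after 8b/10b (gen 2)
# and 128b/130b (gen 3) encoding overhead. Illustrative figures only.
PER_LANE_GBPS = {2: 0.5, 3: 0.985}

def link_bandwidth(gen, width):
    """Total one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * width
```

By this math a 3.0 x8 slot (~7.9 GB/s) lands close to 2.0 x16 (8 GB/s), so an x8 3.0 slot is nowhere near "effectively pcie2 x4".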

-Most ppl new to water already buy their main components first and then wc. That's where we have to provide sensible recommendations. Most ppl who buy their wc gear before making up their mind and purchasing the main components are typically wc modders doing some sort of special wc build (think Snef, BNegative, JamesWalt, etc.).


----------



## daicon0

I recently received most of my parts for a new rig except the case, a be quiet! 800. So, before I get the case I'm planning what fans go where etc. I apologize ahead of time if this has been answered but where is the best place you guys have found to mount the radiator for lower temps? In push, pull or both? At top of case, front, rear or bottom. Thanks!


----------



## xer0h0ur

Push/pull always has been better for me than just push or just pull. My case doesn't give me very many options for mounting radiators so I just put it wherever the sucker fits.


----------



## wermad

Quote:


> Originally Posted by *daicon0*
> 
> I recently received most of my parts for a new rig except the case, a be quiet! 800. So, before I get the case I'm planning what fans go where etc. I apologize ahead of time if this has been answered but where is the best place you guys have found to mount the radiator for lower temps? In push, pull or both? At top of case, front, rear or bottom. Thanks!


Front seems to be the most popular. You can do top, but that puts it right next to your NH-D14. Here's a pic I found:
Quote:


>


Do remember this card will start throttling once the cores reach 75°C. With your case, I would place the 295x2 rad up front and have intake fans on top and front with the rear as exhaust.

I did not test p/p, but some say it does help depending on the fans. Some folks run a single high-rpm fan as well (not sure how "BeQuiet" that will be







). Also, the SP120 HPs in push/pull have been suggested to me. If you decide to go custom, you have plenty of space for a couple of 240s in that case.


----------



## F4ze0ne

Quote:


> Originally Posted by *daicon0*
> 
> I recently received most of my parts for a new rig except the case, a be quiet! 800. So, before I get the case I'm planning what fans go where etc. I apologize ahead of time if this has been answered but where is the best place you guys have found to mount the radiator for lower temps? In push, pull or both? At top of case, front, rear or bottom. Thanks!


I have mine up top in front, pushing up. Works well for my case; temps are around 70°C when gaming for a while.

It also doubles as a heater for my hands and feet on cold nights.


----------



## kayan

Quote:


> Originally Posted by *daicon0*
> 
> I recently received most of my parts for a new rig except the case, a be quiet! 800. So, before I get the case I'm planning what fans go where etc. I apologize ahead of time if this has been answered but where is the best place you guys have found to mount the radiator for lower temps? In push, pull or both? At top of case, front, rear or bottom. Thanks!


I had mine mounted in the back of my old case as an exhaust. Maxed out at 71°C while playing Far Cry 4 for 2 hours.


----------



## daicon0

Thanks for the suggestions, guys, as well as the picture. The be quiet! case is still too new to find many pictures online to compare with my build. I should get the case tomorrow and begin testing. Personally I want it in the front or bottom; that way both rear and top will act as exhausts. I'll update with what I find best.


----------



## steezebe

Finally.

Single-Slot Dual GPU SCANDALLLLL







It's so pretty.







And hopefully it will get those warm VRMs under control.



I like comparing this to my ol' dual 6970s with DangerDen blocks. Man, things have changed a lot in WC over the past 5 years


----------



## wermad

Does the ek 6990/7990 single i/o bracket work with the 295x2? I believe the 7990 does (derick might have confirmed this....?)


----------



## littledebbie

Really liking the 295x2; it runs about 42°C at full tilt with push/pull Noctua 2000rpm fans when playing Dota 2 at 1440. Just wish I could figure out the vrm fan and why it is so obviously loud


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Does the ek 6990/7990 single i/o bracket work with the 295x2? I believe the 7990 does (derick might have confirmed this....?)


I will check shortly. But even if they do, it's a total pita to find


----------



## xer0h0ur

I could have sworn EK sells spare parts like the single slot bracket.


----------



## Feyris

Quote:


> Originally Posted by *xer0h0ur*
> 
> I could have sworn EK sells spare parts like the single slot bracket.


When I looked in December it was all OOS except on ebay from a 3rd party for 15 pounds.

Could've changed tho


----------



## wermad

Shipping is crazy from Slovenia to the US ($30!!!!). I'll ask Akira if he can confirm this. Ppcs.com has the ek 6990:

http://www.performance-pcs.com/ek-vga-i-o-bracket-hd6990.html

The 7990 one is available from Ek's shop:

http://www.ekwb.com/shop/blocks/vga-blocks/i-o-brackets/ek-vga-i-o-bracket-hd7990-se.html

6990 i/o ebay US sellers:

http://www.ebay.com/itm/Swiftech-PCI-BCKT-6990-Single-slot-PCI-bracket-/330563684614?pt=LH_DefaultDomain_0&hash=item4cf7218906

http://www.ebay.com/itm/Swiftech-Single-slot-PCI-bracket-for-Radeon-HD6990-/390671799006?pt=LH_DefaultDomain_0&hash=item5af5da92de

6990:



7990:


----------



## dmv808

Hey everyone.

I am considering purchasing an r9 295x2 but I have some questions. I have checked and it looks like my power supply would be enough to handle my rig. But If you guys could shine some light on the potential issues I might run into, it would be great. Also, tips would be very welcomed. Thanks a lot!

Case: Fractal Design Node 804
PSU: EVGA 220-GS-0850-V1 850W ATX12V / EPS12V 80 PLUS GOLD Certified
Mobo: ASUS GRYPHON Z87 DDR3 1600 LGA 1150 Motherboard
CPU: Intel Xeon E3-1241 v3 Haswell 3.5GHz


----------



## remnant

Quote:


> Originally Posted by *dmv808*
> 
> Hey everyone.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I am considering purchasing an r9 295x2 but I have some questions. I have checked and it looks like my power supply would be enough to handle my rig. But If you guys could shine some light on the potential issues I might run into, it would be great. Also, tips would be very welcomed. Thanks a lot!
> 
> Case: Fractal Design Node 804
> PSU: EVGA 220-GS-0850-V1 850W ATX12V / EPS12V 80 PLUS GOLD Certified
> Mobo: ASUS GRYPHON Z87 DDR3 1600 LGA 1150 Motherboard
> CPU: Intel Xeon E3-1241 v3 Haswell 3.5GHz


to tag along this question ^^

I've been reading y'all talking about having each 8-pin on its own rail. Would that mean this power supply is less desirable because it is 1 rail / 70 amp vs 2 rails at 35? Or, option c, am I completely confused about rails?


----------



## wermad

Quote:


> Originally Posted by *dmv808*
> 
> Hey everyone.
> 
> I am considering purchasing an r9 295x2 but I have some questions. I have checked and it looks like my power supply would be enough to handle my rig. But If you guys could shine some light on the potential issues I might run into, it would be great. Also, tips would be very welcomed. Thanks a lot!
> 
> Case: Fractal Design Node 804
> PSU: EVGA 220-GS-0850-V1 850W ATX12V / EPS12V 80 PLUS GOLD Certified
> Mobo: ASUS GRYPHON Z87 DDR3 1600 LGA 1150 Motherboard
> CPU: Intel Xeon E3-1241 v3 Haswell 3.5GHz


-card is long, so you will lose a mounting spot for the rad at the front-bottom. Use the front-top or the top fan mounts (you have four if you don't use the hdd cages)
-psu is fine (single rail, 70 amps); just don't push the clocks if I were you.

Overall a great setup to get started. I'm not going to delve into your cpu as it really comes down to what you do (and what games you play).

Quote:


> Originally Posted by *remnant*
> 
> to tag along this question ^^
> 
> I've been reading ya'll talking about having each 8-pin on its own rail, would that mean this power supply is less desirable because it is 1 rail / 70 amp vs 2 rails at 35? or option c that I'm completely confused about rails


It's a bit confusing, but the important thing is amperage. It doesn't matter if you have a single- or multi-rail psu; as long as it has the right amps, you're good. You have to check the specs of the psu. Using the example you gave, a single-rail 70 amp unit (~750-850w) is fine for one card (just don't push the clocks too high). A two-rail unit with 35 amps each is not ideal, and here's why:

Amd recommends 28 amps per 8-pin, or 50 amps for both 8-pins combined.

A dual 35 amp rail setup has enough for one 8-pin, but the other rail needs amps for your system, and that won't leave enough for the second 8-pin. So this unit is a no-go: one 35 amp rail cannot handle both 8-pins (remember, the combined minimum is 50 amps), and you won't have enough amps on the second rail (shared by the system, and probably not connected to the vga outputs).

edit: I'm going to compile a list of psu's and hopefully this can help others.
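The rail math above boils down to a tiny check. This is just a sketch of the guideline as quoted (28 A per 8-pin, 50 A combined); the function name and the system-load parameter are my own illustration:

```python
AMPS_PER_8PIN = 28    # AMD's per-connector guideline quoted above
AMPS_COMBINED = 50    # AMD's combined minimum for both 8-pins

def rail_ok_for_295x2(rail_amps, system_amps=0):
    """True if one 12 V rail can feed both 8-pins plus whatever else
    (system_amps) is hanging off the same rail."""
    return rail_amps - system_amps >= AMPS_COMBINED
```

So a single-rail 70 A unit passes even with ~15 A of system draw on the rail, while neither rail of a dual 35 A unit can carry both 8-pins on its own.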


----------



## dmv808

Thank you very much for your feedback. That's good, I'm not planning on using any of the HDD cages, so I'll mount it where you recommended.

As far as the heat is concerned, do you think the card is going to run too hot in the Fractal Node 804? I plan on getting some Noctua fans.

Also, do you think the Xeon is going to 'bottleneck' the 295x2? I mainly play FPS games like Battlefield 4 and do some video editing.

I'm currently only running (1) 1080p 144Hz monitor, but I plan on adding another one in the near future to set up a dual monitor display.

Thanks again


----------



## wermad

At 1080p w/ 120hz+ you may run into issues in some games. If you plan to go higher in resolution (eyefinity, wqhd, or 4k), it shifts a lot of the load off the cpu and onto the gpus. Try it out; you can always get a desktop chip later on.


----------



## dmv808

Thanks for the heads up. Last question: if I did decide to get an i7 4790K, I would definitely want to jump up to around a 1000W PSU, right?

I was hoping the Xeon would run a lot cooler too. But I might make the switch if it's going to cause problems.

Thanks again.


----------



## Mega Man

Quote:


> Originally Posted by *steezebe*
> 
> Finally.
> 
> Single-Slot Dual GPU SCANDALLLLL
> 
> 
> 
> 
> 
> 
> 
> It's so pretty.
> 
> 
> 
> 
> 
> 
> 
> And hopefully it will get those warm VRMs under control.
> 
> 
> 
> 
> I like comparing this to my ol' dual 6970's with DangerDen blocks. Man things have changed a lot in WC the past 5 years


nice huh, i have 2 in my system now

mine are black though, no damage to pcie slots
Quote:


> Originally Posted by *wermad*
> 
> Shipping is crazy from Slovenia to the US ($30!!!!). I'll ask Akira if he can confirm on this. Ppcs.com has the ek 6990:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.performance-pcs.com/ek-vga-i-o-bracket-hd6990.html
> 
> The 7990 one is available from Ek's shop:
> 
> http://www.ekwb.com/shop/blocks/vga-blocks/i-o-brackets/ek-vga-i-o-bracket-hd7990-se.html
> 
> 6990 i/o ebay US sellers:
> 
> http://www.ebay.com/itm/Swiftech-PCI-BCKT-6990-Single-slot-PCI-bracket-/330563684614?pt=LH_DefaultDomain_0&hash=item4cf7218906
> 
> http://www.ebay.com/itm/Swiftech-Single-slot-PCI-bracket-for-Radeon-HD6990-/390671799006?pt=LH_DefaultDomain_0&hash=item5af5da92de
> 
> 6990:
> 
> 
> 
> 7990:


should be the same
Quote:


> Originally Posted by *remnant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dmv808*
> 
> Hey everyone.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I am considering purchasing an r9 295x2 but I have some questions. I have checked and it looks like my power supply would be enough to handle my rig. But If you guys could shine some light on the potential issues I might run into, it would be great. Also, tips would be very welcomed. Thanks a lot!
> 
> Case: Fractal Design Node 804
> PSU: EVGA 220-GS-0850-V1 850W ATX12V / EPS12V 80 PLUS GOLD Certified
> Mobo: ASUS GRYPHON Z87 DDR3 1600 LGA 1150 Motherboard
> CPU: Intel Xeon E3-1241 v3 Haswell 3.5GHz
> 
> 
> 
> 
> 
> 
> to tag along this question ^^
> 
> I've been reading ya'll talking about having each 8-pin on its own rail, would that mean this power supply is less desirable because it is 1 rail / 70 amp vs 2 rails at 35? or option c that I'm completely confused about rails
Click to expand...

you need 50A for both (total), so @ 70A you're fine


----------



## Feyris

Quote:


> Originally Posted by *Mega Man*
> 
> nice huh, i have 2 in my system now
> 
> mine are black though, no damage to pcie slots
> should be the same
> you need 50A for both (total), so @ 70A you're fine


Checked my 7990 bracket against the 295X2 and yeah, they are identical (thank you AMD for letting us carry over precious things like that). Cheap single slots, yay!


----------



## Mega Man

if only they didn't nvidia out on the 290/290x i/o

honestly, in this day and age the only I/O I want on a video card is 6 mini DP


----------



## wermad

Quote:


> Originally Posted by *dmv808*
> 
> Thanks for the heads up. Last question. If I did decide to get an i7 4970K, I would definitely want to jump up to around 1000W PSU right?
> 
> I was hoping the Xeon would run a lot cooler too. But I might make the switch if it's going to cause problems.
> 
> Thanks again.


You don't, tbh; Haswell is still pretty power efficient. If you jump to X79 or X99 and plan to do some heavy oc'ing, I would get something bigger (remember the amps-per-rail amd guideline!).

Quote:


> Originally Posted by *Feyris*
> 
> Checked my 7990 Slot to the 295X2 and yeah they are identical (thank you AMD for letting us carry over precious things like that) cheap single slots, yay!


I figured, as I do recall derick confirming this (can't find the post). Anyways, there's a ton of the 6990 covers, but the finish looks different than the polished one of the 7990. I'll ask akira if they plan to send more to ppcs.com soon.


----------



## Feyris

Quote:


> Originally Posted by *Mega Man*
> 
> if only they didnt nvidia out on the 290/290x i/o
> 
> honestly in this day and age the only I/O i want on a video card is 6 mini dp


You're the one running around screaming "Daisy chain them all!!!" with a handful of mDP cables around the office, aren't you









Quote:


> Originally Posted by *wermad*
> 
> You don't tbh, Haswell is still pretty power efficient. If you jump to X79 or X99 and plan to do some heavy oc'in, I would get something bigger (remember the amps per rail amd guideline!).
> I figured as I do recall derick confirming this (can't find the post). Anyways, there's a ton of the 6990 covers but the finish looks different then the polished one of the 7990. I'll ask akira if they plan to send more to ppcs.com soon.


The 7990 one has a "lip" that makes it look better with the EK block, but otherwise you have to dremel it off, because it is a decent-sized lip and it does not like non-EK waterblocks. I HAD to attack mine with a dremel to remove said lip for the Razor HD. I will tell you what happens with Koolance tomorrow though


----------



## wermad

5870 Eyefinity 6:



6870 Eyefinity 6:



7870 Eyefinity 6:



Holy grail 5970 Eyefinity *12*:





^^^Just looking at that one makes me feel like:


----------



## Feyris



Quote:


> Originally Posted by *wermad*
> 
> 5870 Eyefinity 6:
> 
> 
> 
> 6870 Eyefinity 6:
> 
> 
> 
> 7870 Eyefinity 6:
> 
> 
> 
> Holy grail 5970 Eyefinity *12*:
> 
> 
> 
> 
> 
> ^^^Just looking at that one makes me feel like:


Powercolor does not count. Powercolor's always on drugs when they do custom things


----------



## wermad

Maybe they need to send some of that stuff (and cocaine energy drinks) to amd and nvidia engineers


----------



## Mega Man

to those having issues with AMD CFX in Middle-earth: Shadow of Mordor:

i fixed mine


make a CFX profile set to AFR-Friendly
Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Feyris*
> 
> Checked my 7990 Slot to the 295X2 and yeah they are identical (thank you AMD for letting us carry over precious things like that) cheap single slots, yay!
> 
> 
> 
> I figured as I do recall derick confirming this (can't find the post). Anyways, there's a ton of the 6990 covers but the finish looks different then the polished one of the 7990. I'll ask akira if they plan to send more to ppcs.com soon.
Click to expand...

@Akira

will you make some in black for meh !!! id pay !

@Feyris

no i hate daisy chain. can only do 2 of my monitors ( 144hz )


----------



## remnant

Quote:


> Originally Posted by *Feyris*
> 
> powercolor does not count. powercolors always on drugs when they do custom things


"devil 13" must be the street name for whatever they were on at the time.


----------



## wermad

@ Feyris: can you confirm if it matches the EK 6990 schematics????

They did not have it for the 7990SE











If it does match, I'll be ordering two and squirting them satin black or matte white









Btw guys, EK does warn not to use the single i/o bracket for blocks weighing 1kg+. My koolance block is over 1kg but luckily, I'm running horizontal! I guess it's meant for SFF, where you might not have the card in the horizontal atx orientation, or for when the 6990 block is heavy (ie Danger Den!)


----------



## Alex132

It should; the 7990 has a lip and the 6990 doesn't. So the 6990 bracket should fit the 7990 - and the R9 295X2. But I'm not sure if the 7990/R9 295X2 one will fit the 6990.


Spoiler: 7990

















Spoiler: 6990















Why don't they make PCI-E brackets pure black? I really like how they look in black, but no one does it


----------



## Mega Man

Swiftech does ( see 79xx komodo)


----------



## wermad

The swiftech 6990 single bracket is chromed with the lip!

http://www.performance-pcs.com/swiftech-pci-bracket-hd6990.html

Wow, I can't believe ppcs.com charges almost $10 for priority mail for two of these. They easily fit the small flat-rate box. Meh, won't give ppcs.com my money today


----------



## Alex132

Either way thank based AMD for making the brackets universal


----------



## littledebbie

I think I might've got a bad card. This morning I had the gpu just idling and I could hear the vrm fan through the bathroom wall very audibly. Gonna have to RMA it, I believe; also the vrm temps aren't showing on GPU-Z or MSI. I put my finger on the fan to slow it down and that is the problem area; the Noctuas are actually pretty tame. If you guys could help me out that would be great, but I know it is my issue


----------



## Alex132

Quote:


> Originally Posted by *littledebbie*
> 
> I think I might've got a bad card. This morning I had the gpu just idling and I could hear the vrm fan through the bathroom wall very audibly. Gonna have to RMA it, I believe; also the vrm temps aren't showing on GPU-Z or MSI. I put my finger on the fan to slow it down and that is the problem area; the Noctuas are actually pretty tame. If you guys could help me out that would be great, but I know it is my issue


What are your temps at idle?

hwmonitor, GPU-Z, MSI AB should be able to read it.


----------



## littledebbie

I will try hwmonitor when I get home, but MSI Afterburner and GPU-Z show me vddc and not the vrm temps or rpm speed. I did fresh installs of both programs as well. Stupid question: do I need to flash the BIOS so I can see that info? Also, the fan was loud as soon as I installed it, from powering on the system. It is louder than a reference 290x at 60%, kinda loud


----------



## electro2u

Quote:


> Originally Posted by *littledebbie*
> 
> I will try hwmonitor when I get home, but MSI Afterburner and GPU-Z show me vddc and not the vrm temps or rpm speed. I did fresh installs of both programs as well. Stupid question: do I need to flash the BIOS so I can see that info? Also, the fan was loud as soon as I installed it, from powering on the system. It is louder than a reference 290x at 60%, kinda loud


Once you install the drivers it should slow the vrm fan.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> I'll check vrm later


Did you ever get a chance to check vrms with that koolance block?


----------



## wermad

Haven't been able to run any benches as windows is crapping out on me. I may just give in and pick up 8.1.

edit: just checked gpuz for idle temps and there's no vrm there.....


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> The swiftech 6990 single bracket is chromed with the lip!
> 
> http://www.performance-pcs.com/swiftech-pci-bracket-hd6990.html
> 
> Wow, i can't believe ppcs.com charges almost $10 for priority mail for two of these. They easily fir the small flat rate box. Meh, won't give ppcs.com my money today


If you email them they will usually adjust it. It seems they used to have a minimum weight on their web site, which has since been fixed; I have done this a few times.
Quote:


> Originally Posted by *littledebbie*
> 
> I think I might have gotten a bad card. This morning I had the GPU just idling and I could hear the VRM fan through the bathroom wall very audibly. Gonna have to RMA it, I believe. Also, the VRM temps aren't showing in GPU-Z or MSI AB. I put my finger on the fan to slow it down and that is the problem area; the Noctuas are actually pretty tame. If you guys could help me out that would be great, but I know it is my issue.


Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *littledebbie*
> 
> I think I might have gotten a bad card. This morning I had the GPU just idling and I could hear the VRM fan through the bathroom wall very audibly. Gonna have to RMA it, I believe. Also, the VRM temps aren't showing in GPU-Z or MSI AB. I put my finger on the fan to slow it down and that is the problem area; the Noctuas are actually pretty tame. If you guys could help me out that would be great, but I know it is my issue.
> 
> 
> 
> What are your temps at idle?
> 
> hwmonitor, GPU-Z, MSI AB should be able to read it.
Click to expand...

Are you sure? Last I knew, we couldn't.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Are you sure last I knew we couldn't.


I thought he meant GPU temps / GPU fan. I didn't really bother correcting myself.

It annoys me that you can't monitor VRM temps on a lot of new GPUs. I suppose it doesn't matter much, as R9 295X2 VRM temps barely reach ~75°C (from reviews), and up to 125°C is really perfectly fine...


----------



## xer0h0ur

You're not going to find anything listing the VRM temps on the 295X2. No software seems to read that information.


----------



## smoggysky

Hey guys, I am building a new rig. I have been testing it for the last 2 weeks, and I am about to add a second R9 295X2. Do I need to clean out the video drivers before installing the card, or can I just plug and play?
The rig: i7-5960X @ 4.0GHz, 1.18 vcore; PSU: EVGA 1600 P2; Rampage V Extreme; 32GB Ripjaws 4 2400MHz; Corsair 900D case; dual watercooling loops (1st: RX360 on the CPU; 2nd: RX360 cooling the two R9 295X2s); Samsung XP941 M.2 512GB SSD; 3 Seagate 3TB HDDs; Win7 64-bit.
Thanks guys and gals


----------



## wermad

well, I'll have to find my laser thermometer then. One more reformat and I hope this will hold until i get my new os.


----------



## Mega Man

Quote:


> Originally Posted by *smoggysky*
> 
> Hey guys, I am building a new rig. I have been testing it for the last 2 weeks, and I am about to add a second R9 295X2. Do I need to clean out the video drivers before installing the card, or can I just plug and play?
> The rig: i7-5960X @ 4.0GHz, 1.18 vcore; PSU: EVGA 1600 P2; Rampage V Extreme; 32GB Ripjaws 4 2400MHz; Corsair 900D case; dual watercooling loops (1st: RX360 on the CPU; 2nd: RX360 cooling the two R9 295X2s); Samsung XP941 M.2 512GB SSD; 3 Seagate 3TB HDDs; Win7 64-bit.
> Thanks guys and gals


you dont need to but you can

should auto install needed drivers


----------



## NinjaKabuto

Hi,
I apologize if this has been discussed before, but I'm having a hard time finding the solution anywhere.

My spec:
3770k @ 4.4
Asrock Z77 Extreme 4
16GB 1600mhz
EVGA 1000W G2
2560x1440p

I bought a brand new 295X2 last Friday; I even bought a new PSU to accommodate this power-hungry card. Installation was a breeze. GPU-Z shows temps idling @ 30-35 and load @ 45-60, which is pretty normal I think.

I updated the BIOS, ran a driver sweeper to make sure my previous Nvidia driver (780 Ti Superclocked) was completely wiped out, and reinstalled the latest 14.12 driver at least 4 times. I even reformatted and installed a fresh Win 8.1 on a new HDD as a last resort to start completely fresh.

I tested a few games in my library: BF4, FC4, Dying Light 4, AC Unity, Bioshock 3, TF2, granted a couple of those titles have some questionable optimization.

PROBLEM: I ran all those games at least 10 minutes each and noticed the FPS (FRAPS/Afterburner) drops drastically every few seconds, ranging from 100 to 20 to 60. GPU-Z shows the GPU load is not stable at all when a game is running; it spikes constantly (from 90% to 1% in a matter of seconds), and I believe this is the reason the FPS plummets every few seconds.

I re-seated the card, tried a different PCIe slot, turned off ULPS, and nothing seems to do the trick. Any solutions or suggestions would be greatly appreciated.
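For what it's worth, if you export frametimes from FRAPS or Afterburner, a short script can quantify how often those usage spikes turn into real FPS dips. This is only a generic sketch; the `sample` list is made-up data to show the idea, not readings from the system above:

```python
# Flag FPS dips in a frametime log (milliseconds per frame),
# e.g. a list exported from a FRAPS/Afterburner frametimes file.

def find_dips(frametimes_ms, dip_fps=30.0):
    """Return indices of frames that fell below dip_fps."""
    threshold_ms = 1000.0 / dip_fps          # 30 fps -> 33.3 ms/frame
    return [i for i, ft in enumerate(frametimes_ms) if ft > threshold_ms]

def avg_fps(frametimes_ms):
    """Average FPS over the whole log."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# Made-up sample: mostly ~10 ms frames (100 fps) with two big stutters.
sample = [10.0] * 8 + [50.0, 45.0] + [10.0] * 8
print(find_dips(sample), round(avg_fps(sample), 1))  # -> [8, 9] 70.6
```

The point being: an average of ~70 fps can hide stutters down to 20 fps, which matches the behavior described above.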


----------



## wermad

Run a benchmark to see if the usage is a bit more stable. Also, don't use Driver Sweeper, as that tends to crap things up. Both Nvidia and AMD have driver removal options if you launch the installer again.


----------



## xer0h0ur

Check to see if you're getting thermal throttling when your GPU usage/FPS is dropping. There is also an outside chance of the PLX chip and/or VRMs overheating.


----------



## smoggysky

I just installed the second R9 295X2 Sapphire card. I ran COD Advanced Warfare SP and the FPS seemed low to me, fluctuating between 79 low and 115 high, averaging around 95 fps. I ran 3DMark Vantage Professional; my previous score with one R9 295X2 was P65030, but with the 2nd card installed I only got to P72040. Here is what I am troubled about: Catalyst is showing 3 cards, Device Manager is showing 3 cards, and the system info in 3DMark Vantage is showing 3 cards. Shouldn't these be showing 4 GPUs? Any ideas? Do I need to uninstall and reinstall the video drivers?
Any help will be appreciated.
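A quick scaling sanity check on those two scores (a sketch only; perfect quad-fire scaling is never achievable, but ~11% is far below even pessimistic expectations, which points at GPUs not being used):

```python
def scaling_gain(single_score, multi_score):
    """Percent score improvement from adding the second card."""
    return 100.0 * (multi_score - single_score) / single_score

# The two Vantage scores quoted above: P65030 -> P72040.
print(round(scaling_gain(65030, 72040), 1))  # -> 10.8
```

A healthy second card would usually add far more than 10.8%, so a missing GPU in the driver is the likely culprit.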


----------



## xer0h0ur

You didn't re-install the driver when you popped in the 2nd 295X2? I know my system hates me when I go from tri-fire to crossfire or from crossfire to tri-fire. It nearly always requires a driver re-install.


----------



## Roaches

Quote:


> Originally Posted by *smoggysky*
> 
> I just installed the second R9 295X2 Sapphire card. I ran COD Advanced Warfare SP and the FPS seemed low to me, fluctuating between 79 low and 115 high, averaging around 95 fps. I ran 3DMark Vantage Professional; my previous score with one R9 295X2 was P65030, but with the 2nd card installed I only got to P72040. Here is what I am troubled about: Catalyst is showing 3 cards, Device Manager is showing 3 cards, and the system info in 3DMark Vantage is showing 3 cards. Shouldn't these be showing 4 GPUs? Any ideas? Do I need to uninstall and reinstall the video drivers?
> Any help will be appreciated.


If you can, make sure ULPS is manually disabled for all cards in the system. If MSI AB isn't detecting all GPUs when your system is idle, ULPS can be the cause.



If all else fails, do a manual driver wipe and reinstall.


----------



## wermad

GPU-Z shows only three "R9 200"s?


----------



## Roaches

Just got my second card like just now, gonna be away for a bit to install this mofo. 1200W of heat here I come!




I got a bad feeling my G2 1300 is gonna pass out on this one.


----------



## Ragingun

I got my power supply figured out and have 60 amps getting to my card now. I HAMMERED my 3Dmark score!!!

http://www.3dmark.com/3dm/6197192?


----------



## xer0h0ur

Yeah, that isn't bad at all. I get 22K tri-fired, so that is a good score for a 295X2 alone.


----------



## smoggysky

I uninstalled the Catalyst 14.12 driver and reinstalled. Now Device Manager shows 4 GPUs and GPU-Z shows 4 GPUs, but CCC won't show OverDrive and shows two CrossFires instead??? Next I will uninstall the CCC driver again and install the driver that came with the Sapphire cards; I originally downloaded Cat 14.11 from the AMD web site. This is ridiculous. I did a Google search and a whole heck of a lot of people are having this same problem with these R9 295X2s. Nice to hear from you, wermad... fantastic build, by the way.


----------



## dndfm

Hi guys, I am new to this forum.
I have the opportunity to buy a second-hand R9 295X2 for cheap, almost a third of the new price in Norway. The price is about 450 US dollars.
But it's not tip-top: it has a leak in the rad. Not sure why, but the owner promises that the card itself works. Now, I did some digging and found out that AMD Britain could sell me an entire new AIO with shipping for about 130 USD, but I'm thinking of modding this thing. I'm wondering if it is possible to take two H50s and throw them on the card. Has anyone tried something similar here? The mountings for the water blocks are the same design; all I need is to enlarge the openings on the shroud for the tubes. Any thoughts?


----------



## littledebbie

Yeah, when I put on Crysis 3 there was no ramping up or down with the VRM fan; it stayed at a steady rate. Positive news is that the GPUs only got to 58 and 59°C with about 25 minutes of my crap-tastic play; they idle at about 37-ish. Should I pull the VRM fan off the PCB and see if that helps, or should I RMA it? It is seriously 5 times louder than all of my case fans right now.


----------



## xer0h0ur

Quote:


> Originally Posted by *smoggysky*
> 
> I uninstalled the Catalyst 14.12 driver and reinstalled. Now Device Manager shows 4 GPUs and GPU-Z shows 4 GPUs, but CCC won't show OverDrive and shows two CrossFires instead??? Next I will uninstall the CCC driver again and install the driver that came with the Sapphire cards; I originally downloaded Cat 14.11 from the AMD web site. This is ridiculous. I did a Google search and a whole heck of a lot of people are having this same problem with these R9 295X2s. Nice to hear from you, wermad... fantastic build, by the way.


Before you go re-installing the driver just restart the system. I have had that occur upon the initial restart from the install.


----------



## Mega Man

Quote:


> Originally Posted by *dndfm*
> 
> Hi guys, I am new to this forum.
> I have the opportunity to buy a second-hand R9 295X2 for cheap, almost a third of the new price in Norway. The price is about 450 US dollars.
> But it's not tip-top: it has a leak in the rad. Not sure why, but the owner promises that the card itself works. Now, I did some digging and found out that AMD Britain could sell me an entire new AIO with shipping for about 130 USD, but I'm thinking of modding this thing. I'm wondering if it is possible to take two H50s and throw them on the card. Has anyone tried something similar here? The mountings for the water blocks are the same design; all I need is to enlarge the openings on the shroud for the tubes. Any thoughts?


Should be possible; I just pulled mine off. I will try to take a pic later tonight (as I have to leave soon).

Quote:


> Originally Posted by *littledebbie*
> 
> Yeah, when I put on Crysis 3 there was no ramping up or down with the VRM fan; it stayed at a steady rate. Positive news is that the GPUs only got to 58 and 59°C with about 25 minutes of my crap-tastic play; they idle at about 37-ish. Should I pull the VRM fan off the PCB and see if that helps, or should I RMA it? It is seriously 5 times louder than all of my case fans right now.


Not trying to be rude, but it sounds like user error. We can help, but we need more info.

Is AA maxed? What res? Temps? Screenshots?

please fill out rig builder ( see my sig )


----------



## fishingfanatic

Sweet!!! Great job dude!!! I always feel euphoric after fixing an issue, and getting some new best scores makes it worth it for me.

FF


----------



## dndfm

I would really appreciate pictures if you could take them


----------



## wermad

BRB guys; had a doc's visit, finishing up the PSU list, and I'm trying to install SP1 again.

If a GPU goes missing, check GPU-Z first. AB sometimes needs a few restarts to pick up the cores again. If you still can't get both cores to show, run that card by itself and see if you can pick both up again. Also make sure your slot is showing at least x8 3.0.


----------



## Alex132

Quote:


> Originally Posted by *Roaches*
> 
> Just got my second card like just now, gonna be away for a bit to install this mofo. 1200W of heat here I come!
> 
> 
> 
> 
> I got a bad feeling my G2 1300 is gonna pass out on this one.


You're still using an old Zalman CPU cooler? Those were good in like 2008/9.

It must be getting all the hot air from your D13s (Double-D 26?) and heating up your CPU like mad.

Upgrade to AT LEAST an AIO, please


----------



## Roaches

Quote:


> Originally Posted by *Alex132*
> 
> You're still using an old Zalman CPU cooler
> 
> 
> 
> 
> 
> 
> 
> Those were good in like 2008/9
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It must be getting all the hot air from your D13s (Double-D 26?) and heating up your CPU like mad.
> 
> Upgrade to AT LEAST an AIO please


It was the only cooler at the time that didn't interfere with the RAM modules, as much as I'd like to change it. As for AIOs, I'll never put my hands on another after seeing one leak in person on a friend's PC.
Hoping someone like Cryorig comes out with something decent and LGA 2011 RAM-module friendly. Oh wait, I forgot there's Noctua









It does a good job keeping my CPU around 70-80°C during a long Mental Ray rendering run at 4.3GHz.

Anyhow, just finished installing the cards. I'm afraid to run any benches until the G2 1600 arrives.


----------



## xer0h0ur

I don't even care about the looks of the Devil 13. I just like the better power delivery, but it's a shame they never made a waterblock for it.


----------



## wermad

It's my first spreadsheet published on OCN. Let me know if you guys want to make changes.

edit: keep in mind this doesn't actually tell us the routing of the power. In most cases, the mb/cpu get the first rail or two (~20A each) in a multi-rail setup. If you can't find the distribution chart/diagram on the manufacturer's site, hit up their support. You can also PM me and I can look up the info.

double edit: please use the link to the spreadsheet for the OEM. Note some models (same SKU) may have multiple OEMs.

https://docs.google.com/spreadsheets/d/1o3o9xomRy1lL_jXvJzISzcxwgtyb7eDgfBxYjgAnyjs/pubhtml?widget=true&headers=false


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't even care about the looks of the Devil 13. I just like the better power delivery but its a shame they never made a waterblock for it.


To be honest, the Razer mouse bundled with it is kinda pointless, since buyers would already have a decent mouse. Though I wonder how well it would sell if PowerColor had partnered up with EK to bundle a block with it; even at the original MSRP it could've flown off the shelves if it had come with one. My jaw kinda dropped looking at 8+8+8+8 pins on a single card, and I wondered too if it was meant to be watercooled.


----------



## xer0h0ur

I swear that card would be a beast under a full coverage waterblock.


----------



## Feyris

Quote:


> Originally Posted by *Roaches*
> 
> To be honest, the Razer mouse bundled with it is kinda pointless, since buyers would already have a decent mouse. Though I wonder how well it would sell if PowerColor had partnered up with EK to bundle a block with it; even at the original MSRP it could've flown off the shelves if it had come with one. My jaw kinda dropped looking at 8+8+8+8 pins on a single card, and I wondered too if it was meant to be watercooled.


You could sell them and make back an extra 180 bucks. But yeah, it might be a nice and decent option vs the Ares.


----------



## Roaches

Quote:


> Originally Posted by *Feyris*
> 
> You could sell them and make back an extra 180 bucks. But yeah, it might be a nice and decent option vs the Ares.


Maybe when Newegg's 30-day return window has expired. Otherwise I'll keep it as a backup mouse when needed. It is quite comfy in a palm grip; other than that, it feels so lightweight and cheap compared to my Corsair M65.

Also, my system did unexpectedly shut down toward the end of 3DMark Fire Strike. As expected, the G2 1300 is a no-go on handling 2 Devils; gonna have to wait on the 1600.
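A rough back-of-envelope for why the 1300W unit trips under two of these cards: each 295X2 is rated around 500W, and two of them plus an overclocked CPU and the rest of the system can exceed the unit's continuous rating. The CPU and "rest of system" figures here are ballpark assumptions, not measurements from this build:

```python
def system_draw_w(gpu_tdp_w=500, n_gpus=2, cpu_w=250, rest_w=100):
    """Ballpark DC-side system draw in watts; all figures are rough assumptions."""
    return gpu_tdp_w * n_gpus + cpu_w + rest_w

draw = system_draw_w()
print(draw, draw > 1300)  # -> 1350 True (over a 1300W continuous rating)
```

Even before transients, this sketch lands above 1300W, which is consistent with waiting on the 1600W unit.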


----------



## Ragingun

Quote:


> Originally Posted by *Ragingun*
> 
> I got my power supply figured out and have 60 amps getting to my card now. I HAMMERED my 3Dmark score!!!
> 
> http://www.3dmark.com/3dm/6197192?


Apparently I spoke too soon. I have a full 60 amps going to the card now and I still had a random shutdown after gaming for 30 minutes. Really sucks losing all that progress, and I still don't know what the hell is going on!


----------



## Mega Man

Quote:


> Originally Posted by *dndfm*
> 
> Hi guys, I am new to this forum.
> I have the opportunity to buy a second-hand R9 295X2 for cheap, almost a third of the new price in Norway. The price is about 450 US dollars.
> But it's not tip-top: it has a leak in the rad. Not sure why, but the owner promises that the card itself works. Now, I did some digging and found out that AMD Britain could sell me an entire new AIO with shipping for about 130 USD, but I'm thinking of modding this thing. I'm wondering if it is possible to take two H50s and throw them on the card. Has anyone tried something similar here? The mountings for the water blocks are the same design; all I need is to enlarge the openings on the shroud for the tubes. Any thoughts?


I'll take pics in a min.

Looks like you could. Another option, if you want, is to buy a new rad and just cut out the old one; use 1/4 in hose barbs, etc.

It seems, though, the outer ring will need to be removed for the mounting to come off.


----------



## smoggysky

Great job on the spreadsheet, wermad!

Instead of reinstalling the Cat driver, I finally rebooted the system. It took 2 reboots and voila! 4 GPUs.

I still have one HD audio driver malfunctioning, but it's on the slave graphics card. I ran 3DMark Vantage Professional (Performance) but only got P74340; I'll work on it. I found a post online saying ULPS could cause this problem, so I ran regedit and disabled it. Then I played 2 levels of COD Advanced Warfare SP and was getting 150-250 fps... that made me smile...

I'll keep working on this.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ragingun*
> 
> Apparently I spoke too soon. I have a full 60 amps going to the card now and I still had a random shut down after gaming for 30 minutes. Really sucks loosing all that progress and I still don't know what the hell is going on!


Are you sure it's powered properly? Some power supplies have been shown to claim a single 12V rail while actually being multi-rail, so in the end people have to plug them in in a specific manner to run off two separate rails.


----------



## wermad

Quote:


> Originally Posted by *smoggysky*
> 
> Great job on the spreadsheet, wermad!
> 
> Instead of reinstalling the Cat driver, I finally rebooted the system. It took 2 reboots and voila! 4 GPUs.
> 
> I still have one HD audio driver malfunctioning, but it's on the slave graphics card. I ran 3DMark Vantage Professional (Performance) but only got P74340; I'll work on it. I found a post online saying ULPS could cause this problem, so I ran regedit and disabled it. Then I played 2 levels of COD Advanced Warfare SP and was getting 150-250 fps... that made me smile...
> 
> I'll keep working on this.


Thanks. I was very surprised how many PSUs I had to drop, as they didn't meet the AMD requirements. Also, it's very interesting how thin the list is for CrossFire; even a beefy PSU like the ST1500 Silver didn't make the cut for at least one card. I had lots of time in the late hours, as my new meds don't let me sleep. Might as well do something productive!

Glad to hear things are moving along! Yeah, sometimes a few reboots is what these guys need. I was taking some temperatures when I decided to check my kill-a-watt and accidentally flipped the switch. Juggling the laser thermometer and flashlight is a pita.

edit: going to reach out to Nav to see if he can add the spreadsheet to the OP. It's important to let owners or future owners know how demanding the power requirements are for this beast.

Btw, did you guys see the 2kW unit? I believe it's non-US, as it runs on 240/250V.

Quote:


> Originally Posted by *kayan*
> 
> Did you ever get a chance to check vrms with that koolance block?


Kayan, so sorry for the delays. I had to sort out my Windows, as 3DMark won't run without SP1. I was running a test and got some readings from the backs of the cards using my laser thermometer:

Card #1, which has the back VRM fully exposed, recorded ~56°C average for both sides.

Card #2 was a tad cooler (probably as the front intake fan is directing more air through it); it read about 54°C average on both sides. Very good considering VRMs are capable of ~120°C. Looking some info up in the AMD forums, I found a post citing Sapphire reps indicating they can do 100+ and 120°C as well. Going to run the benchmark again and post a screenshot of AB. AB is getting very touchy; it crashes frequently when running alongside another app. I really need to move on to Windows 8.1.


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Apparently I spoke too soon. I have a full 60 amps going to the card now and I still had a random shut down after gaming for 30 minutes. Really sucks loosing all that progress and I still don't know what the hell is going on!
> 
> 
> 
> Are you sure its powered properly? Some power supplies have been shown to lie about having single 12V rails and turned out to be multi-rail so in the end people have to plug them in a specific manner to run off two separate rails.
Click to expand...

Yes, I have six 12V rails, and two separate rails are connected to it.


----------



## dndfm

I could replace the rad, but I'd rather not because I have never tried to refill an AIO cooler before. Changing the entire thing for dual H50s seems a bit simpler and would probably perform better. But thanks for the help, man; all I needed to know was if dual H50s are compatible. Things will get tight in the Air 240. I'll post pics when done, thanks again <3

But if anyone here has any suggestions for the most badass and "ghetto" mod that would fit inside an Air 240, I will be very thankful


----------



## Mega Man

Sorry for the delays, hope they help.

In the last pic I tried to get it on camera; the mounting looks like it comes off by removing the outer screws on the bottom of the pump.
.... This didn't post last night for some reason....

Looks like the pics didn't either.

Sorry, I'll upload again when I am at home.


----------



## wermad

Everything stock. Need to delid that DC chip as it sits under 70°C at load. May do it today if I can find my ICD7.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ragingun*
> 
> Yes. I have 6 12v rails. 2 separate rails are connected to it.


What power supply are you running? You might as well fill out your system specs with the rigbuilder or sig it to make it easier for people to help you.


----------



## dndfm

thanks man


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Yes. I have 6 12v rails. 2 separate rails are connected to it.
> 
> 
> 
> What power supply are you running? You might as well fill out your system specs with the rigbuilder or sig it to make it easier for people to help you.
Click to expand...

I'll do that when I get to my PC. I have an Enermax Revo 1050w.


----------



## wermad

Most of the high-end Enermax/Lepa units have multiple 25-30A rails.

I almost fell off my chair when I saw the eight 30A rails the Antec HCP1200 has! Good lord!


----------



## xer0h0ur

Well, in that case it might be worth trying different rails to see if something is wrong with the ones you're currently using. But from what I remember each connector should be getting 30A, and it seems like your PSU is either cutting it too close or isn't up to snuff.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Most of the high end enermax/lepa units have multiple 25-30am rails.
> 
> I almost fell of my chair when I saw the eight 30amp rails the antec hcp1200 has! good lord!


Right, my Enermax has six 12V rails rated at 30 amps each. Crazy, but that's why I got it!


----------



## xer0h0ur

What I am reading on Enermax's site is that those 12V rails are rated for 30A. It's the OCP column where it lists 34-45A.

http://www.enermax.com/userfiles/image/revolution85_chart12.jpg


----------



## wermad

Yup, that's just overcurrent protection. I would never use more than the specs; otherwise, something can go boom. My Lepa can peak at 1700W, but it wasn't designed to maintain that.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Yup, that's just overcurrent protection. I would never use more than the specs; otherwise, something can go boom. My Lepa can peak at 1700W, but it wasn't designed to maintain that.


True. Trust the specs, no doubt. I believe my 1050W is also rated to run at 1400W for more than an hour. Regardless, the standard 30-amp rails are more than enough to feed this card, and it shouldn't be shutting down my system.


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> What I am reading on Enermax's site is that those 12V rails are rated for 30A. Its OCP where it lists 34-45A.
> 
> http://www.enermax.com/userfiles/image/revolution85_chart12.jpg


That's correct.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Yup, that's just overcurrent protection. I would never use more than the specs; otherwise, something can go boom. My Lepa can peak at 1700W, but it wasn't designed to maintain that.


The HX850 can go to ~1000W for an hour or so. Nice, considering it probably will go over 850W if I overclock.


----------



## wermad

edit: which rails are you using btw?


----------



## Alex132

Lol, still weird to see -12V for PCI.

Wonder when it's going to die out; I mean, it's already basically gone from motherboards.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> edit: which rails are you using btw?


12v 1 and 12v 5. Nothing else running on these rails either.


----------



## Alex132

Quote:


> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> edit: which rails are you using btw?
> 
> 
> 
> 
> 
> 
> 
> 12v 1 and 12v 5. Nothing else running on these rails either.
Click to expand...

Just try switching up rails? I dunno, without really looking it up that might do something lol.

Unless the cross-load between the rails is really that bad.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> What I am reading on Enermax's site is that those 12V rails are rated for 30A. Its OCP where it lists 34-45A.
> 
> http://www.enermax.com/userfiles/image/revolution85_chart12.jpg
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> 12V 1 and *12V 5*. Nothing else running on these rails either.
Click to expand...

Pic xer0h0ur posted:



If you're only using rail #5 for the GPU, that's your issue. You only have 30 amps where you need 50 amps for both connectors. Like my Lepa (same group as Enermax), you need to split the load across two rails for the 295X2.

Rail #3 is the fixed PCIe; use this one along with rail #5 for the 295X2. Or use #4 and #5, or #5 and #6.

I have my two cards like this:

Card #1: rail #3 + #4
Card #2: rail #5 + #6

Edit: if you're concerned about having unused cables, I removed the extra harness from each cable set I used for my cards. I'm using a ppcs.com pin extractor; it's been through some heavy sleeving and mods and keeps working. Lepa/Enermax pins are a pita to extract: push the cable inwards, insert the tool properly, and slide it out. You may need to give it a good tug/pull. Fingers will be hurting afterwards
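The rail math above can be sketched as a trivial check. AMD's guidance for the 295X2 is roughly 50A total on 12V, with about 28A available to each 8-pin feed; treat that 28A figure as an assumption if your card's docs say otherwise. The 30A values mirror the Enermax chart discussed above:

```python
def rails_ok(rail_amps, required_per_connector=28):
    """Check a multi-rail hookup for the 295X2.

    rail_amps: amps available to the card on each rail it is wired to.
    Each of the two 8-pin feeds needs its own rail with enough headroom.
    """
    return len(rail_amps) >= 2 and all(a >= required_per_connector for a in rail_amps)

print(rails_ok([30, 30]))  # two dedicated 30A rails -> True
print(rails_ok([30]))      # both 8-pins off one 30A rail -> False
```

This is why a single 30A rail trips OCP under load even though the PSU's total 12V capacity is huge.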


----------



## xer0h0ur

That is why I went with a single rail design to avoid these issues altogether. I still need to get a kill-a-watt meter though to check total system power draw at the outlet. I never did find out what it is.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> What I am reading on Enermax's site is that those 12V rails are rated for 30A. Its OCP where it lists 34-45A.
> 
> http://www.enermax.com/userfiles/image/revolution85_chart12.jpg
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> 12V 1 and *12V 5*. Nothing else running on these rails either.
> 
> 
> 
> 
> 
> Click to expand...
> 
> Pic xer0h0ur posted:
> 
> 
> 
> If you're only using rail #5 for the GPU, that's your issue. You only have 30 amps where you need 50 amps for both connectors. Like my Lepa (same group as Enermax), you need to split the load across two rails for the 295X2.
> 
> Rail #3 is the fixed PCIe; use this one along with rail #5 for the 295X2. Or use #4 and #5, or #5 and #6.
> 
> I have my two cards like this:
> 
> Card #1: rail #3 + #4
> Card #2: rail #5 + #6
> 
> Edit: if you're concerned about having unused cables, I removed the extra harness from each cable set I used for my cards. I'm using a ppcs.com pin extractor; it's been through some heavy sleeving and mods and keeps working. Lepa/Enermax pins are a pita to extract: push the cable inwards, insert the tool properly, and slide it out. You may need to give it a good tug/pull. Fingers will be hurting afterwards
Click to expand...

I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is why I went with a single rail design to avoid these issues altogether. I still need to get a kill-a-watt meter though to check total system power draw at the outlet. I never did find out what it is.


Issues only really exist in older PSUs not designed with this massive load in mind


----------



## xer0h0ur

Quote:


> Originally Posted by *Ragingun*
> 
> I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.


According to the chart I posted, 12V1 is being shared with the motherboard. That is not what you want to do. In fact, don't connect that rail to a video card at all.


----------



## wermad

Quote:


> Originally Posted by *Ragingun*
> 
> I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.


Are you using the fixed pcie cable?


----------



## Alex132

Don't use a 12V rail that is shared with the CPU/mobo lol...


----------



## wermad

Rails #1 and #2 are for the mb and cpu, so you must have plugged the extra CPU cable into the card (????). I hope you didn't; rail #3 is for the fixed 6+2/6+2 PCIe cables.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.
> 
> 
> 
> Are you using the fixed pcie cable?
Click to expand...

Yes. As it is listed it's actually 12V 3, not the 12V 1 as I said... it was just the combined physical location, so I called it 12V 1. Therefore I am actually using 12V 3 and 12V 1.


----------



## Ragingun

Quote:


> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.
> 
> 
> 
> Are you using the fixed pcie cable?
> 
> Click to expand...
> 
> Yes. As it is listed it's actually 12V 3, not the 12V 1 as I said... it was just the combined physical location, so I called it 12V 1. Therefore I am actually using 12V 3 and 12V 1.
Click to expand...

Sorry! 12v 3 and 12v 5 lol


----------



## wermad

Kewl.

For troubleshooting, try another rail along with #3. If that doesn't work, try a different combo with #5.


----------



## Alex132

Quote:


> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.
> 
> 
> 
> Are you using the fixed pcie cable?
> 
> Click to expand...
> 
> Yes. As it is listed it's actually 12v 3, not the 12v 1 as I said...it was just combined physical location so I called it 12v 1. Therefore I am actually using 12v 3 amd 12v 1.
> 
> Click to expand...
> 
> Sorry! 12v 3 and 12v 5 lol
Click to expand...

So like wermad who's running:

Card #1: rail #3 + #4
Card #2: rail #5 + #6

?

Try that - if it works then, well great. If it doesn't, moar combos, then if that doesn't work -> RMA / get new PSU / Check cards / rails individually.
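If shuffling rails by hand gets tedious, the pairings are easy to enumerate; a quick sketch in Python (the rail numbers are just the ones discussed in this thread; adjust for your PSU):

```python
from itertools import combinations

# PCIe-capable rails per the discussion above; rails 1-2 feed the mobo/CPU.
pcie_rails = [3, 4, 5, 6]

# Every distinct pair of rails the card's two 8-pin plugs could draw from.
pairs = list(combinations(pcie_rails, 2))
for a, b in pairs:
    print(f"try 12V{a} + 12V{b}")

print(len(pairs), "combos total")  # 6 pairings to work through
```

Working through all six pairs tells you whether one specific rail is the culprit or the problem follows the card.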


----------



## Ragingun

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> I have both the 12v 1 AND the 12v 5 going to the card. That is 60 amps, not 30.
> 
> 
> 
> Are you using the fixed pcie cable?
> 
> Click to expand...
> 
> Yes. As it is listed it's actually 12v 3, not the 12v 1 as I said...it was just combined physical location so I called it 12v 1. Therefore I am actually using 12v 3 amd 12v 1.
> 
> Click to expand...
> 
> Sorry! 12v 3 and 12v 5 lol
> 
> Click to expand...
> 
> So like wermad who's running:
> 
> Card #1: rail #3 + #4
> Card #2: rail #5 + #6
> 
> ?
> 
> Try that - if it works then, well great. If it doesn't, moar combos, then if that doesn't work -> RMA / get new PSU / Check cards / rails individually.
Click to expand...

I've never had this with this psu before, even in triple SLI. I've tested the amps and volts. I just got the card last week, so it's likely that. I only have a single card, not 2 cards, so I should have much more than enough power.


----------



## Alex132

Quote:


> Originally Posted by *Ragingun*
> 
> I've never had this with this card before even in triple sli. I've tested the amps and volts. I just got the card last week so it's likely that. I only have a single card not 2 cards so there's no reason I don't have much more than enough power.


Instant shutdown = PSU issue almost always.

It'd be nice if you could use a friend's PSU to test tbh...

Heck it could be something obscure like 12v1 being overloaded, and VRMs not getting enough power and shutting down? Is that even possible? I know 12v1 is shared with CPU... and the load on CPU could get to be a fair amount...


----------



## Ragingun

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragingun*
> 
> I've never had this with this card before even in triple sli. I've tested the amps and volts. I just got the card last week so it's likely that. I only have a single card not 2 cards so there's no reason I don't have much more than enough power.
> 
> 
> 
> Instant shutdown = PSU issue almost always.
> 
> I'd be nice if you could use a friend's PSU to test tbh...
> 
> Heck it could be something obscure like 12v1 being overloaded, and VRMs not getting enough power and shutting down? Is that even possible? I know 12v1 is shared with CPU... and the load on CPU could get to be a fair amount...
Click to expand...

Right. But to clarify, as I did before, 12v 3 and 12v 5 are going to my card. They are isolated rails delivering full potential to the card, so 60 amps is getting there according to my Fluke 12v tester.


----------



## xer0h0ur

Just try using different rails until you find a setup that works for you. How old is this power supply, as in how many years has it been in use?


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just try using different rails until you find a setup that works for you. How old is this power supply, as in how many years has it been in use?


3 years at most. I suppose I could leave a volt/amp meter on it while I'm gaming; I only checked it once while in game.


----------



## xer0h0ur

Do you by any chance have extra PCI-E cables so you can connect to a different rail so you're not using the 12V3 rail? So for instance 12V4 and 12V6 or 12V5 and 12V6 or 12V5 and 12V4? Basically I just wonder if the standard 12V3 rail may have been compromised. Or hell it may be the opposite where 12V3 is fine and 12V5 is the culprit.


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> Do you by any chance have extra PCI-E cables so you can connect to a different rail so you're not using the 12V3 rail? So for instance 12V4 and 12V6 or 12V5 and 12V6 or 12V5 and 12V4? Basically I just wonder if the standard 12V3 rail may have been compromised. Or hell it may be the opposite where 12V3 is fine and 12V5 is the culprit.


I switched the 12v3 rail to the 12v5 rail. We shall see after this. What I also find interesting is that this PSU has over-voltage protection, so if it were the PSU and it shut off due to that, the indicator light should turn red. It does not. It goes from green (the on position) to orange (the off position) when it shuts off during games. There is no red indicator light, and I know it works, because when I forget to plug something in, such as the CPU 8-pin plug, it turns red when I attempt to start the computer and won't allow it to start.


----------



## xer0h0ur

You were already hooked up to the 5th rail before though so as long as you aren't connecting both PCI-E power cables to the same rail it should be fine.


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> You were already hooked up to the 5th rail before though so as long as you aren't connecting both PCI-E power cables to the same rail it should be fine.


Right. I switched the 12v 5 to the 12v 6 and moved the 12v 3 to the 12v 5. So I tried out the 12v 5 and 12v 6 and just had the same thing happen twice in 2 different games, and still no red warning light from the PSU. Gotta be the card. It's the only thing I changed since I returned my other cards.


----------



## xer0h0ur

The only other thing I would try would be to hook up again now avoiding the 5th rail. If that doesn't do the trick then you may be right about the card being the issue. Out of curiosity you're not running some extreme overclock on your processor are you?


----------



## wermad

Ragin, are you using any extensions or custom cables for your gpu?


----------



## Mega Man

Finally home, here's the pics, so sorry for the delay.

Ahhh, they are too big! That's why it didn't upload.

https://onedrive.live.com/redir?resid=4832FA7A23C5C469!3333&authkey=!ACUFe0HdWTXyrNA&ithint=folder%2c


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> The only other thing I would try would be to hook up again now avoiding the 5th rail. If that doesn't do the trick then you may be right about the card being the issue. Out of curiosity you're not running some extreme overclock on your processor are you?


No extreme OC. I typically run my 5820k at 4ghz. I can OC to 4.8ghz and never had an issue but for normal day to day use its at 4ghz.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Ragin, are you using any extensions or custom cables for your gpu?


Not for my gpu. I have an extension for my cpu, but I've had that for years and with much higher-draw CPUs too.


----------



## wermad

Run a cpu-only bench and see if it shuts off (Prime95, AIDA64, IBT, etc.).


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> run a cpu only bench and see if the shuts off (prime, aid64, ibt, etc.).


I will do that as well as switch the 12v 5 to the 12v 4.


----------



## Ragingun

Might have figured it out y'all! My PSU fan ain't workin. Time to replace that and I'll post back up tonight!


----------



## wermad

Any warranty to rma?


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Any warranty to rma?


Ya, Newegg 30 day exchange policy. Like I mentioned I just bought this 7 days ago.


----------



## silencespr

Starting not to like my R9 295x2; the thing gets so hot playing BF4 at 4K.


----------



## wermad

@ragingun

Wow, you recently bought that one? On special?


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> @ragingun
> 
> Wow, you recently bought that one? On special?


Yup. That's why I am concerned it's the card. It didn't happen with my tri sli 970. It was on special for $599.


----------



## Ragingun

Quote:


> Originally Posted by *silencespr*
> 
> Starting not to like my R9 295x2 thing gets so hot playing 4k in BF4


What temps? I play at 1440p all settings maxed and temps are never above 70.


----------



## Alex132

Quote:


> Originally Posted by *Ragingun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> @ragingun
> 
> Wow, you recently bought that one? On special?
> 
> 
> 
> Yup. That's why I am concerned it's the card. It didn't happen with my tri sli 970. It was on special for $599.
Click to expand...

Oh you also jumped on that?


----------



## Alex132

What PPD would I get on a 295X2?

This thing would be my space-heater in Winter


----------



## Ragingun

Yup I jumped on it. It gets lower temps than my 970's did.


----------



## wermad

So the psu fan is not working, or something else? Did you get the psu recently? It's kind of an older model....

edit: I'm a bit confused, as I was referring to the psu; you mentioned its fan is not spinning, and I was asking if you had just recently purchased that.


----------



## xer0h0ur

Yeah I am confused. You said PSU fan isn't working and that the PSU is like 3 years old but then you respond to a question about warranty with the 30 day warranty for the video card. Wat?


----------



## electro2u




----------



## wermad

Lol, I luv that movie!

Socal temps just went from 65 to 90f...frahk!


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> Lol, I luv that movie!
> 
> Socal temps just went from 65 to 90f...frahk!


It's in the 90s right now at work here in LA.


----------



## Alex132

90°F is child's play.

125°F in the shade is where it's at yo.


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I am confused. You said PSU fan isn't working and that the PSU is like 3 years old but then you respond to a question about warranty with the 30 day warranty for the video card. Wat?


Lol! Thought ya meant the card! No warranty that I'm aware of left on the psu.


----------



## wermad

Ah, ok. Well, just pick up a fan and see how long it will hold. But for the sake of the expensive hardware, I think a new unit might be in order. There's still a bit of ex-mining surplus on ebay, might wanna head there. I have a V1000 if you're interested as well.

edit:

A little bit of oc'ing:


----------



## Ragingun

What CPU you running wermad?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ragingun*
> 
> What CPU you running wermad?


Intel i5 4690K


----------



## Ragingun

I think I posted this already but here is my score with one card. http://www.3dmark.com/3dm/6197192

And I've got SUCCESS! Just ran for over an hour and NO CRASHES! Had to have been my PSU fan. Good deal!


----------



## wermad

Nice score! I'm @ 4.6 but stock everything else. I spent the day yesterday exploring what my little cpu can do.


----------



## Roxycon

To the ones with an EK WB on their cards: did you use PVC washers between the backplate and the card? The instructions don't mention the washers, but they were required on my 680's, so I'm just worried I guess.


----------



## wermad

http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869093.pdf

http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869079.pdf

Interesting: the block install shows washers on some holes, but the backplate instructions don't show them when you reuse those holes.

To err on the side of safety, I would use them (knowing how much of a pita that will be).

Edit: anyone interested in nominating themselves or someone to take over and run the club? Nav hasn't been online in a bit and I would like to get the op updated and regularly managed (I'm sure he's busy and it does happen to many clubs). Once we find someone, we can reach out to the section mod.


----------



## xer0h0ur

Quote:


> Originally Posted by *Roxycon*
> 
> to the ones with EK WB on their cards, did you use PVC washers between the backplate and the cards? instructions doesn't mention the washers but it was required on my 680's so im just worried i guess


I did use them at least.


----------



## starichok

Guys, I need your help: my 295x2 is getting too hot, around 74-75°C in regular gaming.
Any tips on making it run cooler? I bought new fans and even opened the case's side door, which helps a little and keeps the temperature at 73 at most, but I have to keep my case on the table to hold it at that level; if I put my desktop on the floor under the table and leave the side door open, it gets hot again....
So I'm looking for a solution... should I buy some cooling kit for the card? Or any other ideas? Thank you in advance.


----------



## Gabe324

How is this???

http://www.3dmark.com/fs/4325765 (didn't overclock at all; temps were around 60-62)


----------



## starichok

Good score man! I'm getting 18325 in Fire Strike, with temperatures of 64-68.
But as soon as I do some gaming, temperatures get too high and I start throttling.


----------



## Feyris

Had to buy a 1000W PSU for a friend, so I get to test quadfire. (C'mon Thermaltake: Ultra of all people puts a single 90A, 996W 12V rail on their 1kW PSU; what were you thinking with the TP 1350?)

I wonder who makes the Ultra X4 1000W Special Edition though.

Apparently it is the Andyson E-Series platform.


----------



## Gabe324

Guys, is there an option to turn off the second R9 295x2 when you are just browsing? It was doing it a couple hours ago, but now it doesn't... ever since I was messing with MSI Afterburner. I thought it was because ULPS is disabled, but I enabled it again and it's not turning off the second GPU... what can I do at this point?


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869093.pdf
> 
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109869079.pdf
> 
> Interesting, the block install says you do on some holes and the backplate doesn't show them when you reuse those holes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To err on the side of safety, I would use them (knowing how of a pita that will be).
> 
> Edit: anyone interested in nominating themselves or someone to take over and run the club? Nav hasn't been online in a bit and I would like to get the op updated and regularly managed (I'm sure he's busy and it does happen to many clubs). Once we find someone, we can reach out to the section mod.


I didn't, and it works fine.

If you look how close the oem backplate sits, there is no worry about a short; they already snipped the long legs on the soldered parts.
Quote:


> Originally Posted by *Feyris*
> 
> Had to buy a 1000w psu for friend SO get to test quadfire. (cmon thermaltake ULTRA of all people has 12 90a single 996w rail on their 1KW psu, what were u thinking for the TP 1350)
> 
> I wonder who makes the Ultra X4 1000W Special Edition though
> 
> Apparently it is Andyson E- Series platform


... wanna be my friend
Quote:


> Originally Posted by *Gabe324*
> 
> guys , is there a option to turn of the second r9 295x2 when you are just browsing , it was doing it a couple hours ago but now it doesnt.... ever since i was messing with msi afterburner. i thought it was because ulps is disabled but i enabled it again and its not turning off the second gpu..... what can i do at this point?


It is ULPS. Did you restart your PC? If not, are you using Flash?


----------



## Gabe324

Quote:


> Originally Posted by *Mega Man*
> 
> I didn't and it works fine.
> 
> If you look how close the oem back plate is there is no worry about a short. They already snipped the long legs. On soldered things
> ... wanna be my friend
> It is upls. Did you restart your pc? If not are you using flash


I restarted the pc 3 times already; for a few seconds it's off, then it turns on.... no idea why.


----------



## wermad

Quote:


> Originally Posted by *starichok*
> 
> Guys need your help,my 295x2 is getting too hot,around 74-75 at regular gaming.
> Any tips on making it run colder?bought new fans,even open door to the case,which helps a little and let the temperature stays at 73 most,but i have to keep my case on the table in order to keep this temperature at this level,if i put my desktop on the floor under the table and leave side door open,it get hot again....
> So im looking for solution...should i buy some cooling kit for th card?or any ideas?thank you in advance


Check your case's air flow. I had a 10C difference when I moved my case from under my desk to the top. I have a single chamber for my custom water loop, so it helps to feed it fresh air. Also, check the air flow to the card itself to ensure the vrm's and plx are not getting too hot.

Recommend you fill out your specs in the rig builder (click your name on the top right and go down the menu; click edit to add your rig's specs).
Quote:


> Originally Posted by *Gabe324*
> 
> how is this???
> 
> http://www.3dmark.com/fs/4325765 didn't overclock at all temps where like around 60 - 62
> 
> Quote:
> 
> 
> 
> Originally Posted by *starichok*
> 
> Good score man!im having 18325 at fire strike,and temperatures 64-68
> But as soon as i do some gaming,temperatures come to high and i start throtting
Click to expand...

Benches only run your gpu's at max for a few stints. Remember the last two tests of 3DMark 11 and FS are the cpu and combined tests. Looking at the usage in AB, only the gpu-dedicated tests reach 99%, and also consider the brief cool-downs between tests. A game is constantly running, so temps will rise sooner.

Quote:


> Originally Posted by *Feyris*
> 
> Had to buy a 1000w psu for friend SO get to test quadfire. (cmon thermaltake ULTRA of all people has 12 90a single 996w rail on their 1KW psu, what were u thinking for the TP 1350)
> 
> I wonder who makes the Ultra X4 1000W Special Edition though
> 
> Apparently it is Andyson E- Series platform


I was actually filling in the oem on the list I compiled but it was nicked to make the list asap. I did leave the link to the site where you can find the oem pretty easily. Surprising on how so many units with the same model sku are made by a few different oem's. I do like how some companies will change the sku naming to indicate a big change.

Quote:


> Originally Posted by *Gabe324*
> 
> guys , is there a option to turn of the second r9 295x2 when you are just browsing , it was doing it a couple hours ago but now it doesnt.... ever since i was messing with msi afterburner. i thought it was because ulps is disabled but i enabled it again and its not turning off the second gpu..... what can i do at this point?
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I didn't and it works fine.
> 
> If you look how close the oem back plate is there is no worry about a short. They already snipped the long legs. On soldered things
> ... wanna be my friend
> It is upls. Did you restart your pc? If not are you using flash
Click to expand...

ULPS, as mega said. Though it usually turns off the other cores, not a single card; you will notice that cores #2-4 will show 0°C and 0% usage.

Quote:


> Originally Posted by *Gabe324*
> 
> i restarted pc 3 times already and for a few seconds its off then turns on.... no idea why


Usually it's on. Try a fresh install of drivers; before you do this, I would recommend uninstalling AB and Trixx and making sure there's no folder left. Reinstall these guys after the driver install and see if ulps kicks in.

Gabe, it also helps if you fill your specs in the rig builder (follow the instructions i gave starichok).
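For anyone chasing the ULPS toggle by hand: it usually lives as an `EnableUlps` DWORD under the display adapter's class key in the registry, which is what Afterburner and Trixx flip for you. A sketch of the .reg change (back up the key first; the `0000` instance index here is an assumption, as the Radeon entries may sit under 0001/0002 or higher on your system):

```reg
Windows Registry Editor Version 5.00

; Re-enable ULPS (dword:00000000 disables it, 00000001 enables it).
; {4d36e968-e325-11ce-bfc1-08002be10318} is the standard Display adapters
; class GUID; check which numbered subkey actually holds your Radeon
; before importing, and back the key up first.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000001
```

CrossFire setups typically need the value set on every GPU subkey, not just the first, which is why letting the utilities handle it is the safer route.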


----------



## Mega Man

Quote:


> Originally Posted by *Gabe324*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I didn't and it works fine.
> 
> If you look how close the oem back plate is there is no worry about a short. They already snipped the long legs. On soldered things
> ... wanna be my friend
> It is upls. Did you restart your pc? If not are you using flash
> 
> 
> 
> i restarted pc 3 times already and for a few seconds its off then turns on.... no idea why
Click to expand...

If you already restarted, it sounds to me like you have hardware acceleration enabled.

You need to disable it in several places, and Firefox itself is extremely hard to shut off.

Ninja edit: proof of concept

Powered from my aquaero; also to note, it is not dimmable via PWM.


----------



## Feyris

Random, but: I went ahead and registered my 295X2 from Newegg on xfxsupport. Instead of my customer name (like my 7990 shows on the site), when I view the warranty info by S/N I get a customer name of "MAGNE1" on both, with shipment dates of 1/14/15. I am so confused.


----------



## remnant

Would two of these cards be limited by having a Z87 mobo or an i5?


----------



## electro2u

Quote:


> Originally Posted by *remnant*
> 
> Would two of these cards be limittd by haveing a z87 mobo or an i5


Not at all


----------



## wermad

Quote:


> Originally Posted by *remnant*
> 
> Would two of these cards be limittd by haveing a z87 mobo or an i5


-i5 (looks at rig)

-8x pcie 3.0 (see hard ocp quad-crossfire review using MVE Z87 8x/8x 3.0)


----------



## Alex132

Heck, I'm going to test it out with an old-ass i5 and PCIe 2.0.

I am pretty certain that the same amount of bandwidth is utilized for the 295X2 as for the 290X, because each core doesn't 'talk' to the CPU/mobo at the same time; they alternate through the use of the PLX chip. Right?

We'll only see limitations of PCIE 3 being reached with 390X HBM


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Check your case's air flow. I had a 10c difference when i placed my case from under my desk to the top. i have a single chamber for my custom water loop and so it helps to feed it fresh air. Also, check the air flow to the card itself to ensure the vrm's and plx are not getting too hot.
> 
> Recommend you fill out your specs in the rig builder (click your name on the top right and go down the menu. Click edit to add your rig's specs.
> Benches will run your gpu's max for a few stints. Remember the last two benches of 3d11 and FS are cpu and combo. Looking at the usage in Ab, only the gpu dedicated tests reach 99% and also consider the brief cool down's between tests. A game is constantly running and so temps will rise soon.
> I was actually filling in the oem on the list I compiled but it was nicked to make the list asap. I did leave the link to the site where you can find the oem pretty easily. Surprising on how so many units with the same model sku are made by a few different oem's. I do like how some companies will change the sku naming to indicate a big change.
> Ulps as mega said. Though it usually turns off the other cores not a single card. You will notice that core # 2-4 will show 0°C and 0% usage.
> Usually its on, try a fresh install of drivers. Before you do this, I would recommend uninstall AB and Trixx, make sure there's no folder left. Reinstall these guys after a driver install. See if ulps kicks in.
> 
> Gabe, it also helps if you fill your specs in the rig builder (follow the instructions i gave starichok).


I'm pretty sure the Andyson E platform sucks because, well... the PSU smells funny.


----------



## ReV2ReD

Quote:


> Originally Posted by *remnant*
> 
> Would two of these cards be limittd by haveing a z87 mobo or an i5


I had two of them in a Z77 Ivy Bridge (PCIe 3.0) i5 3570k (4.6 OC) system, and it did not run well. Benchmarks like Firestrike ran great, but I had major problems in all the games that I play regularly (Battlefield 4, Shadow of Mordor, Far Cry 4, Far Cry 3, all of my flight sims). I ended up returning one of the 295x2s and replaced it with a single 290x.

I now run Trifire with a 295x2 and a 290x, and I'm mostly happy. I do have a weird problem where my games load noticeably slower with the 290X enabled (which was even worse with the second 295x2), but it's not a major issue, just a little annoying (if anyone has any thoughts on this, I would love to hear them).

Were I to guess at my Quadfire problem, I would venture that the PLX chips on each of the 295x2s were playing hell on the PCIe lanes from the i5. It looks like those running the X99 systems (which, in effect, basically doubles the available PCIe lanes) are not having any problems at all. I'm not sure how the Z97 Haswell systems are faring with Quadfire 295x2s.

Oh, and keep in mind that you will need a hefty power supply to run the two 295x2s. With each card requiring two dedicated 28A 8-pin PCIe connectors, you're looking at 112A just for your video cards at full draw.
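To put those connector numbers in perspective, the back-of-the-envelope 12V math looks like this (a sketch using the 28A-per-plug figure quoted above, not a measured draw):

```python
# Quadfire 295x2 12V budget, per the figures in the post above.
CONNECTOR_AMPS = 28       # capability expected per 8-pin PCIe plug
CONNECTORS_PER_CARD = 2   # each 295x2 takes two 8-pins
CARDS = 2
VOLTS = 12

total_amps = CONNECTOR_AMPS * CONNECTORS_PER_CARD * CARDS
total_watts = total_amps * VOLTS

print(total_amps)   # 112 A just for the video cards
print(total_watts)  # 1344 W of 12V capability needed before CPU/mobo draw
```

That's why multi-rail units with 25-30A rails need careful cable routing here, while a big single-rail unit just needs enough total amperage.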


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> Im pretty sure andyson E sucks because well... PSU smells funny.


----------



## DraDeC

Hello everyone!

Another official R9 295x2 owner here.

I am worried about its performance in some benchmarks; I suspect there must be something wrong (or maybe not, but I can't figure out what's going on).

I have run some benchmarks on my high-end system, and there are two in particular that are giving a low score in my opinion. Since I am not an expert, I prefer to share it with people who are.

Let's check the results of 3DMark Fire Strike (normal, not Extreme, not 4K). The score is quite low, especially the graphics tests O_O

Here's the link: http://www.3dmark.com/3dm/6263301?

Take a look not just at the global score but especially the graphics one O_O Isn't it really low? Isn't it supposed to score higher?

I really don't know what's going on. It is a fresh setup, clean, latest drivers. I tried to overclock the card but, man, I kept getting restarts... it was quite unstable.

Sometimes during the test it is noticeable how it stops and slows down, dropping fps quite a lot. It is also noticeable when testing Unigine Heaven 4.0.

The monitor is 4K, but the resolution for the test has been set to 1080p. The PSU is an Enermax Platimax 1500W (I had the idea to try another R9 295x2 but... well...). Of course the CrossFire option is activated, but not the override one.

Any ideas or opinions are more than welcome.

Meanwhile, I'll just go to my dark corner and cry lol

Thank you!


----------



## Alex132

What are your temps during the runs? Did you do a fresh driver install (i.e. Driver Sweeper etc.) before you installed the GPU? A fresh install of Windows can't hurt either...

That's very bad compared to my GTX690 even...


----------



## DraDeC

Up to 65°C the whole time. I have been pushing it further with several benchmarks and it has not risen much. Which is curious, because it reaches 74°C (the max I've seen so far) when gaming for hours or watching movies (full screen at 4K resolution).

After checking the comparison I want to cry more.

I have noticed something with the CPU-Z program: the first GPU reports a core clock of 1018MHz and a memory clock of 1250MHz, but the second GPU... something is wrong, it says core clock 300MHz and memory clock 150MHz O_O

I don't know why.

I will test with MSI Afterburner and see if I can change it. Last time I tried with Catalyst (where I don't know if I can choose the GPU), just trying to raise the core clock to 1050MHz made the system unstable.


----------



## remnant

Quote:


> Originally Posted by *electro2u*
> 
> Not at all


Quote:


> Originally Posted by *wermad*
> 
> -i5 (looks at rig)
> 
> -8x pcie 3.0 (see hard ocp quad-crossfire review using MVE Z87 8x/8x 3.0)


Quote:


> Originally Posted by *Alex132*
> 
> Heck, I'm going to test it out with an old-ass i5 and PCIE 2.0
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am pretty certain that the same amount of bandwidth is utilized for the 295X2 as the 290X. Because each core doesn't 'talk' to the CPU/Mobo at the same time, they alternate through the use of the PLX chip. Right?
> 
> We'll only see limitations of PCIE 3 being reached with 390X HBM


Ok, thanks guys. I'm working toward a graduation upgrade, getting parts as I can on sale or used (gotta offset the cost of the 295x2). Wanted to know before I chose to get a used Z87 and the RAM I wanted for cheap.

edit: I googled Hard ocp... and wasn't sure exactly what I was looking for.


----------



## joeh4384

Quote:


> Originally Posted by *DraDeC*
> 
> Hello everyone!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another official R9 295x2 owner here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am worried about its performance during some benchmarks, I suspect there must be something wrong (or may be not but I am not being able to wonder what's going on).
> 
> I have done some benchmarks to my high end system , and there are 2 specially benchmarks which are giving low score in my opinion. Since I am not an expert , I prefer to share it with people who is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let's check the results of 3D Mark firestrike (normal , not extreme, not 4K) . The score is quite low , specially the graphic tests O_O
> 
> Here's the link : http://www.3dmark.com/3dm/6263301?
> 
> Take a look not the global score only but specially the graphic one O_O Isn't it the really low? Ins't it supposed to score higher?
> 
> I really don't know what's going on. It is a fresh set up , clean , latest drivers. I tried to overclock the card but , man , I didn't stop having restarts...it was quite unstable.
> 
> Sometimes during test , it is noticeable how it gets stops and goes slow dropping fps quite much. It is also noticeable when testing Unigine Heaven 4.0
> 
> Monitor is 4K but the resolution for test has been set to 1080 . PSU is an Enermax Platimax 1500w , (I had the idea to try another R9 295x2 but ....well...
> 
> 
> 
> 
> 
> 
> 
> ) Of course crossfire option is activated, but not override one.
> 
> Any ideas, opinions are more than welcome.
> 
> Meanwhile , I just will go to my dark corner and cry lol
> 
> Thank you!


My 295x2 scores around 22k in the gpu score on firestrike. My 290x in another rig gets around 12k or so in graphics score. Something doesn't seem right with your card.

295x2
http://www.3dmark.com/fs/4309213

290x
http://www.3dmark.com/fs/3952564


----------



## wermad

Quote:


> Originally Posted by *remnant*
> 
> Ok, thanks guys. I'm looking I'm working toward a graduation upgrade getting parts as I can on sale or used(gotta offset the cost of 295x2
> 
> 
> 
> 
> 
> 
> 
> ). Wanted to know before I chose to get a used z87 and the ram I wanted for cheap.
> 
> edit
> 
> 
> 
> 
> 
> 
> 
> : I googled Hard ocp ... and wasn't sure exactly what I was looking for.


http://m.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review#.VQjtYxBlA0N


----------



## joeh4384

Quote:


> Originally Posted by *DraDeC*
> 
> Hello everyone!
> 
> Another official R9 295x2 owner here.
> 
> I am worried about its performance in some benchmarks; I suspect there must be something wrong (or maybe not, but I can't figure out what's going on).
> 
> I have run some benchmarks on my high-end system, and two of them in particular are giving a low score, in my opinion. Since I am not an expert, I prefer to share it with people who are.
> 
> Let's check the results of 3DMark Fire Strike (normal, not Extreme, not 4K). The score is quite low, especially the graphics tests O_O
> 
> Here's the link: http://www.3dmark.com/3dm/6263301?
> 
> Take a look not only at the global score but especially the graphics one O_O Isn't it really low? Isn't it supposed to score higher?
> 
> I really don't know what's going on. It is a fresh setup, clean, latest drivers. I tried to overclock the card but, man, I kept getting restarts... it was quite unstable.
> 
> Sometimes during the test it noticeably stops and goes, dropping fps quite a lot. It is also noticeable when testing Unigine Heaven 4.0.
> 
> The monitor is 4K but the resolution for the test was set to 1080. The PSU is an Enermax Platimax 1500W (I had the idea to try another R9 295x2 but... well...). Of course the CrossFire option is activated, but not the override one.
> 
> Any ideas or opinions are more than welcome.
> 
> Meanwhile, I'll just go to my dark corner and cry lol
> 
> Thank you!


What are the specs of your rig and do you have decent airflow? It would be helpful to post a graph with temps and clock speeds on both GPUs.


----------



## Alex132

Quote:


> Originally Posted by *DraDeC*
> 
> Up to 65ºC the whole time. I have been pushing it further with several benchmarks and it has not risen much. Which is curious, because it reaches 74ºC (the max I've seen so far) when gaming for hours or when watching movies (full screen at 4K resolution).
> 
> After checking the comparison I want to cry more.
> 
> I have noticed something with the CPU-Z program: the first GPU reports a core clock of 1018MHz and a memory clock of 1250MHz, but the second GPU... something is wrong. It says core clock 300MHz and memory clock 150MHz O_O
> 
> I don't know why.
> 
> I will test with MSI Afterburner and see if I can change it. Last time I tried with Catalyst (where I don't know if I can choose the GPU), just trying to raise the core clock to 1050MHz made the system unstable.


Try disabling Zero-core


----------



## DraDeC

Quote:


> Originally Posted by *Alex132*
> 
> Try disabling Zero-core


How do I do that with MSI Afterburner? (I am a newbie with this program, first time using it.)

Quote:


> Originally Posted by *joeh4384*
> 
> My 295x2 scores around 22k in the gpu score on firestrike. My 290x in another rig gets around 12k or so in graphics score. Something doesn't seem right with your card.
> 
> 295x2
> http://www.3dmark.com/fs/4309213
> 
> 290x
> http://www.3dmark.com/fs/3952564


OMG, ok, now I have a reference. Thank you joeh4384!
Quote:


> Originally Posted by *joeh4384*
> 
> What are the specs of your rig and do you have decent airflow? It would be helpful to post a graph with temps and clock speeds on both GPUs.


My specs:

CPU: Intel Core i7-5820K 3.3GHz
GPU: R9 295x2
CPU cooler: Noctua NH-D15 (air)
Motherboard: ASUS X99 Deluxe (1305 BIOS)
PSU: Enermax Platimax 1500W Modular 80 Plus Platinum
RAM: G.Skill Ripjaws 4 Blue DDR4-2133 PC4-17000 32GB CL15
Monitor: ASUS PB287Q 28" LED 4K UltraHD (set at 1080p resolution)

Last night I was doing a late check and noticed something that may be bad. I didn't like what I saw, so I am going to share it for your expert opinions.
CrossFire is enabled through Catalyst Control Center using the latest driver. Overdrive is not activated.

This is what CPU-Z shows (it looks like the second GPU is not working, or sleeping, or who knows).

Is there any way to change this through GPU Tweak or MSI Afterburner?

Most results I've found on Google so far in the few minutes of my work break (lol yeah, sneaky ninja) are about uninstalling and reinstalling the drivers, and testing even beta ones. I will give it a try once I get a chance today.

Am I the only one having this issue?
Thank you everyone for the ideas and help.


----------



## Feyris

That's normal. ULPS/ZeroCore disables the 2nd GPU when it is not needed, to save power and lower heat. If you play a game it'll show in GPU-Z with the same clocks as GPU 1. It only does that when idling and with light loads like Flash, etc.


----------



## DraDeC

Quote:


> Originally Posted by *Feyris*
> 
> That's normal. ULPS/ZeroCore disables the 2nd GPU when it is not needed, to save power and lower heat. If you play a game it'll show in GPU-Z with the same clocks as GPU 1. It only does that when idling and with light loads like Flash, etc.


Thank you Feyris, if it's normal then I won't worry.

Once I get the chance at home I will try this: http://www.overclock.net/t/667144/crossfire-disabling-ulps . Will it work, or might it make the card too hot?

I'll test with a game in full screen and see if the CrossFire (enabled) logo appears or not (not sure where or how, but that's what I have seen suggested on the AMD forums).

It is a strange thing for sure.
==========================================

EDIT: I ran another 3DMark benchmark, with nearly the same result (a difference of 100-200 points from the other run).

I have taken screenshots with GPU-Z and it shows only one GPU working while the other doesn't (for some reason its core clock stays at 300MHz and memory clock at 150MHz; it's mere decoration).

(PSU => http://www.enermax.com/home.php?fn=eng/product_a1_1_1&lv0=1&lv1=52&no=191 )

Driver: (screenshot)

GPU-Z: (screenshots)

It seems something may be limiting the second GPU. Maybe the temps? Perhaps something in the BIOS profile? Or could it be the driver?

Any ideas will be more than welcome.


----------



## ramos29

AnandTech made a test of the Titan X and compared it to the 295X2; the 295X2 is better in every test, but I noticed they are using the 15.3 beta driver!!!!! In Far Cry, CrossFire is working: at 4K the 295X2 delivers 58 fps!! The new driver is out today, right?


----------



## joeh4384

Quote:


> Originally Posted by *DraDeC*
> 
> Thank you Feyris, if it's normal then I won't worry.
> 
> Once I get the chance at home I will try this: http://www.overclock.net/t/667144/crossfire-disabling-ulps . Will it work, or might it make the card too hot?
> 
> I'll test with a game in full screen and see if the CrossFire (enabled) logo appears or not (not sure where or how, but that's what I have seen suggested on the AMD forums).
> 
> It is a strange thing for sure.
> 
> ==========================================
> 
> EDIT: I ran another 3DMark benchmark, with nearly the same result (a difference of 100-200 points from the other run).
> 
> I have taken screenshots with GPU-Z and it shows only one GPU working while the other doesn't (for some reason its core clock stays at 300MHz and memory clock at 150MHz; it's mere decoration).
> 
> (PSU => http://www.enermax.com/home.php?fn=eng/product_a1_1_1&lv0=1&lv1=52&no=191 )
> 
> Driver: (screenshot)
> 
> GPU-Z: (screenshots)
> 
> It seems something may be limiting the second GPU. Maybe the temps? Perhaps something in the BIOS profile? Or could it be the driver?
> 
> Any ideas will be more than welcome.


Are you alt tabbing out of a game when the graph dips? The temps seem pretty high if only 1 GPU is active. I would enable on screen monitoring in MSI Afterburner and you should be able to monitor clock speeds, temps and gpu load. MSI afterburner also makes it easy to disable ULPS. I would also download DDU and reinstall drivers.

http://www.wagnardmobile.com/DDU/ddudownload.htm

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


----------



## Ragingun

Quote:


> Originally Posted by *ramos29*
> 
> AnandTech made a test of the Titan X and compared it to the 295X2; the 295X2 is better in every test, but I noticed they are using the 15.3 beta driver!!!!! In Far Cry, CrossFire is working: at 4K the 295X2 delivers 58 fps!! The new driver is out today, right?


Supposedly it will be out today. I've been running FC4 with my 295 in xfire and can testify that FC4 is much better optimized than FC3. I can't even hold 70 fps in FC3, but in FC4 I'm turning in an average of 90 at 1440p.


----------



## ramos29

I disabled CrossFire in FC4!! How did you manage to get such fps in CrossFire? AFR-friendly mode? It never works for me.


----------



## joeh4384

AFR-friendly was working well in FC4 until the last Ubisoft patch. You just can't use SMAA with it.


----------



## xer0h0ur

15.3.1 Beta: http://www.guru3d.com/files-get/amd-catalyst-15-x-download,2.html

Turns out this is an older version of the Beta. There is a newer build available now as well with a March 14th date on it but apparently the "official" Beta AMD is releasing is newer with more fixes.


----------



## Ragingun

Quote:


> Originally Posted by *ramos29*
> 
> I disabled CrossFire in FC4!! How did you manage to get such fps in CrossFire? AFR-friendly mode? It never works for me.


I will check what I figured out tonight and let ya know.


----------



## ramos29

Quote:


> Originally Posted by *Ragingun*
> 
> I will check what I figured out tonight and let ya know.


ok thx bro, a screenshot of your ccc will be more than welcome


----------



## ramos29

Quote:


> Originally Posted by *xer0h0ur*
> 
> 15.3.1 Beta: http://www.guru3d.com/files-get/amd-catalyst-15-x-download,2.html


Does this driver support VSR? Can't test it before tomorrow.


----------



## xer0h0ur

Scratch that, turns out that is an older version of the Beta after all. There is another file version which is newer yet still not as updated as the "official" Beta AMD is supposed to be releasing today.


----------



## ramos29

Quote:


> Originally Posted by *xer0h0ur*
> 
> Scratch that, turns out that is an older version of the Beta after all. There is another file version which is newer yet still not as updated as the "official" Beta AMD is supposed to be releasing today.


We're seeing plenty of delayed games; hope this driver won't be delayed :/


----------



## ramos29

bastards, bastards everywhere: Rockstar (GTA5), CD Projekt (Witcher 3) and now AMD.
The driver was delayed for more f***** polish. I've been waiting all day long for this f***** driver and they pushed it back, as simple as that.


----------



## ReV2ReD

Quote:


> Originally Posted by *ramos29*
> 
> bastards, bastards everywhere: Rockstar (GTA5), CD Projekt (Witcher 3) and now AMD.
> The driver was delayed for more f***** polish. I've been waiting all day long for this f***** driver and they pushed it back, as simple as that.


Yeah, it's pretty bad. I had heard that AMD support was pretty bad, but this is horrible. 3+ months without a driver update is a lifetime in the gaming world. How can they expect to compete with Nvidia with this kind of support? This will be a huge factor in my decision of what GPU(s) to go with next.


----------



## ramos29

Quote:


> Originally Posted by *ReV2ReD*
> 
> Yeah, it's pretty bad. I had heard that AMD support was pretty bad, but this is horrible. 3+ months without a driver update is a lifetime in the gaming world. How can they expect to compete with Nvidia with this kind of support? This will be a huge factor in my decision of what GPU(s) to go with next.


I put playing FC4 and Dying Light on hold waiting for this driver. I seriously miss the days when I had a GTX 780: a driver each month, at least.


----------



## cmoney408

Quote:


> Originally Posted by *ramos29*
> 
> i ve been waiting all daylong for this f***** driver and they pushed it back as simple as that


I am using this driver, Guru3D-AMD-Catalyst-15.3.1Beta-64Bit-Win8.1-Mar14.zip (it seems to be working fine in Windows 7 64-bit):

https://mega.co.nz/#!18QymYoT!DDinhPolyzFo7vJ30LIYqzaCeDjyHydVbrG9xTbbkno

I still can't get the numbers AnandTech is getting. I get around 20 fps at 4K, and 30 fps at 1440p (with big drops).

I have a 4770K,
16GB RAM,
game installed on its own SSD,
Vizio P652ui-B2 4K TV (4K @ 30Hz with my current mini-DP to HDMI adapter).

This is my first AMD card in 10+ years; I have no idea what I am doing. Where do I go to configure the best settings for Far Cry 4 in 4K, anywhere besides the game menu? Like in Catalyst?

If anyone has the time, can you let me know what the Catalyst settings should be, or what your FC4 settings are?

Thank you


----------



## ramos29

Quote:


> Originally Posted by *cmoney408*
> 
> I am using this driver, Guru3D-AMD-Catalyst-15.3.1Beta-64Bit-Win8.1-Mar14.zip (it seems to be working fine in Windows 7 64bit)
> 
> https://mega.co.nz/#!18QymYoT!DDinhPolyzFo7vJ30LIYqzaCeDjyHydVbrG9xTbbkno
> 
> i still cant get the numbers anandtech is getting? i get like 20fps @ 4k, and 30FPS at 1440p (with big drops)
> 
> i have a 4770k
> 16gb ram
> game installed on its own ssd.
> vizio p652ui-b2 4k tv (4k @ 30hz with current mini dp to hdmi adapter)
> 
> this is my first AMD card in 10+ years. i have no idea what i am doing. where do i go to configure the best settings for far cry 4 in 4k? as in anywhere else besides the game menu? like in catalyst.
> 
> if anyone has the time, can you let me know what the catalyst settings should be, or what your FC4 settings are?
> 
> thank you


In FC4, forget about 4K; you have to play at 1080p and deactivate CrossFire. Open CCC, and in the Gaming section create a profile for FC4 by browsing to its .exe. Once you've done that, scroll down and disable CrossFire. I'll post a screenshot later.


----------



## cmoney408

Quote:


> Originally Posted by *ramos29*
> 
> In FC4, forget about 4K; you have to play at 1080p and deactivate CrossFire. Open CCC, and in the Gaming section create a profile for FC4 by browsing to its .exe. Once you've done that, scroll down and disable CrossFire. I'll post a screenshot later.


Thanks for the heads up. Two questions: does AnandTech have something special we don't? And how will FC4 perform with a single GPU compared to my previous 660 SLI setup?


----------



## joeh4384

On FC4, I was getting 50-60 at ultra at 1440p with one of my 295x2 GPUs. I get 80s with AFR friendly but the last patch introduced shadow flickering issues.


----------



## ReV2ReD

Quote:


> Originally Posted by *joeh4384*
> 
> On FC4, I was getting 50-60 at ultra at 1440p with one of my 295x2 GPUs. I get 80s with AFR friendly but the last patch introduced shadow flickering issues.


There is a fix for the flickering after the latest 1.9 FC4 patch while still using AFR. I can't recall it exactly, and I can't really look it up right now (because I'm at work), but it involved editing the .ini file and changing a lighting setting to "UltraHigh." I was skeptical, but the fix actually did work. The shadows are a bit darker, but it's definitely playable, smooth, and flicker-free.


----------



## Sgt Bilko

Anyone feel like taking over this club?

Not sure if Nav still checks this thread or not anymore


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Anyone feel like taking over this club?
> 
> Not sure if Nav still checks this thread or not anymore


Yeah, I PM'd him a while ago and got no response. I also asked if anyone would be interested. We just need to contact a mod to make the switch in the OP (and also add the recommended PSU list).


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Anyone feel like taking over this club?
> 
> Not sure if Nav still checks this thread or not anymore
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, i pm'd him a while ago and got no response. I also asked if anyone would be interested. We just need to contact a mod to make the switch in the op (and also add the psu recommended list).

Yep, that PSU list is some good info indeedy!

I'd take it on but unfortunately i don't have the time anymore


----------



## Orivaa

I just downloaded the new driver. Does anyone remember the name of the GPU driver uninstall tool that has been mentioned a couple of times in this thread?

Edit: Nevermind, I found it.
http://www.guru3d.com/files-details/display-driver-uninstaller-download.html


----------



## Mega Man

DDU = Display Driver Uninstaller


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> I just downloaded the new driver. Does anyone remember the name of the GPU driver uninstall tool that has been mentioned a couple of times in this thread?
> 
> Edit: Nevermind, I found it.
> http://www.guru3d.com/files-details/display-driver-uninstaller-download.html


Don't forget to disable ULPS again after you run DDU and install the new driver.


----------



## Pandora's Box

So who's sticking it out with their 295X2? Who's jumping ship to Nvidia?

I have a pre-order in for 2 GTX Titan X's. They are on backorder until Tuesday. Starting to think I should just stick with the 295X2. I game at 3440x1440.


----------



## wermad

Titan has always been bad value (I had two in the past). You may wanna wait for the 390X WCE or 980 Ti. Eventually there will be a Titan X BE, so Nvidia has the milking down pretty proper. I paid ~$600 for my cards, so I'm still under budget and I don't care for the extra vram. Honestly, I'm more excited about the 390X WCE vs the 980 Ti GM200 ($700, 6-8GB, etc.). Also excited for GTX 990 vs 395X2...

Btw, Baasha did a 12K (triple 4K) thread and the most he pulled was ~5.5GB of vram. The quad Titan BEs were running out of power due to the massive resolution. Typically, horsepower runs out before you max out vram (in most cases, with a few minor exceptions).


----------



## Pandora's Box

I just cancelled my 2 Titan X preorder. Going to hold on to this 295X2 for now. AMD releasing new drivers yesterday pulled me away from the edge, heh. Just need a Dying Light profile now.

Also, the 295X2 is considerably quieter than one Titan X; I imagine two Titan X's would be on par with a 290X in Uber mode.


----------



## wermad

GM200 6GB Ti vs 390X WCE 8GB is gonna be one hell of a fight. That's where I would throw my money, tbh. Two would run you less than SLI Titan X and should handle everything as well as Titan X SLI, imho.


----------



## Ragingun

Quote:


> Originally Posted by *wermad*
> 
> Titan has always been bad value (I had two in the past). You may wanna wait for the 390X WCE or 980 Ti. Eventually there will be a Titan X BE, so Nvidia has the milking down pretty proper. I paid ~$600 for my cards, so I'm still under budget and I don't care for the extra vram. Honestly, I'm more excited about the 390X WCE vs the 980 Ti GM200 ($700, 6-8GB, etc.). Also excited for GTX 990 vs 395X2...
> 
> Btw, Baasha did a 12K (triple 4K) thread and the most he pulled was ~5.5GB of vram. The quad Titan BEs were running out of power due to the massive resolution. Typically, horsepower runs out before you max out vram (in most cases, with a few minor exceptions).


I agree with this, and that Titan is a terrible value; 980s in SLI beat the crap out of it for cheaper, and so does the 295X2.

On a second note, they definitely do run out of horsepower at that high a res, but I notice even at 1440p that my 295X2 runs out of VRAM when all quality settings are cranked up in games like AC:U. If I turn on MSAA it drops like a rock, all the way down to a crawl of 2 fps at MSAAx8. I don't know what else it would be but running out of vram.


----------



## DraDeC

Hello everyone!

I am still new to this card (my first AMD one) and I am not sure how to tweak the AMD Catalyst Control Center options (I tried changing options for a whole hour and ended up lost among so many things, lmao).

So I come here for expert opinions and advice ^_^

When it comes to gaming, what are the recommended tweaks for 4K resolution and for 1440p?

Since I am a newbie I just keep playing with the AMD preset profiles; not much difference so far.

Thanks!


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Gm200 6gb Ti vs 390x Wce 8gb is gonna be one hell of a fight.


Not really.


----------



## xer0h0ur

It's highly doubtful that Nvidia will put an uncut GM200 into a 980 Ti. The benchmarks floating around show a cut GM200 which performs well below the 390X, and below the Titan X of course. They wouldn't be able to keep the Titan X's price point if they slapped the full GM200 into the 980 Ti.


----------



## Alex132

They won't enable double precision and other full features in Maxwell. They want to make Pascal "10x faster*" than Maxwell, after all.

*in compute tasks that require double precision, and with "extremely rough estimates"


----------



## xer0h0ur

There is nothing to enable. Maxwell simply doesn't have the built-in hardware to support more than 1/32-rate FP64, or in other words about 0.2 teraflops of double precision.
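To see where that 0.2-teraflop figure comes from: the Titan X has 3072 shaders at roughly a 1.0 GHz base clock, each doing 2 FMA ops per clock, which gives about 6.1 TFLOPS single precision; at a 1/32 FP64 rate that leaves roughly 0.19 TFLOPS double precision. A quick sketch (the clock is the base-clock assumption; boost clocks raise it slightly):

```python
# Double-precision throughput implied by Maxwell's 1/32 FP64 rate,
# using the Titan X's shader count and approximate base clock.
cores = 3072          # CUDA cores on GM200 (Titan X)
ops_per_clock = 2     # one fused multiply-add = 2 FLOPs
clock_ghz = 1.0       # approximate base clock; boost is a bit higher

sp_tflops = cores * ops_per_clock * clock_ghz / 1000  # ~6.14 TFLOPS single precision
dp_tflops = sp_tflops / 32                            # 1/32 FP64 rate

print(round(sp_tflops, 2))  # ~6.14
print(round(dp_tflops, 2))  # ~0.19
```

So the "0.2 teraflops" quoted above is just the single-precision figure divided by 32.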


----------



## joeh4384

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is nothing to enable. Maxwell simply doesn't have the built in hardware to support more than 1/32 FP64 or in other words 0.2 Teraflops double precision.


I think there is a good opportunity for AMD to release a Fiji-based (or whatever) FirePro and gain some market share there.


----------



## xer0h0ur

Apparently the logic behind that is that the Tesla series is what's meant for double precision. However, Pascal, as far as I know, is going back to the previous strategy of a jack-of-all-trades GPU with double precision disabled for the gaming cards.


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> Not really.


How so? Aren't both speculated to be at the same price point, ~$700 USD? I doubt Nvidia won't take advantage and plug something in between the 980 and Titan X, knowing their track record and how they love to milk products. If you apply the same tactic as with GK110, GM200 (with 6GB, half of Titan X) can be manipulated to fit the gap between GM204 and the full GM200. As for AMD, I'm not sure why they're toying with a $700 card, but it's probably meant to be faster than the $500-550 card. If rumors are true, no reuse of the 290X may mean the 390 will be the $550 card and the 390X (390X WCE) will be the $700 card. Owning the larger share of the market means Nvidia can take bigger risks than AMD.


----------



## xer0h0ur

Actually, AMD said they wouldn't be re-branding previously used GPUs in the 3XX lineup, and the Hawaii XTX was never used to begin with, so they could still end up using it somewhere within the 3XX lineup. I really am getting tired of rolling in speculation. I just want at least a paper launch so I know what to expect.


----------



## wermad

Lol, yup...it's just a guessing game right now.

Now about dp 1.3....


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Not really.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How so? Aren't both speculated to be at the same price point, ~$700 USD? I doubt Nvidia won't take advantage and plug something in between the 980 and Titan X, knowing their track record and how they love to milk products. If you apply the same tactic as with GK110, GM200 (with 6GB, half of Titan X) can be manipulated to fit the gap between GM204 and the full GM200. As for AMD, I'm not sure why they're toying with a $700 card, but it's probably meant to be faster than the $500-550 card. If rumors are true, no reuse of the 290X may mean the 390 will be the $550 card and the 390X (390X WCE) will be the $700 card. Owning the larger share of the market means Nvidia can take bigger risks than AMD.

Not sure what I can or cannot say tbh.

How does a stock Titan X fare against a GTX980? 5-10% better?


----------



## abdellah40

Please, I am new on this website and I really need your help.

My config:
AMD R9 295x2
5 HP LP2475w monitors

5 monitors don't work in Eyefinity, only 3. Four monitors are connected with HDMI via HDMI to mini-DP adapters, and the last one is connected DVI to DVI.

All the monitors work fine on their own, but there's no way to get 5-display Eyefinity, and in Windows settings 3 monitors work but 2 are deactivated. Thanks for your time.

Cheers
Abe


----------






## Alex132

Quote:


> Originally Posted by *abdellah40*
> 
> Please, I am new on this website and I really need your help.
> 
> My config:
> AMD R9 295x2
> 5 HP LP2475w monitors
> 
> 5 monitors don't work in Eyefinity, only 3. Four monitors are connected with HDMI via HDMI to mini-DP adapters, and the last one is connected DVI to DVI.
> 
> All the monitors work fine on their own, but there's no way to get 5-display Eyefinity, and in Windows settings 3 monitors work but 2 are deactivated. Thanks for your time.
> 
> Cheers
> Abe


You require mDP/DP to be used for all the monitors to be activated. I think adapters to non-DP standards won't make them function correctly.


----------



## abdellah40

Thank you.

What is mDP? Is it DisplayPort to mini DisplayPort?
And what about the DVI port on the R9 295x2?

Cheers


----------



## Alex132

Quote:


> Originally Posted by *abdellah40*
> 
> thank you
> 
> what is mDP? it s DisplayPort to mini diplay port!!!!
> and what about dvi in r9 295x2?
> 
> Cheers


I'm guessing you can use the one DVI port plus the other mDP ports.

mDP = Mini DisplayPort
DP = DisplayPort

I'd try using mDP (the outputs on the card) to DP adapters (the inputs on most displays) only.

If you have to use an mDP to HDMI or DVI adapter, it has to be an active adapter for Eyefinity to work beyond 3 displays; active adapters require extra power.

Quick look on Amazon:
Active mDP to DVI: $30
mDP to DP connector: $7.50
So it kinda makes more sense to get the mDP to DP connectors if your monitors support them.

Note: I haven't actually tried out 4+ display Eyefinity; I am just going from what I have read, looked up, and can remember.


----------



## TooManyAlpacas

Just bought one of these cards, can't wait for it to arrive.

Anyone have any advice on things I might need to know?


----------



## Pandora's Box

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> Just bought one of these cards cant wait for it to arrive
> 
> Anyone have any advice on thing I might need to know


Be sure your power supply is up to the task; there is a very nice spreadsheet posted several pages back.

When mounting the radiator, it's best to have it above the card, mounted so the tubing is at the bottom.

When connecting the power cables, do not use splitter cables; use a dedicated 8-pin PCI-Express connector for each port.

Disable ULPS: download MSI Afterburner, go into Settings, and under AMD compatibility properties select "Disable ULPS".

Use Display Driver Uninstaller (DDU) before installing new drivers.

http://www.wagnardmobile.com/DDU/
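For reference, the Afterburner toggle above just flips a registry value: ULPS is commonly controlled by the `EnableUlps` DWORD under the AMD display-driver class keys. A sketch of the tweak as a .reg fragment (the `0000` subkey index is a per-machine assumption; each GPU instance under this class key has its own copy of the value, so search for every `EnableUlps` entry, and back up the registry first):

```reg
Windows Registry Editor Version 5.00

; Hypothetical example: subkey 0000 is assumed to be one of the 295X2's GPUs;
; the index varies per machine. Set EnableUlps to 0 under each GPU instance,
; then reboot for the change to take effect.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Setting the value back to `dword:00000001` re-enables ULPS.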


----------



## TooManyAlpacas

Thanks for the help. I planned to mount the radiator above the card, exhausting air from my case. I have an EVGA 1000W SuperNOVA G2 and I will use the dedicated 8-pin cables that came with the PSU.


----------



## wermad

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> Just bought one of these cards, can't wait for it to arrive.
> 
> Anyone have any advice on things I might need to know?
> 
> Quote:
> 
> 
> 
Originally Posted by *Pandora's Box*
> 
> Be sure your power supply is up to the task; there is a very nice spreadsheet posted several pages back.
> 
> When mounting the radiator, it's best to have it above the card, mounted so the tubing is at the bottom.
> 
> When connecting the power cables, do not use splitter cables; use a dedicated 8-pin PCI-Express connector for each port.
> 
> Disable ULPS: download MSI Afterburner, go into Settings, and under AMD compatibility properties select "Disable ULPS".
> 
> Use Display Driver Uninstaller (DDU) before installing new drivers.
> 
> http://www.wagnardmobile.com/DDU/

It's in my signature as well:

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4890#post_23654185

AMD recommends a minimum of 28 amps on each 8-pin connector, or 50 amps across both. Make sure it's not a rail shared with your main system components (i.e. motherboard, CPU, etc.) on a multi-rail PSU; most manufacturers will provide rail specs for multi-rail units. At minimum a 650-750W unit, though 850-1000W is recommended if you plan to overclock.

Someone posted info on a user having issues with splitters: don't use them at all. Go straight from the PSU (8-pin or 6+2 connectors). Splitters can cause issues.

The card does produce quite a bit of heat, so make sure you have plenty of room for the rad and enough airflow to help it.

Personally, I left ULPS on when I ran my cards at stock. You may have issues with it, though, and disabling it in AB like Pandora said is a simple task.
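As a sanity check on those amperage figures: on the 12V rail, 28 A per connector works out to about 336 W, and the 50 A combined recommendation to about 600 W, comfortably above the card's roughly 500 W rated draw. A quick sketch (the amperages are the ones quoted above):

```python
# Sanity-check a 12V rail against the amperage figures AMD quotes for the 295X2:
# 28 A per 8-pin connector, 50 A combined across both.
RAIL_VOLTAGE = 12.0  # volts on the PCI-E +12V rail

def rail_watts(amps: float) -> float:
    """Power available from a 12V rail at the given amperage."""
    return RAIL_VOLTAGE * amps

def meets_295x2_requirement(rail_amps: float) -> bool:
    """True if a single 12V rail can supply the 50 A recommended for the card."""
    return rail_amps >= 50.0

print(rail_watts(28))               # per-connector headroom: 336.0 W
print(rail_watts(50))               # combined recommendation: 600.0 W
print(meets_295x2_requirement(83))  # e.g. an 83 A single-rail PSU: True
```

On a single-rail unit like the EVGA G2 1000W discussed below, the full 83 A is available to whichever connectors you use, which is why any two of its PCI-E leads will do.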


----------



## Alex132

Why do you disable ULPS?


----------



## Pandora's Box

Quote:


> Originally Posted by *Alex132*
> 
> Why do you disable ULPS?


Because I like monitoring the second GPU temps, plus it allows me to monitor GPU usage in games to see if crossfire is actually working.


----------



## TooManyAlpacas

About the power requirements: which two 8-pin sockets should I use on my PSU? Anyone have an EVGA 1000W G2 SuperNOVA? If so, which two of the available 8-pin sockets do you use?


----------



## xer0h0ur

Your power supply has a single strong 12V rail design dishing out 83A, so you should be able to connect any two of the six PCI-E power connectors, labeled VGA1 through VGA6, to the 295X2.


----------



## TooManyAlpacas

Thanks, that's what I thought, but there are only VGA 1 through 4 on the 1000W. Thanks for the help.


----------



## xer0h0ur

Then are you sure you have the power supply you stated? EVGA's site shows that specific power supply you mentioned "EVGA 1000W G2 Supernova" as having 6 "VGA" power connections.

http://www.evga.com/Products/Product.aspx?pn=120-G2-1000-XR

http://www.evga.com/products/images/gallery/120-G2-1000-XR_XL_5.jpg


----------



## TooManyAlpacas

Oh sorry, I just took a look at my PSU and I do have 6. I hadn't looked at it in a while; I was wrong, my mistake.


----------



## xer0h0ur

Yeah you're all good then.


----------



## Mega Man

One of the many reasons I love my Super Flower (the OEM of that PSU): single rail!


----------



## wermad

Which one do you have? When you talk about your rig(s) they tend not to match your sig rig specs, so I'm a little confused. I think you have quite a few builds, nah?

I don't think we get the 2kW version (167-amp rail) in the US.

----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> one of the many reasons i love my superflower ( the oem of that psu ) single rail !~


Same as my Corsair and my Silverstone.....Single rail is much easier to manage imo


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> Which one do you have? When you talk about your rig(s) they tend not to match your sig rig specs, so I'm a little confused. I think you have quite a few builds, nah?
> 
> I don't think we get the 2kW version (167-amp rail) in the US.


I know, I am bad; I have changed so much I don't keep them up to date.

Some day I will fix it.

I am learning a lot atm (soldering, wiring and sleeving), and in the very near future I'll add hard tubing to that.

Once I get my M8/TH10 done there won't be as much to do, until I finally do a TX10 build.

But yeah, I am updating it now:

http://www.overclock.net/lists/display/view/id/5545251


----------



## wermad

Those things look gorgeous!


----------



## Mega Man

The rig is updated; pricing and whatnot are not.

But yeah, I think they are amazing. I imported them on a trip to China (they didn't have the 1300W, which at the time was the largest).

I wanna get the 2kW version, but after the CaseLabs/InWin debacle I will not support OCUK or whatever that site is.


----------



## ViRuS2k

2 PSUs? Pfft, that's a thing of the past.

I just got myself a 2000W Super Flower Platinum PSU that has a peak output of 2350W, and it's a beast.

Though I'm interested to know what case that is? lol


----------



## wermad

Because of the standard 115v outlet here in the US (and 15amp breaker), we don't get anything 1600+. 240v is possible but requires the right setup.

I ran a couple of V1000s with quad 7970 Lightnings.

Btw, that's a th10 case from CaseLabs.


----------



## ViRuS2k

Quote:


> Originally Posted by *wermad*
> 
> Because of the standard 115v outlet here in the US (and 15amp breaker), we don't get anything 1600w+. 240v is possible but requires the right setup.
> 
> I ran a couple of V1000s with quad 7970 Lightnings.
> 
> Btw, that's a TH10 case from CaseLabs.


Good thing is the PSU runs in the UK @ 240v, and we can pull up to 1350w from our walls.


----------



## wermad

240v + 30amp breaker should be good for 3000w. I had one installed in my garage for an arc welder. My room is set on a 20amp breaker (good up to 2kw). The 15amp was tripping easily.
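The arithmetic behind those breaker limits can be sketched quickly. This is a rough sanity check, assuming the common 80% continuous-load derating used by US electrical code; actual limits depend on local code and wiring:

```python
# Rough continuous-load capacity of a household circuit.
# Assumes an 80% derating for continuous loads (NEC-style rule);
# real limits depend on local code, wiring gauge, and what else
# shares the circuit.

def circuit_watts(volts, breaker_amps, derate=0.8):
    """Usable continuous wattage on a circuit."""
    return volts * breaker_amps * derate

# Standard US 120v/15A outlet: ~1440w continuous, which is why
# 1600w+ PSUs are impractical on one.
print(circuit_watts(120, 15))   # 1440.0

# A 20A breaker buys headroom for ~2kw-class loads.
print(circuit_watts(120, 20))   # 1920.0

# A 240v line with a 30A breaker has far more room.
print(circuit_watts(240, 30))   # 5760.0
```

By this estimate a dedicated 240v/30A line comfortably covers even a dual-PSU rig; the 15A bedroom circuit is the real bottleneck.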


----------



## malik22

Hi guys, I'm running Arma 3 with a 295x2 and noticed the GPU utilization is 25% for one GPU and 75% for the other. Is there any way to have them use more?


----------



## rdr09

Quote:


> Originally Posted by *malik22*
> 
> Hi guys, I'm running Arma 3 with a 295x2 and noticed the GPU utilization is 25% for one GPU and 75% for the other. Is there any way to have them use more?


If your CPU is at stock or even OC'ed . . . check the core utilization. One or two cores may be maxing out. The game is CPU-bound, I read.

wait, 295 for a 1080? say it isn't so.


----------



## malik22

Quote:


> Originally Posted by *rdr09*
> 
> If your CPU is at stock or even OC'ed . . . check the core utilization. One or two cores may be maxing out. The game is CPU-bound, I read.
> 
> wait, 295 for a 1080? say it isn't so.


Hey, thanks. No, I'm playing at 4k, and in MP sometimes one GPU is at 0%.


----------



## Mega Man

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Because of the standard 115v outlet here in the US (and 15amp breaker), we don't get anything 1600+. 240v is possible but requires the right setup.
> 
> I ran a couple of V1000s with quad 7979 lightning.
> 
> Btw, that's a th10 case from CaseLabs.
> 
> 
> 
> Good thing is the psu runs in the uk @240v and we can pull up to 1350w from our walls
Click to expand...

As he stated it is a case labs th10
( new rev is th10a)
http://www.caselabs-store.com/magnum-th10a/

I am running 240v so I can do 2 kw

However, the 2kw Super Flower is only available through OCUK AFAIK.

http://www.overclockers.co.uk/showproduct.php?prodid=CA-031-SF

And I will not support them after they paid Inwin to rip off case labs

http://www.xtremerigs.net/2015/01/19/ces2015-win-heavily-inspired-competition/


----------



## joeh4384

Quote:


> Originally Posted by *malik22*
> 
> Hi guys, I'm running Arma 3 with a 295x2 and noticed the GPU utilization is 25% for one GPU and 75% for the other. Is there any way to have them use more?


I have read that game is a CPU cruncher and also not well optimized.


----------



## wermad

Just finished Metro LL on the Ranger hardcore setting. Wow, it's fun playing without all those little details that help you in an FPS. The GPUs were working hard at 4k, so I had to increase my fan speed to ~7v. It was pretty crappy at first until I remembered to turn off advanced PhysX. It was very smooth sailing after that. i5 still holding strong @ 4.6.


----------



## owlieowl

Is this card a joke?? In Far Cry 4 I don't even get better FPS at 1080p Ultra than an R9 290. Same with WoW & BF4. It drops below 60fps!!! I paid this much money to have a good graphics card, not one that DOESN'T WORK.

I'm so pissed off. Wish I went with a GTX 980 instead. RMAing the card probably won't do anything.

How the hell can a card cost this much and not be able to run games like Far Cry 4 and BF4 at 1080p 60fps with no drops? Seriously, what a joke.

Considering a chargeback on my card. $650-some dollars for lies and bull****. Trash that performs barely better than a single R9 290, and probably only due to the CPU overclock. Can't believe I bought this crap...


----------



## TooManyAlpacas

Um... that does not seem right; this card's performance is so much better. I think something might be wrong, because if you just look up some benchmarks it kills the 980. The 980 barely passes it in SLI.






The benchmarks start at about 1:30 into the video.


----------



## Orivaa

Quote:


> Originally Posted by *owlieowl*
> 
> Is this card a joke?? In Far Cry 4 I don't even get better FPS at 1080p Ultra than an R9 290. Same with WoW & BF4. It drops below 60fps!!! I paid this much money to have a good graphics card, not one that DOESN'T WORK.
> 
> I'm so pissed off. Wish I went with a GTX 980 instead. RMAing the card probably won't do anything.
> 
> How the hell can a card cost this much and not be able to run games like Far Cry 4 and BF4 at 1080p 60fps with no drops? Seriously, what a joke.
> 
> Considering a chargeback on my card. some dollars for lies and bull****. Trash that performs barely better than a single R9 290 - and probably only due to the CPU overclock. Can't believe I bought this crap...


1. WoW uses the CPU for the vast majority of its workload, not the GPU.
2. I just ran the Far Cry 4 intro and car crash with the newest patch and drivers (The new beta contains the CF profile for Far Cry 4, btw, so you need that if you want CF to work.), and I never dipped below 60 FPS.
3. Can't speak for BF4, as I haven't played it, but most people find good performance with the 295x2 in it.

So there are a couple of possibilities:
a) You don't have your drivers and/or game up to date.
b) Your unit is defective.
c) Your card is overheating, which can happen if your room's an oven and/or if you placed the cooler in a bad spot and have bad airflow.
d) Your other components are **** and are dragging your performance down.


----------



## owlieowl

Here's 1 card looking at a cave wall, FC4 1080p Ultra w/ 2xAA: http://i.imgur.com/wqoHWwB.jpg. 2 cards: http://i.imgur.com/jPLNRYG.jpg. Not even 50FPS staring at a wall! Crazy! How do you explain that given the circumstances? I certainly can't.

Yeah, something really does seem wrong. It's frustrating as hell, since I've done all I can software-wise (updated everything, purged & reinstalled drivers, cleaned the registry) short of reinstalling my OS. I even flipped the BIOS switch on the card. Max temps I've seen have been 70c after putting on a 2nd fan.

I've got the latest drivers (you can see in afterburner), the CPU is an i5 4690k @ 4.5GHz, and I have 12GB 1600Mhz DDR3. I should be fine. But I'm not. I'm at a loss. I've seen all the great benchmarks people have put out for this card but I just can't get anywhere close to the same performance. 1440p is out of the question and that's what I bought this card for.

I don't want to have to RMA this card for "underperforming". I don't even see how that makes sense, and figure I'll just be hassled by the RMA department. I mean, how can a card be "defective" if it runs at the right clocks, doesn't overheat, but just doesn't perform right? Did they accidentally put 2 R9 270s in there or something? I just don't understand.


----------



## Orivaa

Quote:


> Originally Posted by *owlieowl*
> 
> Here's 1 card looking at a cave wall, FC4 1080p Ultra w/ 2xAA: http://i.imgur.com/wqoHWwB.jpg.. 2 cards: http://i.imgur.com/jPLNRYG.jpg. Not even 50FPS staring at a wall! Crazy! How do you explain that given the circumstances? I certainly can't.
> 
> Yeah, something really does seem wrong.. It's frustrating as hell since I've done all I can software wise (update everything, purge & reinstall drivers, clean registry) short of re installing my OS. I even flipped the BIOS switch on the card. Max temps I've seen have been 70c after putting on a 2nd fan.
> 
> I've got the latest drivers (you can see in afterburner), the CPU is an i5 4690k @ 4.5GHz, and I have 12GB 1600Mhz DDR3. I should be fine. But I'm not. I'm at a loss. I've seen all the great benchmarks people have put out for this card but I just can't get anywhere close to the same performance. 1440p is out of the question and that's what I bought this card for.
> 
> I don't want to have to RMA this card for "underperforming". I don't even see how that makes sense, and figure I'll just be hassled by the RMA department. I mean, how can a card be "defective" if it runs at the right clocks, doesn't overheat, but just doesn't perform right? Did they accidentally put 2 R9 270s in there or something? I just don't understand.


http://i.imgur.com/cgit64X.jpg

That is me with Ultra preset. I also tried in windowed, and still got 60 FPS. (If you run windowed, Crossfire gets disabled.)

Are your settings standard Ultra preset, or did you tweak them? Also, try and open Catalyst Control Center and see if you have some settings overriding your in-game settings.

Also, are you using Omega drivers or the new beta?


----------



## TooManyAlpacas

Yeah, your CPU should not be bottlenecking it that much, if at all. I would do a clean OS install, and if the performance did not change I would try to return it or exchange it for another one.


----------



## cmoney408

Quote:


> Originally Posted by *owlieowl*
> 
> I've got the latest drivers (you can see in afterburner), the CPU is an i5 4690k @ 4.5GHz, and I have 12GB 1600Mhz DDR3. I should be fine. But I'm not. I'm at a loss. I've seen all the great benchmarks people have put out for this card but I just can't get anywhere close to the same performance. 1440p is out of the question and that's what I bought this card for.


I was having major issues as well: 30fps @ 1080p, and 15fps @ 4k. But I looked up some Catalyst settings that really helped. Create a profile for the game and try some of the settings from this link:

http://steamcommunity.com/app/259660/discussions/0/46476690870832888/

The only difference from the settings in the post is I put Crossfire into AFR-friendly mode, though I still don't think I have the best settings, since I am new to AMD as well.

Also, using the drivers released a couple days ago should help a bit.

I wish people who post all those reviews of the 295x2 with Far Cry would also state their exact Catalyst settings.

By the way, now I can get a solid 30+fps at 4k with settings on Ultra (I can only get 30Hz with my current setup: mini-DP to HDMI 2.0 on a 4k Vizio P652ui-B2 TV). Though I am having the issue where the mountains start to flash; I know it's a setting in Catalyst but I can't remember which one will stop it.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/11

This review shows the 295x2 killing the competition, but again, it doesn't go into detail about the settings, which is frustrating.


----------



## pengs

Quote:


> Originally Posted by *rdr09*
> 
> wait, 295 for a 1080? say it isn't so.


Can't be dipping below 120fps,... ya know?

In my case I just like having the option and all the power + it's quite an event when you see both GPU's hit or come close to 70% usage. It only happens briefly and for a half-second and it's gone, like a comet or a ufo


----------



## rdr09

Quote:


> Originally Posted by *owlieowl*
> 
> Here's 1 card looking at a cave wall, FC4 1080p Ultra w/ 2xAA: http://i.imgur.com/wqoHWwB.jpg.. 2 cards: http://i.imgur.com/jPLNRYG.jpg. Not even 50FPS staring at a wall! Crazy! How do you explain that given the circumstances? I certainly can't.
> 
> Yeah, something really does seem wrong.. It's frustrating as hell since I've done all I can software wise (update everything, purge & reinstall drivers, clean registry) short of re installing my OS. I even flipped the BIOS switch on the card. Max temps I've seen have been 70c after putting on a 2nd fan.
> 
> I've got the latest drivers (you can see in afterburner), the CPU is an i5 4690k @ 4.5GHz, and I have 12GB 1600Mhz DDR3. I should be fine. But I'm not. I'm at a loss. I've seen all the great benchmarks people have put out for this card but I just can't get anywhere close to the same performance. 1440p is out of the question and that's what I bought this card for.
> 
> I don't want to have to RMA this card for "underperforming". I don't even see how that makes sense, and figure I'll just be hassled by the RMA department. I mean, how can a card be "defective" if it runs at the right clocks, doesn't overheat, but just doesn't perform right? Did they accidentally put 2 R9 270s in there or something? I just don't understand.


Use the slider and check CPU usage. When I had a single 290 I could keep HT off with my i7. Now, I don't even think about it.

In BF4, with HT off, my core usage was high with a single 290.

Edit: though with the 295 you can't see the VRM temps. Here is what happened when I turned HT off in BF4 with two 290s . . .


----------



## littledebbie

So I finally gave up on my original 295x2 and RMA'd it because the VRM fan was extremely noisy. On one of the last days I owned it, the fan shut off completely: no lights, no spinning. Just got my new one in and installed, and presto chango, the VRM fan is silent and the Noctuas are louder, lol. This card is pretty incredible; given what it achieves, the Titan X should hang its head over its ridiculous price.


----------



## Alex132

Quote:


> Originally Posted by *owlieowl*
> 
> Here's 1 card looking at a cave wall, FC4 1080p Ultra w/ 2xAA: http://i.imgur.com/wqoHWwB.jpg.. 2 cards: http://i.imgur.com/jPLNRYG.jpg. Not even 50FPS staring at a wall! Crazy! How do you explain that given the circumstances? I certainly can't.
> 
> Yeah, something really does seem wrong.. It's frustrating as hell since I've done all I can software wise (update everything, purge & reinstall drivers, clean registry) short of re installing my OS. I even flipped the BIOS switch on the card. Max temps I've seen have been 70c after putting on a 2nd fan.
> 
> I've got the latest drivers (you can see in afterburner), the CPU is an i5 4690k @ 4.5GHz, and I have 12GB 1600Mhz DDR3. I should be fine. But I'm not. I'm at a loss. I've seen all the great benchmarks people have put out for this card but I just can't get anywhere close to the same performance. 1440p is out of the question and that's what I bought this card for.
> 
> I don't want to have to RMA this card for "underperforming". I don't even see how that makes sense, and figure I'll just be hassled by the RMA department. I mean, how can a card be "defective" if it runs at the right clocks, doesn't overheat, but just doesn't perform right? Did they accidentally put 2 R9 270s in there or something? I just don't understand.


You're assuming that because it cost more money than an R9 290 it should be better, not that because it's 2x R9 290Xs it should be better.

You clearly have something wrong with your system that is hampering performance. Go through the usual things: fresh re-install of OS + drivers + disable ULPS + check CPU+GPU usage + make sure the OC is stable + etc.

Not to mention your screenshots list 3 GPUs, wuh? Disable your onboard graphics.


----------



## blarty

First time poster here...

I bought an r9 295x2 at the weekend along with a beQuiet 1000w supply that the company certifies will work with the card.
I've been running a few benchmarks with the card and I've started to get some weirdness in Unigine Valley

Basically, after a non-specific amount of time, I start to hear buzzing from my monitor, which then often turns into crackle or loud static like noises - this often, but not always, coincides with obvious drops in FPS and stutter on screen. This issue isn't related to one particular monitor or cable type - the sound crackles on both a mini-dp to HDMI passive adapter through to a HDMI monitor and on a mini-DP to DP monitor.

It also happens if I plug in a speaker into my motherboard - again as the fps fluctuates and the movement becomes choppier the static/cutouts/buzz noise becomes more prevalent.

My machine is an i7-2700k (8GB mem). I've tried running Valley and Prime95 at the same time, and although the fps was bad, there was no stutter. I'm going to push Prime95 and Valley a couple more times to make sure, and possibly turn off the AMD Audio Devices.

I previously had a 7950 and used Display Driver Uninstaller to get rid of it, but I'm wondering whether it is best to return the card, because I can return it with no issues within 7 days.

I've also tried enabling/disabling ULPS, but to no avail, and the more monitors I put in Eyefinity the quicker the sound problems kick in.

Has anyone else encountered this kind of issue?


----------



## Alex132

What is your full system specs?


----------



## blarty

Quote:


> Originally Posted by *Alex132*
> 
> What is your full system specs?


It's an older machine, but it checks out. In the main, the machine has not been upgraded since buying, except for storage and the graphics card:

Gigabyte GA-Z68P-DS3 mobo
Intel i7-2700k Sandy Bridge running at stock
2x4GB RAM (running at 668.5MHz according to CPU-Z)
R9 295x2 connected to an AOC 2963pm 2560x1080 and a Samsung 23in 1920x1080 monitor in Eyefinity at 4480x1080
Each PCIE connector on the PSU has 2 8-pin connectors (used 2 separate PCIE connectors, so 2 cables with a redundant 8-pin on each cable)
beQuiet! Power Zone 1000w modular PSU (single 12v rail, but enough amps for the gfx card)
120GB SSD boot
256GB SSD for Skyrim, Fallout etc.
2x1TB HDD for extra storage, and a DVD drive
Coolermaster HAF 912+ case: large front fan (intake), side fan across the gfx card (intake), 120mm (I think) top fan, and the stock 295x2 cooler exhausting at the rear top
Running on Windows 8.1 & the latest AMD drivers/CCC


----------



## ltkhoi90

What the hell! I need an AX1500 for CF 295x2...


----------



## Orivaa

Quote:


> Originally Posted by *ltkhoi90*
> 
> what the hell ! I need ax1500 for CF 295 x 2 ....


Well, that _is_ a four 290x setup, soooo...


----------



## Mega Man

Quote:


> Originally Posted by *ltkhoi90*
> 
> what the hell ! I need ax1500 for CF 295 x 2 ....


.... I don't know who told you this; I run mine from *two* 1kw PSUs.


----------



## xer0h0ur

Isn't the AX1500 the PSU that lies about being single rail? Probably want to be careful if you pick that one.


----------



## ltkhoi90

Which PSU are you using ?


----------



## xer0h0ur

So you're trying to use dual 295X2's? What else is this system running? You need to pick the PSU based on total system draw including any overclocks.


----------



## owlieowl

Installed OS on a different drive, got better performance overall. 1440p FC4 w/ Ultra preset ran smooth, dips below 60 now and then but not that bad, averages in high 70s, low 80s. Vsync at steady 60fps with no stutter. Not quite the benchmarks here, but I think that's my CPU: http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/11

I also had similar performance in the TW: Attila benchmark (53 FPS vs their 60, assuming difference is due to CPU). BF4 stayed above 60FPS @ 1440p in multiplayer pretty much all the time. The GPU usage also seemed more evened out (in BF4 especially). I went and booted into my main OS, lo and behold, crossfire is broken in FC4 (20fps), but I got a good benchmark in TW (54fps). Confusing, since I've done safe mode driver uninstalls and reinstalls. It's selectively broken, I guess.

Wondering if there's anything I can do to fix the issue without reinstalling the OS on my SSD? It's a real hassle to move around Skyrim/Fallout installations!

I think my CPU is holding me back about 5-6+ FPS from what an i7 would get, though. Gets maxed out quite a lot. Wishing I had gone with the 4790k, honestly.


----------



## ltkhoi90

I'm rebuilding my rig. I'm using an i7 4770k, GTX 980, and a 4x4GB 2400MHz RAM kit now. It's still running great with a TX650. I just bought a CF 295x2, and I want to upgrade to an i7-5930K. I'm asking around and they said I should go with a 1500w PSU T_T


----------



## xer0h0ur

Quote:


> Originally Posted by *ltkhoi90*
> 
> I'm rebuild my rig . I'm using i7 4770k , gtx 980 , 4x4 kit ram bus 2400 now. Its still running great with Tx650 . I just buy an CF 295x2 , and I want to update to i7-5930K. I'm asking around and they said I should go with 1500w psu T_T


For a single 295X2? No way. You don't need that big of a power supply for only one 295X2. Now if you planned on using two 295X2's then that is a different ball game.


----------



## wermad

If your case supports dual PSUs and both are recommended for a single 295x2, that's fine. Just get a dual-PSU jumper cable.

If you only have room for one psu, then look for the crossfire units in the list.

Keep in mind extra wattage for other items you'll be running.

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4890#post_23654185

Edit: this is for two 295x2 cards, btw.

For one, any on the list is fine; just take into consideration additional items and possible overclocking for total power needs.
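That sizing advice can be sketched as a back-of-envelope calculation. The component wattages below are illustrative assumptions, not measured figures (a 295x2 is commonly cited at around 500w board power):

```python
# Back-of-envelope PSU sizing: sum estimated component draw,
# then add a headroom margin for overclocking and to keep the
# PSU in its efficiency sweet spot.
# All wattages here are illustrative assumptions, not measurements.

def recommended_psu_watts(component_watts, headroom=0.3):
    """Total estimated system draw plus a headroom margin."""
    total = sum(component_watts.values())
    return total * (1 + headroom)

system = {
    "r9_295x2": 500,        # commonly cited board power for the card
    "cpu_oc": 150,          # overclocked quad-core estimate
    "mobo_ram_drives": 75,  # motherboard, RAM, SSDs/HDDs
    "fans_pumps": 25,       # cooling
}

print(recommended_psu_watts(system))  # 975.0
```

On numbers like these, a quality ~1000w unit covers a single 295x2 with room to spare, while a second card pushes the total into 1500w+ territory, which matches the advice in the thread.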


----------



## ltkhoi90

Thank you buddy !


----------



## Alex132

Yeah even I don't feel that safe pushing my HX850 to be honest.


----------



## joeh4384

I would get the EVGA G2 1600 watt one if I was going to run 2 295x2s.


----------



## Alex132

Does anyone have the link to the Omega 2 beta drivers? My searching skills suck.


----------



## wermad

I'm sketchy on EVGA; I would go with Corsair, Lepa/Enermax, or dual PSUs if possible. Their mobos have left a bad taste with me, and I just can't see myself buying any of their products other than a GPU.


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> I would get the EVGA G2 1600 watt one if I was going to run 2 295x2s.


That or the Superflower variant would be what I would go with as well if I had to upgrade from the current PSU.
Quote:


> Originally Posted by *wermad*
> 
> I'm sketchy on evga, I would go with corsair, lepa/enermax, or dual psu's if possible. Their mobos have left a bad taste with me and any product other then a gpu I just can't see myself buying it.


That PSU we're talking about is made by Super Flower, and it's an outstanding power supply.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> I'm sketchy on evga, I would go with corsair, lepa/enermax, or dual psu's if possible. Their mobos have left a bad taste with me and any product other then a gpu I just can't see myself buying it.


The EVGA G2 is a really good PSU. Their G2, P2, and T2 lines are Super Flower units rebranded as EVGA. Basically, they're extremely good.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> I would get the EVGA G2 1600 watt one if I was going to run 2 295x2s.
> 
> 
> 
> That or the Superflower variant would be what I would go with as well if I had to upgrade from the current PSU.
Click to expand...

Same thing really. I'd go SF for PSU upgrade because they are great quality and cheap locally.


----------



## Bludge

Quote:


> Originally Posted by *owlieowl*
> 
> Is this card a joke?? In Far Cry 4 I don't even get better FPS at 1080p Ultra than an R9 290. Same with WoW & BF4. ..


Well, I don't know what's going on in BF4, but FC4 does not have a Crossfire profile, and Crossfire can't be forced for that game; most reviews pointed that out. Check the latest HardOCP review where they compare the 980 Platinum to the 295X2; it's fairly scathing towards AMD's drivers at the end.

Sent from my LG-D802T using Tapatalk


----------



## Alex132

Quote:


> Originally Posted by *Bludge*
> 
> Quote:
> 
> 
> 
> Originally Posted by *owlieowl*
> 
> Is this card a joke?? In Far Cry 4 I don't even get better FPS at 1080p Ultra than an R9 290. Same with WoW & BF4. ..
> 
> 
> 
> Well, I don't know what's going on in BF4, but FC4 does not have a crossfire profile, and can't be forced to run in crossfire for that game, most reviews pointed that out. Check the latest hardocp review where they compare the 980 platinum to the 295X2, fairly scathing towards and drivers at the end
> 
> Sent from my LG-D802T using Tapatalk
Click to expand...

Just download the Omega 2 beta drivers?


----------



## Pandora's Box

Quote:


> Originally Posted by *Bludge*
> 
> Well, I don't know what's going on in BF4, but FC4 does not have a crossfire profile, and can't be forced to run in crossfire for that game, most reviews pointed that out. Check the latest hardocp review where they compare the 980 platinum to the 295X2, fairly scathing towards and drivers at the end
> 
> Sent from my LG-D802T using Tapatalk


The latest beta driver has Crossfire working in Far Cry 4. You can't use SMAA though, apparently.


----------



## wermad

I know the OEM... I just can't trust EVGA with anything outside their GPUs.

----------



## Ragingun

Quote:


> Originally Posted by *Pandora's Box*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bludge*
> 
> Well, I don't know what's going on in BF4, but FC4 does not have a crossfire profile, and can't be forced to run in crossfire for that game, most reviews pointed that out. Check the latest hardocp review where they compare the 980 platinum to the 295X2, fairly scathing towards and drivers at the end
> 
> Sent from my LG-D802T using Tapatalk
> 
> 
> 
> latest beta driver has crossfire working in FarCry 4. you cant use SMAA though apparently.
Click to expand...

This^^. However, the GPUs only see 50% usage, so the game still runs better in single-GPU mode.

Sent from my iPhone using Tapatalk


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> I'm sketchy on evga, I would go with corsair, lepa/enermax, or dual psu's if possible. Their mobos have left a bad taste with me and any product other then a gpu I just can't see myself buying it.


It is the same PSU I use (Super Flower Leadex), and I have yet to hear anything bad about them, especially long-term.

And it is beyond cheap, and one of the highest-recommended PSUs by shilka atm...


----------



## wermad

It's EVGA... I would just get a Super Flower (or any other brand with the same OEM) tbh. I can't see myself supporting them outside their GPUs. Blame Z77.

/topic


----------



## Mega Man

The problem is they don't sell Super Flower in the US, nor are there other brands using the Leadex platform that I know of.

I imported mine from China.


----------



## SLK

Mine will be here tomorrow and my Rosewill Photon 1200w PSU will be here Thursday. Going to be fun. I will be using all my exhausts for Rads now, hopefully I don't have an airflow issue.


----------



## fishingfanatic

If you're going to buy a 1600w psu, plz get a Corsair. I haven't had 1 fail yet unless I did something to it. I bought the nova 1650 from EVGA and after 4 replacements I got a credit.

I own 1 of their mobos and have owned several EVGA gpus as well as plenty of other makes and their RMA process was exemplary.

As for psu, just bought the 1200i and I like the fact the cables are better overall. Longer cpu cables, no issues with multiple plugs being useless due to their layout, and now they have a monitoring capability.

Mind you the price seems pretty high compared to some other brands but well worth it imho.

Made the mistake of running 2 690s with a TX850M without issue. I was benching with a friend and we both thought the other had changed the psu already....

Distractions


----------



## wermad

Would luv an AX1500i. 10/10 JG review... impressive.


----------



## Mega Man

You can have it; the price is ridiculous for what you get.

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story6&reid=391

The reason it didn't get all 10s is one minor pin issue that I have yet to see on another unit from them.

http://www.newegg.com/Product/Product.aspx?Item=N82E16817139079

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=evga+1600&N=-1&isNodeId=1

From $309-419, all beating the Corsair in value.

Also to note: the G1s (NOVA) are junk and not worth anything; the G2/P2/T2 are the only ones worthwhile.

Besides... I can get the 2kw Leadex for less than that 1500w:

http://www.super-flower.com.tw/products_detail.php?class=2&sn=16&ID=119&lang=

That is imported from the UK.


----------



## electro2u

The Corsair units are 1 to 1 pinouts right? My next psu will be 1 to 1. Seasonic cables are messy.


----------



## xer0h0ur

I would happily import that 2KW unit if I had the wiring in my house for it. Since I don't I would just settle for the 1.6KW G2 SuperNova.


----------



## Mega Man

Yeah, I am running 220v for this stuff! ~80ish amp breaker; I will then feed all my PCs from a sub-panel.


----------



## wermad

I think if you can fish a couple of 12AWG lines you can hook up 240v. Or is it triple-line 8AWG...


----------



## Mega Man

Heh, I deal with equipment that pulls 300+A @ 460v.

80A/220 is cake! But yes, I can easily pull wiring.


----------



## wermad

I have a fishing kit, as I dabbled a bit in the cable-TV industry. I'm pretty familiar with running standard RG6, so a 12AWG line should be a bit more involved but not too tough.

As much as the 2kw is drool-tastic, I'll stick w/ my "lil" Lepa. I don't think I've pulled more than 1300w at the Kill-A-Watt.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> I would happily import that 2KW unit if I had the wiring in my house for it. Since I don't I would just settle for the 1.6KW G2 SuperNova.


< Australian... 240v plugs.

We just have crappy internet and high hardware prices is all... was so happy I didn't pay for my 1200i.


----------



## DividebyZERO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> < Australian.....240v plugs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We just have crappy internet and high hardware prices is all....was so happy i didnt pay for my 1200i. B9a 3aq-


America is one of those weird nanny-state ordeals... 240v might kill someone sticking a paper clip in the wall socket, unlike 120v. So we don't have to worry about dying from 240v here, just drunk drivers and crazy people with guns. Oh, I forgot, we have hot-warning labels on coffee cups too. Okay okay, I'm off topic here.

I am moving into a new home soon and you can bet that one of my first goals is to install 240v hookup for my PC


----------



## Mega Man

You would need Romex (12/2, that is 12ga, 2-conductor [not including ground]) or armored cable; you can't just put (*you're not supposed to*) bare 12ga in the walls.


----------



## DividebyZERO

Quote:


> Originally Posted by *Mega Man*
> 
> you would need romex ( 12/2, that is 12ga 2 conductor [not including ground ] ) or armored cable you cant just put ( * your not supposed to * ) 12ga in the walls


Silly me, I read your post and kept thinking of shooting 12-gauge buckshot or slugs into the walls.

I actually might get lucky and bribe my uncle to do it for me. He's a Master Electrician by trade.


----------



## wermad

Is it safer to run 80amp breaker w/ a 220 vs a 60amp breaker on a 240v line?


----------



## Mega Man

You complain about dying from guns; I say I won't die because of my guns. And we can run 220 if you really want (same as 240, more or less). 120v has just as much possibility of killing you.

120v is just the standard in the US... nothing more.

It isn't the voltage that kills; it is the amperage and the placement of said current on your body.


----------



## fishingfanatic

I've never heard of those PSUs b4. I'm sticking with what has worked for me.

I understand Enermax are pretty good. I had one briefly, acquired in a trade. No problems, but it was a few years old and only a 650 if I remember right.

There are some things I won't cheap out on, and the PSU is one of them.

FF


----------



## xer0h0ur

I finally started ordering all the stuff I needed to revamp and expand my water cooling loop. I seriously underestimated how big the 360mm Monsta is. Thing is a beast. Just waiting on the EK block for the 290X, EK compression fittings and extra Fujipoly Ultra Extreme pads. Still haven't made up my mind on what pump/reservoir I want to go with. I just know I want it to be a PWM controlled pump with a higher than necessary flow rate.

Since frozencpu.com is down and out for the count till god knows when, I ordered from performance-pcs.com


----------



## wermad

Try three 480mm Monsta's


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Try three 480mm Monsta's


Now try picking up your case without heaving


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> I finally started ordering all the stuff I needed to revamp and expand my water cooling loop. I seriously underestimated how big the 360mm Monsta is. Thing is a beast. Just waiting on the EK block for the 290X, EK compression fittings and extra Fujipoly Ultra Extreme pads. Still haven't made up my mind on what pump/reservoir I want to go with. I just know I want it to be a PWM controlled pump with a higher than necessary flow rate.
> 
> Since frozencpu.com is down and out for the count till god knows when, I ordered from performance-pcs.com


After the pathetic Facebook posts Mark made I refuse to ever use fcpu again. ...
But yeah, 13cm (13.6 iirc) thick with push-pull
Quote:


> Originally Posted by *Feyris*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Try three 480mm Monsta's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now try picking up your case without heaving
Click to expand...

Yea... my case weighs more than my wife... and my case is all aluminum... imagine a steel one


----------



## xer0h0ur

I keep fighting with myself on whether or not I just want to ditch this case altogether and go full hog with a large case like the Lian-Li PC-D600 but I have purposely tried to stay with this case just for the lighting.


----------



## Mega Man

Only one option. ... caselabs :b


----------



## xer0h0ur

Caselabs makes a mighty fine variation of the case I am thinking of as well.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Yea... my case weighs more than my wife... and my case is all aluminum... imagine a steel one


That wouldn't be very hard for me; even with an 800D it might be already.

No, my girlfriend isn't 7 years old lmao


----------



## Mega Man

I honestly think if it were steel it would weigh more than me...


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> I honestly think if it were steel it would weigh more than me...


Friends ask if I go to the gym, I reply: "Nah, I just go to LANs"


----------



## xer0h0ur

My case is made of steel and it's heavy as hell.


----------



## wermad

Lol, my x9 is a heavy bastard too. Think it's over 75lbs. Wife helped with lifting this sucka.

Case Labs is the epitome of wc cases imho.


----------



## figgie

Mind if I join this party?

ASUS Ares iii. So not really a 295x2...but close enough.


----------



## xer0h0ur

It's just a custom design, but still a 295X2 nonetheless.


----------



## wermad

It's a 290X x2....confusing, nah?

It's a non-reference (custom) 295x2 imho. Welcomes

Anyone wanna take over the thread as curator? No reply from Nav in a bit, so I think the club needs to be run by an active owner/member to keep the op updated. I just took over (by vote) another club and I wouldn't mind with this one. But anyone else is more than welcome to step in.


----------



## Feyris

I would take it if you do not want it wermad.


----------



## kayan

My 2 cents, I'm cool with either of you taking over the thread. I'd like to hop onto the owner's list.

Anyway, I seem to remember some information some pages back about better setups for certain types of gaming, so sorry for asking again. I currently have an 8320e (with an unopened 8370e) and a CH V-Z in my PC, and I'm contemplating upgrading my display from a 60hz 1440p to a 144hz 1440p, or a 4k display. Do I remember correctly that Intel is better for this, or will it not make much difference since it's just one monitor?

Also, I think I remember somebody saying that certain boards with PLX chips don't play nice with the 295x2? Is this still the case, and if so, what brand mostly uses those chips?

Everything runs alright on my current display and system, except for open world games which struggle on my cpu unless it's mad overclocked, and even then it's still not as good as my old 9370, so that's why I'm contemplating switching to Intel.


----------



## Mega Man

it would be easier to tell you if the mobo you are thinking about has one

when you know, or think you know, what you want, let us know

basically, if the cpu/chipset has "x" pcie lanes and the board offers "more", there is a plx chip
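Mega Man's rule of thumb can be sketched as a tiny check. The lane counts below are hypothetical examples for illustration, not the specs of any particular board:

```python
# Rule of thumb from the thread: a board can't expose more electrical
# PCIe lanes than the CPU/chipset provide unless a PLX-style switch
# is multiplexing them.

def has_plx_switch(cpu_chipset_lanes, board_slot_lanes):
    """True if the slot wiring implies a PLX (or similar) switch."""
    return board_slot_lanes > cpu_chipset_lanes

# Hypothetical: a 16-lane CPU feeding two full x16 slots (32 lanes).
print(has_plx_switch(16, 32))  # True  -> PLX present
print(has_plx_switch(16, 16))  # False -> no switch needed
```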


----------



## cmoney408

there is a lot of info in this thread. is there anywhere that has info specifically on Catalyst settings for the 295x2?


----------



## electro2u

I second either wermad or Feyris. They are both reliable. I don't think I ever got added to the official roster but I'm still in the club, even though my 295x2 has been in the closet for months and months. I'd sell it but I can't find the original screws for the shroud, so I think I'm going to build another rig with it at some point.


----------



## Ragingun

Quote:


> Originally Posted by *cmoney408*
> 
> there is a lot of info in this thread. is there anywhere that has info specifically on Catalyst settings for the 295x2?


That's been discussed already. The conclusion was that none of the reviews ever posted their Catalyst settings, so we've all had to figure out our own, unfortunately. It'd be nice if card reviewers would post these settings to at least give us a baseline for each game tested.


----------



## wermad

I don't mind running it but I will be honest and say my banner making skills suck. So I would need someone else to make it. I'm almost done w/ the X1/X2/X9 club but if Feyris would like to take over, I don't mind at all. All I really would like is to add the psu spreadsheet to the op.

What do you guys think? Also, Mega has been very helpful and I would nom him as well


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> I don't mind running it but I will be honest and say my banner making skills suck. So I would need someone else to make it. I'm almost done w/ the X1/X2/X9 club but if Feyris would like to take over, I don't mind at all. All I really would like is to add the psu spreadsheet to the op.
> 
> What do you guys think? Also, Mega has been very helpful and I would nom him as well


A PSU list, updated member list, more current information, fan configuration findings (an idea), profile settings (another idea) and so forth, along with links to OC BIOSes & a tutorial for those with locked-down cards, may help.

(I don't mind helping where I can even if it isn't me, stuff like graphics too)


----------



## wermad

Quote:


> Originally Posted by *Feyris*
> 
> PSU List, Updated Member list, More current information, Fan configuration findings (an idea), Profile settings (another idea) and so forth, along with links to OC Bios & TuT for those with locked down cards may help.
> 
> (I dont mind helping where I can even if it isnt me, stuff like graphics too)


Go for it


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Go for it


Got bored.

for example~


----------



## wermad

Kewl, you're op then

Hit up a mod to make the switch. Let them know we've already made attempts to contact NavDigitalStorm and he has not replied, and you will be the active member taking over to update the thread. Thanks


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Kewl, you're op then
>
> Hit up a mod to make the switch. Let them know we've already made attempts to contact NavDigitalStorm and he has not replied and you will be the active member taking over to update the thread. Thanks










I've been thrown to the sharks!!

How's this styling anyways?


----------



## wermad

Sweet

Wish i had those skills


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Sweet
>
> Wish i had those skills


It's just finding the right font, and a way of combining images.
(I'm a novice)


----------



## wermad

I had a, ahem, "free" copy of an older 'shop. I lost it somewhere, or it's in the depths of my external backup drive.

Tired.... spent a few hours getting the X9 club op up to date and had a long day at work. Ppcs.com is down and I can't get my order in. I have a few more mods up my sleeve for this case to help tame those hot Hawaiians


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> I had a, ahem, "free" copy of an older shop. I lost it somewhere or its in the depths of my external backup drive.
>
> Tired.... spent a few hours getting the X9 club op up to date and had a long day at work. Ppcs.com is down and i can't get my order in. I have a few more mods up my sleeve for this case to help tame those hot Hawaiians


If you want any help just let me know, I can do a decent amount of stuff in PS - but not amazing things

I've been using it since PS 7 lol... back in the old OS 8 days


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> I don't mind running it but I will be honest and say my banner making skills suck. So I would need someone else to make it. I'm almost done w/ the X1/X2/X9 club but if Feyris would like to take over, I don't mind at all. All I really would like is to add the psu spreadsheet to the op.
> 
> What do you guys think? Also, Mega has been very helpful and I would nom him as well


----------



## wermad

lol

ppcs.com is back on...slow as molasses...meh, gotta get that order in asap!


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> lol
> 
> ppcs.com is back on...slow as molasses...meh, gotta get that order in asap!


Order for?


----------



## wermad

One more upgrade to my build


----------



## Orivaa

So does this mean I finally get to be on the owners' list?


----------



## wermad

As soon as a mod makes the switch, the new op can update the info.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> As soon as a mod makes the switch, the new op can update the info.


How are your blocks working? I missed buying it before it went out of stock.


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> How are your blocks working? I missed buying it before it went out of stock.


They load ~50c with the fans ~ low to medium. Case had to go up on my desk as it was causing the cards to load in the 60s (not good air circulation down there).


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> They load ~50c with the fans ~ low to medium. Case had to go up on my desk as it was causing the cards to load in the 60s (not good air circulation down there).


Picking it up on the desk must have sucked! Your x9 is mad heavy! And that's without anything in it, haha!


----------



## wermad

Its about to get heavier


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Its about to get heavier


Ohgeez!!

Btw, PM PhilWrir your info about how you've tried to contact Nav. Needs to be 200% sure the OP is gone so it's not a thread hijack xP

Otherwise I've PM'd Nav too over it, to get permission the normal way.


----------



## Alex132

Working on new OP.

It looks pretty damn good so far if I do say so myself


----------



## wermad

Ok, got a message out to PhilWrir to get the op switched.

edit: noticed Nav is back online today. I still haven't gotten a reply. I'm sure PhilWrir will reach out to him as well about the change if he cannot continue with the club.


----------



## cennis

Anyone have power draw data for a 295x2 overclocked to 1100mhz+ with an i7 cpu on benchmarks?

my xfx seasonic KM3 850w is shutting down with:

4.7ghz 4770k and 1100mhz oc: shuts down on valley/3dm11 after 10mins
4.7ghz 4770k and 1200mhz oc: instantly shuts off every time when the benchmark loads
4.7ghz 4770k and stock gpu: works
stock 4770k and 1200mhz oc: works

I had the same setup on an AX1200i before and I did not encounter any issues with the same clocks.

I thought the KM3 850w psus would be plenty, since reviews say they can pull up to 1000w without shutting down.

Even if 850w is the limit, I doubt I am pulling that much


----------



## Alex132

Quote:


> Originally Posted by *cennis*
> 
> Anyone have power draw data for a 295x2 overclocked 1100mhz+ with i7 cpu on benchmarks?
> 
> my xfx seasonic KM3 850w is shutting down with
> 
> 4.7ghz 4770k and 1100mhz oc on valley/3dm11 after 10mins
> 4.7ghz 4770k and 1200mhz oc instantly shut off everytime when benchmark loads
> 4.7ghz 4770k and stock, works
> stock 4770k, 1200mhz oc works.
> 
> I had the same setup on a AX1200i before and I did not encounter any issues with the same clocks.
> 
> I thought the KM3 850w psus would be plenty enough since reviews say they pull up to 1000w without shutting down.
> 
> Even if 850w is the limit, I doubt I am pulling that much


A 4770k OC'd like that would be pulling ~230-250w.
The SeaSonic 850w uses a single 12v rail, and the CPU shares it with the GPU - so you're left with about 600w for the GPU (assuming no other 12v devices). That's barely enough for stock boost.

I would recommend a ~1200w unit for overclocking an i7 and a 295X2. You're looking at about 900-1000w of pull.
No reason to keep pushing the limit of your 850w and making it cry; it's much better to upgrade to a 1200w unit if you plan on overclocking.

And the units can pull ~1000w, but only for a very short time, and they get very hot doing so. So it might be getting too hot and shutting down.
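The budget math above can be sketched quickly. All wattage figures here are the ballpark assumptions tossed around in this thread, not measurements:

```python
# Rough 12v budget check for a PSU: total draw vs. rated capacity.
# Negative headroom means the unit is being pushed past its rating.

def psu_headroom(psu_watts, cpu_watts, gpu_watts, other_watts=50):
    """Remaining wattage after CPU, GPU, and misc devices."""
    return psu_watts - (cpu_watts + gpu_watts + other_watts)

# Assumed figures: ~240w for a heavily OC'd i7, ~600w for an OC'd 295X2.
print(psu_headroom(850, 240, 600))   # -40  -> an 850w unit is overdrawn
print(psu_headroom(1200, 240, 600))  # 310  -> a 1200w unit has margin
```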


----------



## cennis

Quote:


> Originally Posted by *Alex132*
> 
> 4770k OC like that would be pulling ~230-250w.
> The SeaSonic 850w uses a single rail and will be sharing that with the GPU - therefore you're left with about 600w for the GPU (assuming no other 12v devices). That's barely enough for stock boost.
> 
> I would recommend a ~1200w unit for overclocking an i7 and 295X2. You're looking at about 900-1000w pull.
> No reason to keep pushing the limit of your 850w and making it cry, it's much better to upgrade to a 1200w unit if you plan on overclocking.
> 
> And the units can pull ~1000w - but for a very short time, and they get very hot doing so. So it might be getting too hot and shutting down.


Thanks for the reply,

http://www.bit-tech.net/hardware/2013/06/01/intel-core-i7-4770k-cpu-review/7

We can see that even under Prime95 smallFFTs (more intensive than 3dmark11 graphics test 1 / valley) the system power draw is only 171w, so I'm not sure how you came to that 230-250w figure.

realistically I think 150w is already a conservative estimate under these types of benchmarks

that leaves close to 700w (or 800w for short periods) for my 295x2. I have not seen any 295x2 pull more than that under 1200mhz clocks.

It instantly shuts down, so it is not an issue of temps.

If 1000w is the real limit, my benchmark should run for some time before shutting down (at 4.7/1200mhz)


----------



## Alex132

Quote:


> Originally Posted by *cennis*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> 4770k OC like that would be pulling ~230-250w.
> The SeaSonic 850w uses a single rail and will be sharing that with the GPU - therefore you're left with about 600w for the GPU (assuming no other 12v devices). That's barely enough for stock boost.
> 
> I would recommend a ~1200w unit for overclocking an i7 and 295X2. You're looking at about 900-1000w pull.
> No reason to keep pushing the limit of your 850w and making it cry, it's much better to upgrade to a 1200w unit if you plan on overclocking.
> 
> And the units can pull ~1000w - but for a very short time, and they get very hot doing so. So it might be getting too hot and shutting down.
> 
> 
> 
> Thanks for the reply,
> 
> http://www.bit-tech.net/hardware/2013/06/01/intel-core-i7-4770k-cpu-review/7
> 
> We can see that even under Prime95 smallffts (more intensive than 3dmark11 graphics test 1/valley) the system power draw is only 171w, not sure how you came to that 230w~250w figure.
> 
> realistically I think 150w is already a conservative estimate under these types of benchmarks
> 
> that leaves close to 700w or 800w(short period) for my 295x2s. I have no seen any 295x2 pull more than under 1200mhz clocks.
> 
> It instantly shuts down, so it is not an issue of temps.
> 
> If 1000w is the real limit, my benchmark should run for some time before shutting down (at 4.7/1200mhz)
Click to expand...

Wow, the 2500k pulls 300w

That explains a lot

How old is your PSU?
Have you tried using the other physical modular connectors on the PSU itself?


----------



## wermad

X79 and x99 pull 200w+. Lga1155/1150 usually do 150-200w when oc'd. I'll run some IBT and Prime to gauge my i5


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> X79 and x99 pull 200+. Lga1155/1150 usually do 150-200w when oc,d. I'll run some ibt and prime to gauge my i5


My 2500k pulls ~250w easily, especially at 1.525v + 5Ghz

That + a dying HX850 explains why my computer keeps shutting down (GTX 690).

Speaking of which, this is my 690's last day with me

Gonna have to run on an 8600 GT for a few weeks


----------



## wermad

My old 2700k did 5.1 w/ 1.45v and 5.0 stable w/ 1.425v. That's a lot of voltage on that SB


----------



## Feyris

Anyone wanting to be in updated op later go to:

http://goo.gl/forms/L8uMY5RsZP

Go sign up now!


----------



## wermad

Did you need gpuz validation link for the form?


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> Did you need gpuz validation link for the form?


I figured either photographic evidence or a GPU-Z validation with your OCN name would do as proof in the proof section


----------



## cmoney408

my gpu validation says "GPU: Vesuvius", is that right? it seems like it's only showing (1) 290?


----------



## Feyris

As long as it says Vesuvius for a stock pcb 295x2, yes~


----------



## drchoi21

Hello guys, I have a Sapphire R9 295X2, and frankly it's having serious problems with my dual CPU system (other systems such as X58 and Z77 are fine). It BSODs at logon in Windows 7 Ultimate with a 0x0000007E error relating to atikmdag.sys. Can anyone actually find what the issue is with this card? Specs are:
Windows 7 Ultimate x64
2x X5550 Xeon Intel
Supermicro X8DTi-F
Sapphire R9 295X2
Corsair GT 240GB SSD
Seagate SSHD 750GB 2.5"
Hynix ECC Reg 192GB
EVGA 1000G1


----------



## Pandora's Box

CPU's overclocked?

I suspect your power supply might be having issues powering 2 Xeons and the 295


----------



## drchoi21

nope, those are 95W each. It works well in Linux and I have stress tested it for over 10 hours


----------



## Alex132

Quote:


> Originally Posted by *drchoi21*
> 
> Hello guys, I have a Sapphire R9 295X2, and frankly it's having serious problems with dual CPU System (other systems such as X58 and Z77 are fine), It BSODs at logon in Windows 7 Ultimate with 0x0000007E error relating to atikmdag.sys. can anyone actually find what's the issue on this card? specs are
> Windows 7 Ultimate x64
> 2x X5550 Xeon Intel
> Supermicro X8DTi-F
> Sapphire R9 295X2
> Corsair GT 240GB SSD
> Seagate SSHD 750GB 2.5"
> Hynix ECC Reg 192GB
> EVGA 1000G1


X58 mobos don't like Hawaii GPUs.

Not sure how to fix it to be honest.


----------



## DividebyZERO

The only x58 boards I tested Hawaii cards with that had issues are ones with the Nvidia NF200 PLX. I had a 290x and used the PT1 BIOS to get around it. It appears to be an issue with power states and PCIe switching. Since he is on a 295x2 and a non-NF200 board, I wonder if it's more of a BIOS issue and GPU compatibility.

I would perhaps try the latest bios for the mainboard. Try using ACPI 3.0 in the bios as well, maybe.

are the CPUs overclocked or stock?


----------



## Alex132

Quote:


> Originally Posted by *DividebyZERO*
> 
> The only x58 with issues i tested Hawaii cards with is ones that had Nvidia NF200 PLX. I had 290x and used PT1 BIOS to get around it. It appears to be an issue with power states and PCIE switching. Since he is on a 295x2 and non NF200 board i wonder if its more of a BIOS issue and GPU compatibility.
> 
> I would perhaps try the latest bios for the mainboard. Try using ACPI 3.0 in bios as well maybe.
> 
> are the CPU's overclocked or stock?


95w each probably means stock, but that would be a good idea - updating BIOS.


----------



## DividebyZERO

Quote:


> Originally Posted by *Alex132*
> 
> 95w each probably means stock, but that would be a good idea - updating BIOS.


Yeah, that's what i figured, esp since it's a server board. However there is a guy who's modded his Supermicro board and overclocked his cpus. That's why i was asking, you just never know here on OCN


----------



## Pandora's Box

I posted this in the HAF XB Case thread but figured you guys would like this:

Moved my PC over to my CoolerMaster HAF XB case. Not sure why I didn't use this case when I got the 295X2. I think I thought it wouldn't fit, nope, it fits


----------



## drchoi21

I've already updated my bios to R2.1, the latest from their site. I tried changing around the jumper settings, but no dice.


----------



## DividebyZERO

Dumb question but you do have onboard gpu off?


----------



## wermad

I don't think lga1366 has igpu.


----------



## DividebyZERO

Pretty sure his board did, but i looked at a generalized mobo pdf from supermicro. so i guess not all have it?

(mainboard has onboard gpu?)


----------



## wermad

You're correct, just looked it up and it shows it has a vga port on the rear i/o.

I know for sure most of the consumer lga1366 (and lga2011/v3) boards don't have it.


----------



## drchoi21

I disabled the onboard matrox GPU just in case but it didn't really help.


----------



## wermad

Removed the igpu's driver btw?

I know it's a pita, have you done a clean install of the os?


----------



## drchoi21

yep, I did 5 installs of Win 7 and Win 8


----------



## DividebyZERO

You need some way to stop ULPS/pcie lane switching. I was able to use the Asus PT1 bios on my 290x; since you have a 295x2, I don't know what you can do. I was getting the same bsods when I switched back to the stock bios on my 290x. The good thing, at least from what I have observed on the 2 boards with this issue (x58 evga 4-way, SR-2), is that it only affects the primary gpu. So I have the PT1 bios on my primary gpu and stock on the other 3. With a 295x2 I have no clue what to suggest


----------



## wermad

Try an older driver if you haven't done so.


----------



## drchoi21

I've already tried 14.4, 14.7, 14.9, 14.11, 14.12


----------



## wermad

Try the other bios on the 295x2?


----------



## xer0h0ur

Quote:


> Originally Posted by *drchoi21*
> 
> Hello guys, I have a Sapphire R9 295X2, and frankly it's having serious problems with dual CPU System (other systems such as X58 and Z77 are fine), It BSODs at logon in Windows 7 Ultimate with 0x0000007E error relating to atikmdag.sys. can anyone actually find what's the issue on this card? specs are
> Windows 7 Ultimate x64
> 2x X5550 Xeon Intel
> Supermicro X8DTi-F
> Sapphire R9 295X2
> Corsair GT 240GB SSD
> Seagate SSHD 750GB 2.5"
> Hynix ECC Reg 192GB
> EVGA 1000G1


Dude, my cousin and I are doing battle with this exact same issue. The only difference is he has a brand spanking new system fresh out of the box, and it's an MSI R9 290X Gaming; any driver at all gives either a perpetual black screen or an eventual BSOD relating to atikmdag.sys.

Full system specs:

MSI X99S Gaming 7 motherboard
MSI R9 290X Gaming
Corsair Vengence 16GB DDR4
Core i7 5820K
Corsair RM1000
Samsung Evo 840 256GB
Some random 2TB storage drive
Windows 7 Pro 64bit

The system booted up fine only once, then the driver started crashing. This was before ever connecting it to the internet or updating anything at all. We DDUed the 14.9 driver it shipped with and installed the Omega. Still kept crashing. DDUed again. We figured perhaps Windows updates needed to be applied, so we connected it to the internet and did all available updates. Re-installed Omega: crash. DDUed, re-installed 14.9: crash.

We still have to update the motherboard's BIOS and have to try disabling "driver signature enforcement" which was suggested by someone else. Other than that the only other thing I can imagine is that the video card is faulty so after we try the aforementioned two things I will pull my 290X from my system and try it in his system to see if the video card is fubared.


----------



## drchoi21

Quote:


> Originally Posted by *wermad*
> 
> Try the other bios on the 295x2?


Already tried that, doesn't work


----------



## wermad

Very unique issue...

Would recommend opening a ticket with amd and sapphire, at least to get their take.

Do you have anyone that can lend you another current card?


----------



## xer0h0ur

If you google hard enough you can find people who have experienced this with the R9 2XX series since late 2013. For some disabling the "driver signature enforcement" did the trick while for at least one person it required a video card swap.

I didn't find any instance in which a windows re-install helped at all so don't waste your time there.


----------



## drchoi21

Hello overclock.net. For some time I had an R9 295X2 and a Titan Z in my possession to run with my LGA 1366 dual Xeon platform. However, Windows (all versions) simply blue screens, and it has to do with how 1st gen Core i7 / Nehalem / Westmere handles PCIe lanes. Nehalem/Westmere was the last Intel platform with a true northbridge: the PCIe lanes sit in the chipset, not in the CPU, with QPI links connecting it to the CPUs to reduce latency. The problem kicks in with how Windows handles PCIe lanes and the protocol behind them on dual-GPU cards. The R9 295X2 and Titan Z both use a PLX8487 bridge, which takes the x16 link from the slot and switches it between the two GPUs. This configuration conflicts with PCIe lanes located in the northbridge rather than on the CPU. Because of that, when Windows loads the GPU driver it becomes unstable and crashes about 10 seconds after the driver has loaded, since it cannot handle the conflicting PCIe protocols. Unix distributions such as FreeBSD or Linux Mint have no problems with the drivers because their way of handling the PCIe protocol is significantly different from Windows. TL;DR: do not use an R9 295X2 or Titan Z with a Nehalem/Westmere Xeon 5520/5500 chipset in Windows; it will crash due to conflicting PCIe protocols.


----------



## xer0h0ur

Still doesn't explain at all why this happens with single GPU cards like the 290X/290/280X/280


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> Still doesn't explain at all why this happens with single GPU cards like the 290X/290/280X/280


I agree. First thing I would do is, when loading the video drivers, do a custom install and include nothing but the display driver, CCC, the install manager, and ACP. Forget the drag-and-drop crap, or even the HDMI audio, which in my case I don't use. But his case is slightly different; I think it's more a bios compatibility issue, and maybe it is hardware somewhere, but for me, killing the power states solved it. When I would be at the desktop with the gpu downclocked, which also reduced the pcie speed, I got a BSOD. If I had some patience I could try to find more information on my setup, as we both have the same Hawaii gpu and Intel chipset.

If it was simply a pcie protocol issue, then shouldn't it fail even without a driver loaded?


----------



## xer0h0ur

I was thinking the same thing myself. If it was a PCI-E protocol issue it shouldn't work no matter what. In my particular instance it's only when the drivers are installed that the system fails to get past the four-piece spinning Windows logo. Other than that, as soon as I DDU the installation I can boot perfectly fine.


----------



## drchoi21

But R9 290X reference works on my computer perfectly, while R9 295X2 simply refuses to.


----------



## DividebyZERO

Quote:


> Originally Posted by *drchoi21*
> 
> But R9 290X reference works on my computer perfectly, while R9 295X2 simply refuses to.


can you gpuz screenshot your 290x? Also do you have the available pcie slots to try 290x primary and 295x2 secondary and run trifire as a test? Funny though you never mentioned a 290x or I simply forgot.

Edit: I should specify I was curious about your gpuz shot of the 290x to see the stats like identifier, version etc. Not because I didn't think you had one.(if it sounded that way)


----------



## drchoi21

it's kinda old, when I first got the R9 290X back in 2013

I also can't do Trifire since there is only 1 PCIe x16 slot


----------



## xer0h0ur

Your motherboard only has one full size PCI-E slot?


----------



## drchoi21

X8DTi-F has only one, others are all Physical PCIe x8 & x4s


----------



## xer0h0ur

Well crap.


----------



## drchoi21

Oh well time to move to C612/Haswell-EP


----------



## Its L0G4N

I have an XFX 295x2 but haven't joined this group. I was wondering if anyone has heard about the driver support for PLP (portrait-landscape-portrait) gaming on the r9 285? Do you think it'll be coming to the 295x2 anytime soon?


----------



## Pandora's Box

Quote:


> Originally Posted by *Its L0G4N*
> 
> I have an XFX 295x2, but haven't joined this group, but I was wondering if anyone has heard about the driver support for PLP (portrait-landscape-portrait) gaming on the r9 285? Do you think it'll be coming the the 295x2 anytime soon?


Doubtful, the 285 is using a newer gpu than the 295x2 uses.


----------



## Its L0G4N

Quote:


> Originally Posted by *Pandora's Box*
> 
> Doubtful, the 285 is using a newer gpu than the 295x2 uses.


You don't think it could just be a driver update?


----------



## Alex132

Quote:


> Originally Posted by *Pandora's Box*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Its L0G4N*
> 
> I have an XFX 295x2, but haven't joined this group, but I was wondering if anyone has heard about the driver support for PLP (portrait-landscape-portrait) gaming on the r9 285? Do you think it'll be coming the the 295x2 anytime soon?
> 
> 
> 
> Doubtful, the 285 is using a newer gpu than the 295x2 uses.
Click to expand...

Same arch. Doesn't matter if it was released later, it's essentially a cut-down Hawaii GPU. GCN1.1.
Quote:


> Originally Posted by *Its L0G4N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pandora's Box*
> 
> Doubtful, the 285 is using a newer gpu than the 295x2 uses.
> 
> 
> 
> You don't think it could just be a driver update?
Click to expand...

Probably, I tend to stick far away from multi-monitor gaming due to bugs'n'bezels


----------



## xer0h0ur

No it's not. The 285 is GCN 1.2 with color compression


----------



## DividebyZERO

Quote:


> Originally Posted by *drchoi21*
> 
> it's kinda old, when I first got the R9 290X back in 2013
> 
> I also can't do Trifire since there is only 1 PCIe x16 slot


Yeah, so your 290x bios is not modified? I checked it against some of my ref cards; the bios looks older than mine, i think. Wish i had a 295x2 to test, but i'm not sure why my 290x ref stock bios bsods constantly until it's locked @ x16 PCIe 2.0.

Perhaps there is more than one issue affecting these newer gpus and the older boards/bioses


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> No its not. 285 is GCN 1.2 with color compression


Oh? Well I assume it wouldn't be that different compared to 290X in terms of features then.

Also color compression? As in something like Maxwell's compression? Or dithering...?


----------



## Pandora's Box

Quote:


> Originally Posted by *Alex132*
> 
> Oh? Well I assume it wouldn't be that different compared to 290X in terms of features then.
> 
> Also color compression? As in something like Maxwell's compression? Or dithering...?


http://www.anandtech.com/show/8460/amd-radeon-r9-285-review/3


----------



## xer0h0ur

Yeah, it's why they went with a 256-bit bus on the 285; the color compression meant it needed less bandwidth.


----------



## xer0h0ur

There is also a rumor floating around that a full uncut version of Tonga will be used in the 3XX lineup under a different name, Antigua?


----------



## Alex132

Oh cool, thanks!

So it is like Maxwell's compression


----------



## xer0h0ur

Well, sure as poop, it turned out his MSI 290X Gaming was defective. My 270 and 290X both worked in his system. If anything I am pissed at myself that I didn't just try a card swap from the get-go instead of wasting hours trying to "fix" a crashing driver.


----------



## wermad

Quote:


> Originally Posted by *Pandora's Box*
> 
> I posted this in the HAF XB Case thread but figured you guys would like this:
> 
> Moved my PC over to my CoolerMaster HAF XB case. Not sure why I didn't use this case when I got the 295X2. I think I thought it wouldn't fit, nope, it fits
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice setup. I luv how simple things can be before I tear into them and make em more complicated









Curious: your AIOs are set to intake from the front and rear. How are your temps? I was gonna buy a cheap XB I found locally but the mATX Aeneas was still holding my attention. I had an XB with dual 690s on blocks. Had to install the heavy Monsta 480 with brackets on the rear. Looked funny though.


----------



## Feyris

The Raijintek case is pretty nice, and well priced too, so I'd try it.


----------



## wermad

The one thing holding me back was the front rad support. Once you drop in the 295X2, there's only space for fans. You can tuck two fans into the front fascia, but it kinda goes against the 4x120 layout it has. I could have dropped a 240 in the bottom and a thin one on top, while the rear fan openings would get 120s. After adding up the cost (both new and used), it ended up being about the same, if not more than, what I paid for my three Monstas. Luckily, the X9 showed up and it gave me the horizontal mobo layout I wanted while still keeping the 480s.


----------



## Pandora's Box

Quote:


> Originally Posted by *wermad*
> 
> Nice setup. I luv how simple things can be before I tear into them and make em more complicated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Curious, your aio's are set to intake from the front and rear. How are your temps? I was gonna buy a cheap xb I found locally but the matx Aeneas was still holding my attention. I had an xb with dual 690s on blocks. Had to install the heavy Monsta 480 with brackets on the rear. Looked funny though
> 
> 
> 
> 
> 
> 
> 
> .


Actually I have the radiator fans set to blow the hot air out of the case. I just have one 140mm fan blowing cold air in, directly in front of the 295X2. The max temp I see on the 295X2 is 72C in Heaven bench @ 3440x1440.

Most of the time I swap the window cover on the case for the regular cover that has a 200mm fan on it, with the fan blowing air into the case. This drops temps about 8C across the board; the 72C temp I get in Heaven is with the window panel on.


----------



## SLK

My idle heat buildup is insane now since I no longer have a 140mm exhaust fan (the AIO replaced it). I am switching to the Phanteks Enthoo Primo case for better cooling options. If I buy any 3-pin 120mm fan, can I attach it to the 295X2's 3-pin connection, or do I have to match the power and RPM requirements of the stock fan?


----------



## Pandora's Box

I believe the max amperage the card's fan header can supply is 0.2 amps.
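For a rough sense of what that limit means for a fan swap: assuming a standard 12V fan header (both the 12V rail and the 0.2A figure are assumptions from this thread, not an official AMD spec), the header can supply about 2.4W, so any fan rated at or below 0.2A should be safe. A quick sketch:

```python
# Back-of-the-envelope check for whether a replacement fan stays within
# the 295X2's onboard fan-header budget. The 12 V rail and the 0.2 A
# limit are assumptions from this thread, not an official AMD spec.

HEADER_VOLTAGE = 12.0    # volts, typical 3/4-pin fan header
HEADER_MAX_AMPS = 0.2    # amps, per the figure quoted above


def header_max_watts() -> float:
    """Maximum power the header can supply (P = V * I)."""
    return HEADER_VOLTAGE * HEADER_MAX_AMPS


def fan_fits(rated_amps: float) -> bool:
    """True if a fan's rated current is within the header limit."""
    return rated_amps <= HEADER_MAX_AMPS


# A typical quiet 120 mm fan (~0.1 A) fits; a beefy high-static-pressure
# fan rated 0.3 A would overdraw the header.
print(round(header_max_watts(), 1))   # 2.4 (watts)
print(fan_fits(0.1), fan_fits(0.3))   # True False
```

So it's worth checking the replacement fan's rated current on its label, not just its RPM.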


----------



## xer0h0ur

I hate AMD for not creating a definitive user guide for how to hook up 3, 5 or 6 monitors, but after trial and error we got the 6-monitor setup working properly, and after having his 290X swapped out for a new one, everything is working cherry. Three monitors on the 290X (DisplayPort via active DisplayPort-to-HDMI adapter, HDMI, DVI) and three monitors on the 270 (same arrangement) as a single large desktop. When AMD says all 6 monitors must be connected to a single video card for "SLS" mode (Single Large Screen), it's a half-truth: they mean it needs to be connected like that for gaming purposes, not for general use. If you were to do that on a 290X, it would have to be 1080p monitors with a 4-port Club3D MST hub, then HDMI and DVI for the other two monitors. That is, unless you flat out have a video card with six miniDP outputs; then you can of course go 1440p, 4K, etc.

This was me clowning around throwing up Netflix (not in full screen) in IE to see what it would look like across 6 monitors. My cousin wanted a Forex trading setup and this fits the bill perfectly.



If I am correct, it's not possible to hook up six 4K monitors to the 295X2 alone, right? Can't go past four of them, right?


----------



## electro2u

If anyone has a dead card or just doesn't need their shroud screws: I can't find mine and I'm wanting to sell my 295X2. Little tiny GPU PCB screws with big fat star screw heads...

Where can I buy them, or will you sell me yours lol


----------



## wermad

I think the Eyefinity max is 6x WQHD (1440/1600) in 3x2 (someone has already pulled this off) or 3x 4K (haven't heard of 5K, but it might be possible); it's typically driver-limited. AMD previously set 6x 1920x1200 as the max. DisplayPort and DP hubs are a must.

I'm sure for 5x1 or 3x2 4K, AMD is gonna wait for DP 1.3.
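The DP 1.3 point can be sanity-checked with rough link-budget math. The per-lane rates below are the published HBR2/HBR3 figures; the ~7% blanking overhead is an estimate for CVT reduced-blanking timings, so treat the stream numbers as approximate:

```python
# Rough DisplayPort link-budget math for 4K60 panels. Per-lane rates
# are the published HBR2 (DP 1.2) and HBR3 (DP 1.3) figures; the 7%
# blanking overhead approximates CVT reduced-blanking timings.

def link_gbps(lane_rate_gbps: float, lanes: int = 4) -> float:
    """Effective payload bandwidth after 8b/10b encoding (80%)."""
    return lane_rate_gbps * lanes * 0.8


def stream_gbps(width: int, height: int, hz: float,
                bpp: int = 24, blanking: float = 1.07) -> float:
    """Approximate bandwidth of one uncompressed video stream."""
    return width * height * hz * bpp * blanking / 1e9


dp12 = link_gbps(5.4)                  # HBR2: ~17.3 Gbps usable
dp13 = link_gbps(8.1)                  # HBR3: ~25.9 Gbps usable
uhd60 = stream_gbps(3840, 2160, 60)    # one 4K60 stream: ~12.8 Gbps

print(int(dp12 // uhd60))  # 1 -- one 4K60 panel per DP 1.2 connector
print(int(dp13 // uhd60))  # 2 -- DP 1.3 could MST two 4K60 panels
```

Which lines up with the thread: on DP 1.2 each 4K60 monitor needs its own connector, so driving more 4K panels per port (e.g. over MST hubs) has to wait for DP 1.3-class bandwidth.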


----------



## xer0h0ur

Perhaps we can work something out, electro2u. Do you by chance still have either of your waterblock bridges? I don't even know if I am calling it the right thing: the freakin' thingamabobber that connects the 290X's block with the 295X2's. I am about to waterblock my 290X and need that to connect both blocks together. I assume both blocks won't line up straight with each other, but I have yet to check.

Either way, I have everything for the Asetek cooler lying around and I don't plan on going back to the stock cooler, ever.


----------



## wermad

Hmm...mine are all ph







. I'll check em when I get home


----------



## electro2u

Sweet, I prefer barter anyway. I tossed the acrylic bridge I made, but I would want to make you a new one anyway. I just need to know the slot spacing and the center-to-center measurement of the two ports you want to use.








I have a ton of spare tubing sitting around


----------



## xer0h0ur

Cool, I haven't installed the block on the 290X just yet, but as soon as I have that done I can measure the spacing from block connection to block connection. Normally I would have done it today, since it's supposed to be my day off, but I ended up having to work six days this week, so Friday I think I will get a chance to slap on the block. Please excuse my ignorance, but what do you mean by the center-to-center measurement? I just want to make sure I measure correctly.


----------



## Bogusburger

Hi guys. I just bought an R9 295X2 and I don't seem to be getting the performance of other benchmarks. I'm wondering if it's a bottleneck in another part of my system, as my motherboard and CPU aren't exactly amazing, although the card does go to 100% load. If anyone could help me out that would be great. My specs are:
R9 295X2
i5 3570
Gigabyte Z77-D3H
16GB Corsair Vengeance 1600MHz

Unigine Valley, Ultra, 1080p
FPS: 88.1
Score: 3686
Min FPS: 22.9
Max FPS: 136.6

Unigine Heaven, Ultra, 1080p
FPS: 52.3
Score: 1318
Min FPS: 8.3
Max FPS: 109.2

Arma 3
High infantry showcase - 62 average
ultra infantry showcase - 39 average


----------



## electro2u

@bogusbuger is crossfire enabled?


----------



## Bogusburger

Yeah it is. I get the crossfire symbol in the top right and both cores are at high load.


----------



## wermad

One 295X2 doesn't show crossfire in CCC; two will, though. Just verify both cores are loading in AB.


----------



## Bogusburger

Both cores do show up.

That is running Metro Last Light. On Arma it does the same thing, but the load only goes to about 40%.


----------



## wermad

That's GPU-Z. CCC = Catalyst Control Center.


----------



## Bogusburger

Can you get graphs in Catalyst Control Centre? I was using GPU-Z because Afterburner seems to cause weird system crashes.


----------



## xer0h0ur

Quote:


> Originally Posted by *Bogusburger*
> 
> Can you get graphs in catalyst control centre? And i was using gpuz because afterburner seems to cause weird system crashes.


So far in my experience the crashing is caused by RivaTuner aka Afterburner's OSD. Have you tried disabling the OSD when it crashes on you?


----------



## mah111

Hi,
Just saying "hello"!
Another happy owner here.
Got my XFX 295X2 recently and tried many different ways to mount it in my case (Thermaltake Core V51).
The final solution is a push/pull config with a Noctua NF-F12 fan as an intake (600-1000 RPM) from the front (as seen in the picture).
This solution lets me keep my rig quiet enough (the stock fan does not go over 34% of its speed) and the temps are fine (67 Celsius max on the cores).
Got another fan to blow on the card from the bottom of the case to calm down the VRM fan a bit (and it kind of worked).

I do not recommend Thermaltake PSUs for a 295X2 setup, though. My 1200W Thermaltake Grand Power PSU just whines like a b....
I changed it for a 1000W Super Flower Gold (as suggested somewhere here on the forum) and now it's silent. It was not the card that whined...

My setup:
Thermaltake V51 case
Gigabyte Gaming 5 motherboard
Intel i5 4790 CPU
XFX
128GB SSD for system
2x750GB RAID0 7200RPM HDD

Now waiting for AMD's new VR API (and DirectX 12) to let the Oculus Rift fly on this setup.

So one more time:
Hi there


----------



## xer0h0ur

For what it's worth, AMD is pushing the 3XX series for VR, and they have been saying it's all about dedicating a GPU per eye to keep the framerate high enough that it doesn't induce nausea.


----------



## Feyris

VR-capable compute has been in GCN since Tahiti. Hawaii will do fine~ guessing 3XX will just have way more again.


----------



## mah111

From what I understand of the AMD LiquidVR API leaks, a 2-GPU setup will be perfect for future VR applications because, like you said, each GPU will be dedicated to one eye.
I believe even after the 3XX series release, the dual-GPU 295X2 will do better than a single-core 3XX card.


----------



## Alex132

Quote:


> Originally Posted by *mah111*
> 
> What I understand from new AMD LiquidVR API leaks, 2-gpu setup will be perfect for future VR applications, because, like you said, eah GPU will be dedicated to one eye.
> I beliveve even after3XX series release 2-gpu 295x2 will do better than one core 3XX card.


Probably, better than 390X yeah. It's all up to drivers.


----------



## Orivaa

Two 395X2s will be even better, though.


----------



## mah111

Quote:


> Originally Posted by *Orivaa*
> 
> 2 395x2 will be even better, though.


Evidently it will, no doubt.

Costing at least 3 times as much, though.


----------



## xer0h0ur

I am already afraid that the 390X will be quite expensive. In the $750-800 range. The 395X2 won't be cheap. Particularly if sporting 8GB or 16GB of HBM.


----------



## wermad

Probably $2k for the 395x2.


----------



## Pandora's Box

Had an upgrade itch, scratched it and bought a Sapphire R9 290X Tri-X to get some TriFire action going.

http://www.amazon.com/gp/product/B00HJOKARI/ref=od_aui_detailpages00?ie=UTF8&psc=1


----------



## xer0h0ur

Well you literally opened pandora's box now. I have run into far more issues with tri-fire than I ever did with just the 295X2 alone. Welcome to the tri-fired club but proceed with caution.


----------



## mah111

I haven't had a tri-fire setup myself, but I hear it's quite issue-prone.

But man, just wait for DirectX 12; I believe it will be the game changer for multi-GPU setups.
Good luck for now, you might need it...

By the way...
Did anyone succeed in running the 3DMark API Overhead test with DirectX 12 on a 295X2?


----------



## Orivaa

Quote:


> Originally Posted by *mah111*
> 
> I did not have 3-fire setup myself but I heard it's very issue-friendly.
> 
> But man, just wait for Directx12, I believe it will be the game changer for multi-gpu setups.
> Good luck for now, You could need it...


It will only be a game changer if developers actually use the memory pool feature.


----------



## Pandora's Box

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well you literally opened pandora's box now. I have run into far more issues with tri-fire than I ever did with just the 295X2 alone. Welcome to the tri-fired club but proceed with caution.


I spent a while searching forums for any issues with tri-fire setups, without much luck. What sort of issues did you have?


----------



## xer0h0ur

Low core usage and crashing in tri-fire when loading a game versus not crashing in crossfire are the main issues I come across.


----------



## wermad

I had three 290 Tri-X cards in tri-fire and it was pretty solid. I did briefly crossfire one of them with my first 295X2; it ran the benchies without issues. The Tri-X cooler is very good, albeit a tad long (~300mm).


----------



## xer0h0ur

Benchmarks won't tell you how games will act. They only show you the theoretical best case, where everything is scaling perfectly, clock speeds aren't bouncing all over the place, and the workload is optimized to its peak. In other words, it's not even remotely representative of actual gaming.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Benchmarks won't tell you how games will act. They only tell you your theoretical best case scenario when everything is scaling perfectly, clock speeds aren't going up and down and all over the place and its optimized to its peak. In other words its not even remotely realistic to actual gaming.


I ran tri-fire for a short time and managed to get some game time in with DA:I, BF4, Crysis 3, GRID 2 and DiRT Showdown.

They all worked pretty well for me, but I do agree benchmarking is an unrealistic scenario in this case.


----------



## xer0h0ur

I think I can count on one hand the number of games in my Steam library that will actually push the GPUs to full or near-full usage. Practically every AAA title nowadays has horrid scaling and horrid GPU usage. It's quite pathetic, but I blame developers more than I do the hardware or even AMD's drivers.


----------



## wermad

I believe Metro LL has full utilization in 4K. I haven't checked, but as long as you turn off advanced physics, it's very smooth. I'll have to pull up AB monitoring next time. Right now the rig is mid-teardown (adding one more rad). Other than this, I only played the opening pieces of TR. The Titanfall tutorial ran smooth as well.

Btw, at 1200p, the triple 290s ran many games without any issues. Albeit I was sticking with older games, as the G3258 would have struggled with something more recent.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> I think I can count on one hand the amount of games I have in my Steam library that will actually utilize the GPUs to full usage or near full usage. Practically every AAA title nowadays has horrid scaling and horrid GPU usage. Its quite pathetic but I blame developers more so than I do the hardware or even AMD's drivers.


Well, it appears the 290X, and probably the 295X2, do really well against the GTX 980 in high-end gaming. Scaling doesn't seem to be an issue there, at least on two cards.
Quote:


> Source for TPU's bench
> 
> CPU: Intel Core i7 5820K processor w/Corsair H110 cooler
> GIGABYTE X99 Gaming G1 Wi-Fi
> 16GB Corsair Vengeance 2666MHz DDR4
> Windows 7 Ultimate x64
> 
> 
> 
> My setup
> EVGA SR2 - PCIE 2.0
> R9 290x Reference x2 CF
> AMD 15.3 beta drivers
> Intel x5650 xeon OC 4.0/4.4
> 24GB DDR3, @1600mhz
> Win7 Pro 64bit


Aside from the Titan X, assuming you're considering either camp, it looks like the 290X is doing rather well in DX11 if you ask me.


----------



## xer0h0ur

I am talking about tri-fire performance; I am not complaining about 295X2 performance alone. It's only since I added the 290X to the equation that I have seen complications or low usage. I also am not exactly overclocking my 4930K very hard, so I wouldn't be surprised if I am bottlenecking there.


----------



## DividebyZERO

Not sure anyone will post tri-fire triple-4K benchmarks on a review site. If you're running 4K, I don't see you getting low GPU usage on a regular basis. I don't really see your CPU bottlenecking at 4K either.


----------



## wermad

Nav did one (calling it "hybrid crossfire"). It should be in the op.


----------



## xer0h0ur

Huh? I am on a single 4K monitor. There ain't a chance in hell I would try to push two or three 4K monitors for gaming. That is asking for a slideshow.


----------



## DividebyZERO

My point wasn't to imply you should do any of that. My point was that the 290X/295X2 in CF, and in tri-fire as well, is kicking ass for its age. I don't have a 295X2 to run in tri-fire with a 290/290X. You are running 4K and you stated your CPU might be holding you back; I was replying that at 4K your CPU most likely isn't affecting your FPS, even in tri-fire. I guess I wasn't being clear.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> Hugh? I am on a single 4K monitor. Ain't a chance in hell I would try to push 2 or 3 4K monitors for gaming. *That is asking for a slideshow*.


Honestly, I am tired of people talking down the capability of triple 4K and/or the 295X2/290X. My panels only run 30Hz at 4K, and on top of that I have no way to capture the quality of what it looks like and show it to anyone online. I only have a crappy cellphone camera (24fps if I'm lucky) and YouTube compression to further assault the quality. Yeah, you may not be able to run max settings in all your games. Yes, some games will run better and others may not run at all. Calling it a slideshow is what bothers me. I play quite a few games at triple 4K, and yes, at 30Hz I am only getting half the visual speed it could go.

In my sad attempt to show it's not a slideshow, I made a video; even at 1080p, the max my phone will do, it still doesn't convey how amazing triple 4K is in visual fidelity. It's like trying to tell a really awesome story where you had to be there to witness it. At the very least it shows GPU usage and FPS. Yes, I know I am not moving around, but that's because I am holding the camera. When I get some time after I move to my new place, maybe I can invest in some way to demo it better.

this was [email protected] bioshock infinite


----------



## Pandora's Box

Pretty impressive, though the bezels drive me insane. Much prefer 21:9 3440x1440.


----------



## F4ze0ne

Definitely impressive.









I unfortunately can't deal with bezels either, or I'd have Eyefinity by now.

I'm going to be looking at a 1440p 144Hz monitor instead, since these cards are stuck at 60Hz for 4K anyway.

Once you go high-Hz monitor, it's tough to go back.


----------



## xer0h0ur

With all due respect, I have every right to my opinions, and frankly I don't give a damn if it bothers you. I didn't spend $1500 on a 295X2 and another couple hundo on the 290X, plus a few hundos to water cool everything, to have to accept your opinions as mine.


----------



## Alex132

Quote:


> Originally Posted by *DividebyZERO*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Hugh? I am on a single 4K monitor. Ain't a chance in hell I would try to push 2 or 3 4K monitors for gaming. *That is asking for a slideshow*.
> 
> 
> 
> Honestly i am tired of people talking down the capability of triple 4k and or 295x2/290x. My panels only run 30hz at 4k, on top of that i have no way to capture the quality of what it looks like and show it to anyone online. I only have a crappy cellphone camera(24fps if im lucky) and youtube compression to further assualt the quality. Yeah you may not be able to run max settings in all your games. Yes some games will run better and others may not at all. To call it a slideshow is what bothers me. I play quite a few games at triple 4k and yes at 30hz i am only getting half the visual speed it could go.
> 
> In my sad attempt to show its not a slideshow i made a video that even 1080p the max my phone will go it still doesnt even convey how amazing triple 4k is in visual fidelity. It's like trying to tell a story thats really awesome but you have to be there to witness it.. at the very least it shows gpu usage and fps. Yes i know i am not moving around but i am instead holding the camera. when i get some time after i move to my new place maybe i can invest in some way to demo it more.
> 
> this was [email protected] bioshock infinite
Click to expand...

30Hz panels? No way in hell could I use those. 60Hz annoys me enough; I've tried 30Hz before and I really, really hated it.

I'd rather have a single 4K 60Hz over 3x 4K 30Hz, to be honest. But I'd rather have 3440x1440 IPS 120Hz over any of those.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Hugh? I am on a single 4K monitor. Ain't a chance in hell I would try to push 2 or 3 4K monitors for gaming. *That is asking for a slideshow*.
> 
> 
> 
> Honestly i am tired of people talking down the capability of triple 4k and or 295x2/290x. My panels only run 30hz at 4k, on top of that i have no way to capture the quality of what it looks like and show it to anyone online. I only have a crappy cellphone camera(24fps if im lucky) and youtube compression to further assualt the quality. Yeah you may not be able to run max settings in all your games. Yes some games will run better and others may not at all. To call it a slideshow is what bothers me. I play quite a few games at triple 4k and yes at 30hz i am only getting half the visual speed it could go.
> 
> In my sad attempt to show its not a slideshow i made a video that even 1080p the max my phone will go it still doesnt even convey how amazing triple 4k is in visual fidelity. It's like trying to tell a story thats really awesome but you have to be there to witness it.. at the very least it shows gpu usage and fps. Yes i know i am not moving around but i am instead holding the camera. when i get some time after i move to my new place maybe i can invest in some way to demo it more.
> 
> this was [email protected] bioshock infinite
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> 30Hz panels? No way in hell could I use those
> 
> 
> 
> 
> 
> 
> 
> 
> 60Hz annoys me enough, I've tried 30Hz before - I really really hated it.
> 
> I'd rather have a single 4k 60Hz over 3x4k 30hz to be honest. But I'd rather have *3440x1440 IPS 120Hz* over any of those
Click to expand...

That's the one I'm waiting on... hopefully there will be a FreeSync model as well.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> With all due respect I have every right to my opinions and frankly I don't give a damn if it bothers you. I didn't spend $1500 on a 295X2 and another couple hundo on the 290X plus a few hundos to water cool everything to have to accept your opinions as mine.


I think we have a communication disconnect. If you don't find Hawaii tri-fire (stock clocks) running triple 4K at 60fps impressive, fair enough. I was merely showing GPU usage, which you said is an issue for you. I am not trying to tell you to accept anything I say as your own. You made your own observations and expressed them here. I guess everyone will have different experiences with their own hardware.

I'm out. Good luck.


----------



## mah111

I played some Metro LL yesterday and noticed some significant frame drops using Advanced Physics.
I have decided to back up my 295X2 with an Nvidia GeForce GTX 460 or a GT 740 GDDR5 using the hybrid PhysX mod.

Have any of you guys tried this kind of config?
What is your experience with it?


----------



## joeh4384

Quote:


> Originally Posted by *mah111*
> 
> I played some Metro LL yesterday and I noticed some significant frame drops using Advanced Physics.
> I have decided to backup my 295x2 with NVidia Gforce GTX 460 or GT740 GDDR5 with hybrid physX mod.
> 
> Have any of you guys tried this kind of config?
> What is your experience on this?


I know the latest Nvidia drivers automatically shut this down when they detect a Radeon card installed.


----------



## mah111

Quote:


> Originally Posted by *joeh4384*
> 
> I know the latest Nvidia drivers automatically shut this down when they detect a radeon card installed.


I believe I'm not forced to use the latest available drivers?


----------



## Alex132

Quote:


> Originally Posted by *mah111*
> 
> I played some Metro LL yesterday and I noticed some significant frame drops using Advanced Physics.
> I have decided to backup my 295x2 with NVidia Gforce GTX 460 or GT740 GDDR5 with hybrid physX mod.
> 
> Have any of you guys tried this kind of config?
> What is your experience on this?


PhysX is for chumps.

But seriously, what CPU do you have, and have you overclocked it? IIRC, PhysX wasn't an issue with my 5870 in the original Metro. I played Metro LL with my GTX 690 so I can't say there, and I am still waiting on my R9 295X2.

A GTX 460 will be faster than a GT 740 for PhysX, and it will work with drivers that allow it.
Quote:


> Originally Posted by *mah111*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> I know the latest Nvidia drivers automatically shut this down when they detect a radeon card installed.
> 
> 
> 
> I believe I'm not forced to use the latest available drivers?
Click to expand...

This too.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> But I'd rather have *3440x1440 IPS 120Hz* over any of those
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's the one I'm waiting on.....hopefully there will a be a Freesync model as well
Click to expand...

This man knows what's up


----------



## wermad

Is the advanced physics option in Metro LL the same as Nvidia PhysX? If I recall, I had to turn this off with my tri 780 SCs.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Is advance physics option in metro LL the same as Nvidia physx? If i recall, I had to turn this off with my tri 780s SC.


Yes.
It might use (or have used) the CPU only.

You might have had the NV control panel set to use the CPU for PhysX instead of a GPU.

Out of interest, do you think a GT 430 could do PhysX and actually not slow down my system?


----------



## Alex132

Looking at that video, I definitely had it disabled too, or they massively downgraded it when the actual game came out.



Spoiler: Warning: Spoiler!











http://physxinfo.com/news/11443/gpu-physx-in-metro-last-light/


----------



## mah111

Quote:


> Originally Posted by *wermad*
> 
> Is advance physics option in metro LL the same as Nvidia physx? If i recall, I had to turn this off with my tri 780s SC.


I think this video explains the difference in Metro LL






My only concern is that newer games like The Witcher 3 will most probably use PhysX 3.0 instead of 2.0, and the mod will not be patched for it anymore.


----------



## Mega Man

I just don't get PhysX. Saw it, and was less than impressed.


----------



## DividebyZERO

Quote:


> Originally Posted by *Mega Man*
> 
> I just don't get physx.. saw it. And was less then not impressed


The Batman series is one of the best showcases for PhysX; it makes the games feel totally different. It's one of the rare examples, though.


----------



## polo81920

Did you ever get the R9 295X2 working on the hackintosh? I'm currently on standby because I want to build a hackintosh, yet I cannot find enough info on whether this card can run in a hackintosh or if there are any workarounds. I would hate to get rid of it because it's a great card, but I would like to get away from Windows for everything other than gaming.


----------



## divotion

Hi polo81920,

I have a hackintosh build running Yosemite with this card. It works, but very badly: without QE/CI there is no acceleration. I tried adding the card ID 0x100267b9 to the kext driver, but it didn't work.

The card is good. For the hackintosh I added a low-budget Nvidia GT 210 and just switch inputs on the monitor.


----------



## Mega Man

Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I just don't get physx.. saw it. And was less then not impressed
> 
> 
> 
> Batman series is one of the best games to showcase physx. It makes the game feel totally different. Its one of the rare examples though.
Click to expand...

Still, that is the game I tried, and it's where the above statement came from.


----------



## polo81920

Hey, thanks for taking the time to respond to my question.
Do I understand correctly that you are running both an R9 295X2 and an Nvidia card on the same motherboard, and switch input settings on the monitor for the hackintosh? I wasn't aware that you could run different cards on the same mobo.


----------



## divotion

Hi again polo81920

Your Welcome

Not exactly correct. My Story:

I had buy a 295x2 and hoping that i can modify to run it over windows for games and hackintosh.

Gaming on Hackintosh is nothing on comparsion to windows. The Drivers are verry bad on osx in comparison. For amd and nvidia. But to work and have stability and scope, osx for me go better way while is unix system. (rsyncd, rsync, Datasecure. tricky things same as linux but with usability.)

I like overall hackintosh more that one original. go faster for better money. win win. But Hand Work on it. 

I know that the 290x Chip is going out of the box with hackintosh yosemite. For recognized the 295x2 in osx i have tried a lot of ways. With qe/ci to understand. without isnt possible to normal work or something else on osx.

In past with the 6970x2 it wasnt a problem. 2x 6970 amd chips. osx regognized it at 2 single 6970 chips and can access more monitors over the card as usual in hackintosh.

Therfore i think is the same with the 295x2. 2x 290x. BUT IT ISNT.

Over spend a mutch time and try to modify kext and bootloader Framebuffer / clover and chameleon with combinations and the backward compatibility kext amd8000 from netkas for mavericks. it want go. qe/ci i i cant enable it on my card and yosemite. And the youtube clover framebuffer try i have tried too.

Then the idea to add a 290x for hackintosh and in windows to have crossfire 3x? That Idea was great. I buy a 290x xfx dp. The Problem was then another.

Only 290x perfect. with qe/ci. Awesome. But with attached the 295x2 the 290x take problems too. Attached with my 3 Monitors that wasnt go too in combinations with windows and yosemite. And attache / reattach for dualboot i wont go that way. nonono.

3x Crossfire have make problem when was attached a monitor on 290x. yosemite give problems when was attached a monitor on 295x2. Tried to chang gpu priority on my bios.

I have spend a lot of time. Gived up that. Make no sense.

Then i added too the low budget gt210   and putted in 2nd input on my main Monitor. Now are 4 gpus in my Computer.

The gt210 is disabled in windows in hardware / device manager. when i enable that then i have fps lower. therefore i disable the low budget gpu in windows and have 3x crossfire 290x chips. Everything normal. nvidia amd together np with the current nvidia driver suite. But need to do: Disable it in device manager for have not losses in games.

In hackintosh to go the secure way is remove all amd and ati kexts with rm -rf in /system/library/extensions/.

or let it there... isnt a problem too. while that gpus recognized osx as vesa ( 295x2 ) and 290x as right. But over described in Combination wont work.

In this way i have the best solution that i can go. I do everything. I havnt power on qe/ci in yosemite with my gt210. But i dont need it.

You could go with a better low-budget card for QE/CI in Yosemite, but given the problems I had with the 290X, I don't know whether another AMD card is a good way; you can try that too. I'd recommend an NVIDIA card, so the kexts and drivers are split completely; not just a device ID or plugin, but a completely different driver stack. All it needs is QE/CI out of the box with Yosemite.

A low-price card with good QE/CI: http://www.tonymacx86.com/yosemite-desktop-guides/158163-success-rewrite-yosemite-10-10-2-msi-z97-gaming-7-i5-4460-asus-strix-gts-960-silent-beast.html

An NVIDIA GTX 960, maybe, for Yosemite. But my €25 GT 210 does the job I need in OS X.

A stronger GPU in OS X only makes sense if you need GPU power for Cinema 4D, making movies, Final Cut, and so on. For plain work it makes no sense.

So I have a good dual-boot system for every need I have: GPU power in Windows, work stability on the Mac, and Linux to repair software things.

In the end it's your choice; I am not you. But you can combine GPU drivers if you disable the other card in each operating system.

ADDITIONAL: In Windows, install the MacDrive driver suite to access your HFS partitions. The Boot Camp driver is not the way to go; it causes more problems. This way you can reach your data from every operating system.

But pay attention: MacDrive for Windows is good, better than Apple's driver, but it has problems too. On a RAID you can hit issues; Yosemite may then refuse to mount that disk (inode problems).

The painless repair is to boot into Yosemite, fire up DiskWarrior, and repair the volume, without data loss. I've done it five times in two years now, and with a 12TB RAID 6 data array I haven't lost anything.

Just something to know if you need to access HFS partitions from Windows/Linux too.

regards
divotion


----------



## mah111

Quote:


> Originally Posted by *Mega Man*
> 
> Still that is the game I tried. And the above statement came from.


Hi,
I think you might be right.
PhysX seems to be dying slowly, and few developers use it now (maybe two or three NVIDIA-backed titles).
Good engines like CryEngine or Unreal Engine 4 have their own physics implementations that run well on every GPU (with the same or better effects than PhysX).

I came to the conclusion that one game is not worth the effort and money spent on an NVIDIA card.
I wish NVIDIA had stayed with the BFG/Ageia plan of a dedicated hardware physics-accelerator card compatible with all other hardware.
Unfortunately, they had their own interest in implementing Ageia's solution in their cards only.


----------



## cennis

This is my second water-cooled build, with a focus on top-tier performance and cooling in a compact case. This machine is used mainly for benching, schoolwork and gaming. The highlight of the build is the water cooling loop. The emphasis is placed on using fittings and adapters instead of tubing to connect in tight spaces while maintaining nice lines. Total tubing used was probably less than 10 inches; total fittings used was around 40. Had a great experience modding this.


Spoiler: Warning: specs!



CPU: 4.7ghz i7-4770k
GPU: R9 295x2
RAM: Trident X 2x4GB
MOBO: ASUS Maximus Impact VI

Watercooling:
Aquacomputer 295x2 nickel plexi block with active backplate
Maximus VI impact full cover nickel plexi polished
EK, Enzotech, Bitspower, swiftech bits






Spoiler: Warning: More Pics!














BENCH


Spoiler: Warning: Spoiler!






*

If you guys liked this build, head over to Mod of the Month to show some support!*


----------



## DividebyZERO

Quote:


> Originally Posted by *cennis*
> 
> This is my second water-cooled build with the focus of top-tier performance/cooling in a compact case. This machine is used mainly for benching, schoolwork and gaming. The highlight of the build is the water cooling loop. The emphasis is placed on using fittings and adapters instead of tubing to connect in tight spaces while maintaining nice lines. Total tubing used was probably less than 10 inches. Total fittings used was around 40. Had a great experience modding this.
> 
> 
> Spoiler: Warning: specs!
> 
> 
> 
> CPU: 4.7ghz i7-4770k
> GPU: R9 295x2
> RAM: Trident X 2x4GB
> MOBO: ASUS Maximus Impact VI
> 
> Watercooling:
> Aquacomputer 295x2 nickel plexi block with active backplate
> Maximus VI impact full cover nickel plexi polished
> EK, Enzotech, Bitspower, swiftech bits
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: More Pics!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BENCH
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> *
> 
> If you guys liked this build, head over to Mod of the Month to show some support!*


Really nice man.


----------



## cennis

Quote:


> Originally Posted by *DividebyZERO*
> 
> Really nice man.


Thanks. These cards generate a lot of heat overclocked; even with these rads, the backplate gets scorching hot.


----------



## mah111

That's a great looking mod man!
Grats!
What's the CPU inside?
Did you manage to measure noise level of the RIG under load?


----------



## cennis

Quote:


> Originally Posted by *mah111*
> 
> That's a great looking mod man!
> Grats!
> What's the CPU inside?
> Did you manage to measure noise level of the RIG under load?


CPU: 4.7ghz i7-4770k delidded
GPU: R9 295x2 1200mhz
RAM: Trident X 2x4GB
MOBO: ASUS Maximus Impact VI

Noise level depends on how high I turn the AP182 180mm fans. They come with a built-in fan controller knob that lets me adjust them from 0~2500rpm. Under a fully overclocked benchmark load I have to run them at ~1500rpm, maybe close to 40dB?

During normal work or gaming it's very quiet, since all fans run below 500rpm (or off).

Vote for me http://www.overclock.net/t/1546811/ocn-mod-of-the-month-march-2015-amateur-class-vote-now if you enjoyed this build!


----------



## wermad

What's your x score in 3d11?


----------



## mah111

Quote:


> Originally Posted by *cennis*
> 
> CPU: 4.7ghz i7-4770k delidded
> GPU: R9 295x2 1200mhz
> RAM: Trident X 2x4GB
> MOBO: ASUS Maximus Impact VI
> 
> Noise levels depends on how high I turn the AP182 180mm fans. They come with a built in fan controller knob that lets me adjust them from 0~2500rpm. Under a fully overclocked benchmark load I have to run them at 1500rpm~ maybe close to 40db?
> 
> During normal work or gaming its very silent since all fans will be < 500 rpm (or off)
> 
> Vote for me http://www.overclock.net/t/1546811/ocn-mod-of-the-month-march-2015-amateur-class-vote-now if you enjoyed this build!


That's quite an impressive overclock for a 295X2. (I managed to push mine to 1055 without visible artifacts.)
Also, for a CPU that must generate a lot of heat with that overclock, a noise level near 40dB is a good result.

What was the cost of the watercooling itself?


----------



## cennis

Quote:


> Originally Posted by *wermad*
> 
> What's your x score in 3d11?


I didn't run that benchmark much; I can run it tonight though.

I looked at your rig, and in 3DMark 11 the CPU skews the Graphics Test 1 score a LOT.

Here is my 3DMark 2013 run at 1175MHz: http://www.3dmark.com/3dm/4118697

Tessellation disabled.
Quote:


> Originally Posted by *mah111*
> 
> That's a quite impresive overclock for 295x2. (I managed to push mine to 1055 without artefacts visible)
> Also for CPU which must generate a lot of heat with this overclock, noise level near 40dB is a good score out there.
> 
> What was the cost of the watercooling itself?


If you up the voltage and power limit, you should be able to get 1100 easily. At 1200MHz there are artifacts, but it's benchable. In games I usually dial down to 1150 for stability.

My watercooling parts are mostly used, except the Aquacomputer block. I would say around 500 USD; it really adds up.


----------



## wermad

P scores are passe imho... we gotta move on to X/1080 or FS. It's the same with Vantage; it's just not up to modern bench expectations. I run P but typically for stability, and then I run X or FS. I need to run FS 1440 and 2160 with my setup.

Nice OC on the card. I've yet to touch mine.


----------



## cennis

Quote:


> Originally Posted by *wermad*
> 
> P scores are passe imho... we gotta move on to X/1080 or FS. It's the same with Vantage; it's just not up to modern bench expectations. I run P but typically for stability, and then I run X or FS. I need to run FS 1440 and 2160 with my setup.
> 
> Nice OC on the card. I've yet to touch mine.


Cool, yeah, 3DM11 P used to be the standard since it's free.


----------



## wermad

The freebie does come with access to some higher settings, I believe. I know they lock the custom "advanced" option. I got both of mine when I bought some GPUs; actually, I only redeemed one of my FS codes, from one of my triple EVGA 780 SCs.


----------



## steezebe

Ladies and Gentlemen, It's time:



You can see the 295x2 with the water block on it in there somewhere. It's a Corsair Air 240. So far I'm very impressed with the case.

From UPS delivery to leak test: 4 hours... wish me luck!


----------



## wermad

Schweetz


----------



## Medusa666

Hey guys, I just ordered a R9 295X2 and a 1600W power supply to go with it.

I only have one question: do I need / should I use the extra PCI-E power connector on my Gigabyte X99 Gaming 5 motherboard?

Would it be beneficial to hook the PCI-E lanes up with more power?

It is a small 4 pin connector located south of the PCI-E slots.

Let me know.

Thanks!


----------



## joeh4384

Quote:


> Originally Posted by *Medusa666*
> 
> Hey guys, I just ordered a R9 295X2 and a 1600W power supply to go with it.
> 
> I only have one question, do I need / should i use the extra PCI-E lane power connector on my Gigabyte X99 Gaming 5 motherboard?
> 
> Would it be beneficial to hook the PCI-E lanes up with more power?
> 
> It is a small 4 pin connector located south of the PCI-E slots.
> 
> Let me know.
> 
> Thanks!


I don't think you need it for one. It might help if you ran two.


----------



## wermad

^^^ This.

If you run one or two GPU cores, you don't need the GPU auxiliary power. If you're unsure, refer to the manual. Sin recommended using it on their boards when you run three or four GPU cores. I actually forgot to plug mine in once (Sniper 5 Z87) with my triple 290s and didn't notice a difference.


----------



## Mega Man

Assuming tri-fire, then yes, plug it in.


----------



## ViRuS2k

I have been fiddling around with my watercooling setup; the card is a beast now.

1150/1690, max temps in Crysis 3 @ 4K: 60C.

Also upgraded the PSU from a 1250W OCZ single-rail modular to an 8Pack 2000W unit; again, a beast of a PSU, as I plan on getting another 295X2 for quad in the future, or 3xx cards when they're out...

i7-4790K @ 4.9GHz, delidded.

Basically trying to get my system ready for The Witcher 3, lol.


----------



## cennis

Quote:


> Originally Posted by *ViRuS2k*
> 
> I have been fiddling around with my watercooling setup; the card is a beast now.
> 
> 1150/1690, max temps in Crysis 3 @ 4K: 60C.
> 
> Also upgraded the PSU from a 1250W OCZ single-rail modular to an 8Pack 2000W unit; again, a beast of a PSU, as I plan on getting another 295X2 for quad in the future, or 3xx cards when they're out...
> 
> i7-4790K @ 4.9GHz, delidded.
> 
> Basically trying to get my system ready for The Witcher 3, lol.


What rads do you have? And what fans/speeds?


----------



## TooManyAlpacas

I just bought CoD: World at War, and for some reason it puts up a small square screen and won't load. I think it has to do with the GPU, but I don't know. Any advice? Or does anyone own the game and know how to fix it?


----------



## wermad

I had fun with that game in 5x1 Eyefinity (quad 7970 Lightnings). I'll install it and run it in 4K. Have you uninstalled and reinstalled it, btw?


----------



## TooManyAlpacas

no I will try it right now


----------



## TooManyAlpacas

yeah still not working for me


----------



## wermad

Latest DirectX installed? Launching it now...


----------



## TooManyAlpacas

I got it to work; I ended up having to run it as administrator.


----------



## wermad

Cool, glad you got it working.

The game crashed at first; it was just stuck on an 800x600 blank screen with the cursor. After a reboot it launched properly (hit No for safe mode), I switched to 3840x2160, launched the solo campaign, and it worked.


----------



## ViRuS2k

Quote:


> Originally Posted by *cennis*
> 
> what rads do you have? and fan /speed


The fans are yellow 120mm Arrow things, lol.
The loop goes GPU to pump, then to a thick 360mm front radiator, then to the 360mm top rad, then to the CPU, then out of the case to a 420mm thick Alphacool Monsta radiator, and back into the case and into the GPU.
I did it that way so the heat from the CPU does not reach the graphics card.

And I'm sure that setup will cool another 295X2 with ease.


----------



## xer0h0ur

You guys must have your GPUs loading in the 30s with setups like that, lol. Wish I had a case with that much radiator real estate.


----------



## babeasaurousrex

Hi guys, I was wondering if you could maybe help me out with a problem I'm having with my R9 295X2.
I'm TRYING to play Far Cry 4, but it keeps freezing up my whole system; I have to hard-reboot to get anywhere.
I've deleted all drivers (even from the registry) and reinstalled, and made a profile for it in Catalyst in an attempt to make it run on one GPU.
Any ideas? I'm only asking here because no one else seems to know anything about the 295X2.


----------



## F4ze0ne

Did you run DDU for the driver uninstall?

And what PSU are you using?


----------



## babeasaurousrex

Mmm, idk what DDU is, but my PSU is an EVGA 800W 80+ Bronze.


----------



## F4ze0ne

Try running DDU to clean out the drivers and then start fresh with 15.3.

http://www.wagnardmobile.com/DDU/download/DDU%20v14.2.0.0.exe

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Also, make sure each power port on the 295x2 has a dedicated pcie plug coming from the psu.

8 pin -> 8 pin
8 pin -> 8 pin
8 pin -> 6 pin


----------



## babeasaurousrex

OK, I've tried 15.3 once before and it seemed to make things worse, so I went back to the latest non-beta. BUT I did not try it with DDU, so I will do that now and check the power feeds while I'm at it.


----------



## F4ze0ne

You want to use 15.3 because it contains the crossfire profile for Far Cry 4.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> You guys want to have your GPUs loading in the 30's with setups like that lol. Wish I had a case with that much radiator real estate.


Lol, I don't think that's possible unless it's below zero in your area and you run a rad outside your home. I idle in the 30s and for sure load in the 40-50s, depending on fan speed and ambient.


----------



## babeasaurousrex

OK, so I used DDU and installed 15.3... I also took a look at the power going into my card, and they are separate lines, BUT I only have two 8-pin connectors, no 6-pin. Is that right? My R9 290X in my other tower has two 8-pins and a 6-pin. I guess I just thought bigger card = more plugs.

I'll try playing to see if it still freezes up; I'll check back when I know. Thanks for now.


----------



## xer0h0ur

You need to look at your PSU's 12V rail setup. If it's multi-rail, then you need to make sure you're connecting each 8-pin to its own rail, with nothing else plugged into those rails.


----------



## babeasaurousrex

ok still froze... and you lost me on that last one


----------



## wermad

Which power supply do you have? Your rig specs only show "800w 80 plus".


----------



## babeasaurousrex

AZZA 800W 80 PLUS POWER SUPPLY PSAZ-800S12

I just opened it up to be sure; I was wrong earlier trying to go from memory, my other tower has the EVGA.


----------



## Mega Man

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cennis*
> 
> what rads do you have? and fan /speed
> 
> 
> 
> The fans are yellow 120mm Arrow things, lol.
> The loop goes GPU to pump, then to a thick 360mm front radiator, then to the 360mm top rad, then to the CPU, then out of the case to a 420mm thick Alphacool Monsta radiator, and back into the case and into the GPU.
> I did it that way so the heat from the CPU does not reach the graphics card.
> 
> And I'm sure that setup will cool another 295X2 with ease.

Hate to tell you this, but in 99% of cases loop order does not matter; the coolant reaches an equilibrium temperature so quickly that component order barely changes temps (the one exception is that the reservoir should feed the pump).
Quote:


> Originally Posted by *babeasaurousrex*
> 
> HI guys, i was wondering if you could maybe help me out with a problem im having with my R9 295 x2?
> im TRYING to play Farcry 4 but its keeps freezing up my whole system, gotta hard reboot to get any where.
> iv deleted all drivers "even from registry" and reinstalled, made a profile for it in catalyst in an attempt to make it run on one GPU.
> any idea? im only trying here because no one else seems to know anything about the 295x2


Remove ubicrap Far Cry 4; problem solved.

@wearemad speak for yourself (rad space/temps)


----------



## doctakedooty

So guys, it took a few months because of some issues fitting all this in the case, but it's finally up and running. The case is a Corsair 250D with some custom work done.

Made a custom midplate to hide the wires below.


----------



## babeasaurousrex

i know
Quote:


> Originally Posted by *Mega Man*
> 
> Remove ubicrap Far Cry 4; problem solved.
> 
> @wearemad speak for yourself (rad space/temps)


Yeah, I know. My husband hates Ubisoft.


----------



## wermad

Quote:


> Originally Posted by *babeasaurousrex*
> 
> AZZA 800W 80 PLUS POWER SUPPLY PSAZ-800S12
> 
> i just opened it up to be sure, i was wrong earlier trying to go from memory, my other tower has the evga.


Can you check the unit again? I can't find anything on this model. Don't see it here either:

http://www.realhardtechx.com/index_archivos/Page13100.htm

Google shows nothing but 850w units. Is it the Titan 80+ model?

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4890#post_23654185

Quote:


> single Azza Titan 850W 80+ 20A, 20A, 20A, 33A, 33A, 20A
> single Titan 900W BRONZE 74A
> single Titan 1000W BRONZE 83A
> single Ultima 850W GOLD 70A
> single Platinum 850W PLATINUM 70A
> single Platinum 1000W PLATINUM 83A


Quote:


> Originally Posted by *Mega Man*
> 
> @wearemad speak for yourself
> 
> 
> 
> 
> 
> 
> 
> ( rad space/temps)


Hehehe, and one more rad is pending.
Quote:


> Originally Posted by *doctakedooty*
> 
> So guys took a few months because of some issue fitting all this in the case but it's finally up and running. The case is a corsair 250d with some custom work done.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Made a custom mid plate to hide the wires below


sweet looking setup


----------



## babeasaurousrex

Here you go.

I used Bing ("yeah, I Bing, what of it?"). Most of what I found was people asking the same sort of questions; perhaps this power supply is poo.


----------



## Bogusburger

I think the issue I have been having with my FPS is a CPU bottleneck; my CPU is an i5-3570.
This was running Crysis 3 at Ultra 1080p, getting around 30-40fps.
Would you guys say this is a CPU bottleneck?

Thanks for any help


----------



## wermad

Quote:


> Originally Posted by *babeasaurousrex*
> 
> 
> 
> Here you go.
> 
> I used Bing ("yeah, I Bing, what of it?"). Most of what I found was people asking the same sort of questions; perhaps this power supply is poo.


Got it.

Well, I didn't see anything online, but the obvious is there in the pic you provided: it's got 64.5A on the single 12V rail. Tbh, that's less than the recommended 70A on a single rail for a single card. Also, are you using any adapters on the PSU? I know these can cause issues.
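Since the card is fed almost entirely from the 12V rail, the amperage figure converts straight to watts. A quick sanity check in shell; the ~500W card draw and ~200W rest-of-system figures below are my own ballpark assumptions, not official numbers:

```shell
# Convert the 12V rail rating to watts and estimate headroom (awk handles
# the floating-point math; plain POSIX shell arithmetic is integer-only).
rail_amps=64.5
rail_watts=$(awk -v a="$rail_amps" 'BEGIN { printf "%.0f", a * 12 }')
echo "12V rail capacity: ${rail_watts} W"   # prints: 12V rail capacity: 774 W

# Ballpark: ~500 W for an R9 295X2 under load plus ~200 W for an overclocked
# CPU, drives, and fans; whatever is left is the safety margin.
awk -v w="$rail_watts" 'BEGIN { printf "headroom: %.0f W\n", w - 500 - 200 }'
```

With only about 74W of paper margin, and real PSUs sagging under sustained load, the 70A (840W) single-rail recommendation in the thread's list makes sense.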

Btw, no harm in using Bing.








Quote:


> Originally Posted by *Bogusburger*
> 
> I think the issues I have been having with my fps is a cpu bottleneck. my cpu is an i5 3570
> This was running Crysis 3 at ultra 1080p, was getting around 30-40fps.
> Would you guys say this is a cpu bottleneck?
> 
> Thanks for any help


Are you running stock? It looks like both are pegged, but the GPU load seems erratic, tbh. Try a benchmark and see if you get the same results. Typically the GPU loads higher in graphics-oriented tests.


----------



## babeasaurousrex

Nope, no adapters. Welp, it kinda sucks to have to replace parts on a tower less than a month old; I shoulda built it myself instead of ordering from CyberPower, lol.
Oh well, live and learn. Any recommendations on a suitable PSU?


----------



## Bogusburger

I am running stock and I get similar fluctuations in gpu load in everything.


----------



## wermad

Quote:


> Originally Posted by *babeasaurousrex*
> 
> nope no adapters. welp this kinda sucks to have to be replacing parts on a tower less then a month old, shoulda built it myself instead of ordering from cyberpower lol.
> oh well live and learn, any recommendations on a suitable PSU


The power requirements for the 295X2 are very specialized. It sucks; I was also caught off guard, though I did get a unit that can run two (my old unit was good for one). I ended up creating a list for this question. We're working to switch the OP to an active member, so the list can be added to the first post soon. But here it is:

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4890#post_23654185

Scroll down the list to see if any unit you're interested in can handle a 295X2. The ones marked with * are very close to the limit. If you're going to go crossfire (quad-fire, basically), you need one that's flagged as "crossfire".
Quote:


> Originally Posted by *Bogusburger*
> 
> I am running stock and I get similar fluctuations in gpu load in everything.


Hmmm... you don't have any background programs running at the same time that could be using your CPU? Crysis 3 is a demanding game; do you have anything else you can try? Maybe try lowering the settings and see if the CPU load drops? AMD can have issues at lower resolutions (NVIDIA shines better here, tbh), which could explain the erratic GPU load. Once you get to the extreme resolutions, GPU power matters more.


----------



## DividebyZERO

Triple 4K Eyefinity "30Hz slideshow" demo, for those who doubt the power of Hawaii. Sadly the video is only 720p, my camera kept blowing out the scope in shots, and the autofocus was being silly. It's far from a slideshow, and playable for those who aren't too picky. Forgive the cell-phone video and amateur camera work; I'll fix all this soon enough and maybe post proper-quality videos, because this looks pale in comparison to the actual panels.

Sniper Elite 3 - 6480x3840 - Ultra preset (in the video around 35 seconds)

Metro LL Redux - 6480x3840 - Medium, I think (it's in the video)

Also, if this is too far off-topic for this thread, please let me know.


----------



## Orivaa

You using one or two 295x2's?


----------



## DividebyZERO

Quote:


> Originally Posted by *Orivaa*
> 
> You using one or two 295x2's?


I'm on 290X quad-fire, but some games run on tri-fire or even plain CrossFire (i.e. a 295X2). It really depends on the title, and the 295X2 has DisplayPort, so 60Hz is now within reach for 4K.

My 290X only has one DP, and I'm using HDMI x3 (DVI/DVI/HDMI) for now. Waiting on the 390X or whatever AMD has next.


----------



## Feyris

Temporarily using a CS750M till SF gets here with the rig... living dangerously.


----------



## Bogusburger

I did some more testing; it seems the fluctuations increase with higher CPU usage. The FPS in ArmA seems to be the same with CFX enabled as with it disabled.

*50% usage in Heaven, not 5000%


----------



## Alex132

Quote:


> Originally Posted by *Bogusburger*
> 
> I did some more testing, it seems that with higher cpu usage the fluctuations increase. The fps in arma seems to be the same with CFX enabled as with it disabled.
> 
> 
> *50% usage in heaven, not 5000%


5000% CPU usage, lol. What res are you at?

Even 50% CPU usage in Heaven is really high; I normally get around, like... 5%?

Hang on, let me test my CPU usage... 8600 GT AWAY~

My CPU was at about 12-15% and it wasn't clocking up (stuck at ~1600MHz). When I forced it to 4.9GHz, it sat at about 4-6% usage.


----------



## kayan

Quote:


> Originally Posted by *Bogusburger*
> 
> I did some more testing, it seems that with higher cpu usage the fluctuations increase. The fps in arma seems to be the same with CFX enabled as with it disabled.
> 
> 
> *50% usage in heaven, not 5000%


This may be a stupid suggestion, but are you playing in fullscreen? Crossfire doesn't work in windowed or even borderless fullscreen window mode.


----------



## Bogusburger

These were all at 1080p. I might be getting a new CPU anyway; my current one is an i5-3570, and I might get a 4690K so I can overclock. That might be the issue, as the CPU is maxed out quite a lot of the time and it does look like it limits the GPUs. To kayan: I am running fullscreen.


----------



## wermad

Sounds like a plan. Remember the 3570 is LGA1155 and the 4690K is LGA1150, so you'll need a new board. But these should be plentiful and cheap now that Z97 and Z87 have been out for a while. Some Z87 boards can get finicky with DC chips (i.e. 4690K, 4790K, etc.), btw.


----------



## Alex132

Quote:


> Originally Posted by *Bogusburger*
> 
> These were all at 1080p. I might be getting a new cpu anyway. My current one is a i5 3570 and I might get a 4690k so I can overclock. That might be the issue as the cpu is maxed out quite a lot of the time and it does look like it limits the gpu's. To kayan, I am running fullscreen.


I am really confused how you got 50% usage in Heaven 4.0. I didn't even get that much usage with a 1600MHz 2500K...

I am using MSI AB to monitor CPU usage, with the update interval set to 1000ms, then running the benchmark and looking at the average over the past minute or so.

I also remember doing a Heaven 2.5 benchmark where even one core at 2.4GHz BARELY impacted performance.

The GPU was a 5870.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Sounds like a plan. Remember 3570 is lga1155 and 4690k is lga1150, so you'll need a new board. But these should plentiful and cheap now that z97 and z87 have been out for a while. Some z87 boards can get finicky with DC (ie 4690k, 4790k, etc.) btw.


Z97 > Z87.

Z97 almost always yields better overclocks, especially on DC. The MSI Gaming series IIRC isn't that expensive and offers decent looks and performance (but iffy BIOSes).

I would stay away from the cheaper ASUS boards (Z97-A, Z97-C, etc.), as I know many friends who have had awful experiences with them. They're just cheap, basically.

Personally I'd go for the MSI Gaming 5/7 because of that matte black PCB.


----------



## wermad

I got the MPower Z87 for $50 last year and sold it for $100 on fleabay. Went with a Sniper 5 Z87, but that had issues with my 4690K. Picked up the Z97 Black G1 from the marketplace; though one of the BIOSes quit on me, it's been great since coming back from RMA. My DC is a decent-clocking chip, tbh.


----------



## Bogusburger

Here is my CPU usage in Task Manager while running Unigine Heaven:

Could it be a BIOS issue?
If I were to upgrade, I was looking at an i5-4690K with an MSI Z97-G45; any thoughts?


----------



## PureBlackFire

is this a good valley score for stock?


----------



## Alex132

Quote:


> Originally Posted by *PureBlackFire*
> 
> is this a good valley score for stock?
> 
> 
> Spoiler: Warning: Spoiler!


R9 295X2 at 1080p?

Also, disable ULPS.


----------



## PureBlackFire

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PureBlackFire*
> 
> is this a good valley score for stock?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> R9 295X2 at 1080p?
> 
> Also, disable ULPS.

Thanks for the reminder. I'm testing the card out for my friend; he's just bought it and is either getting a 144Hz 1440p monitor or a 60Hz IPS 4K monitor in the next few days.


----------



## SLK

Anyone's VRM fan have a little rattle at certain RPMs? Mostly at boot up I hear it rattling a little.


----------



## Alex132

Quote:


> Originally Posted by *PureBlackFire*
> 
> thanks for the reminder. I'm testing the card out for my friend. he's just bought it and is either getting a 144hz 1440p monitor or a 60hz IPS 4k monitor in the next few days.


I see.
It might be best to download 3DMark and compare it there. There are tons of R9 295X2 3DMark results around.
Quote:


> Originally Posted by *SLK*
> 
> Anyone's VRM fan have a little rattle at certain RPMs? Mostly at boot up I hear it rattling a little.


Ball bearing fans can do this, whether or not this is normal for the R9 295X2 I do not know. But I do know my PSU ball-bearing fan has done this for a long time.


----------



## wermad

Quote:


> Originally Posted by *Bogusburger*
> 
> Here is my cpu usage in task manager while running unigine heaven:
> 
> Could it be a Bios issue?
> If I were to upgrade I was looking at an i5 4690k with an MSI 797 -G45, any thoughts?


Try overclocking the CPU. If it's a non-K chip, your multiplier is capped at a specific value (might be 42). You may be able to get it to 4.0-4.2; that's healthy enough for 1080p.

If you decide to upgrade, that's a good combo.
Quote:


> Originally Posted by *Alex132*
> 
> R9 295X2 at 1080p?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also disable ULPS.


For some reason, high-end AMD cards tend to stumble at 1080p. I don't know why, but I've noticed this in reviews. My triple-290 setup ran fine at 1200p, though I didn't run Crysis 3 or other demanding games (the G3258 wouldn't have handled them properly).


----------



## steezebe

Quote:


> Originally Posted by *PureBlackFire*
> 
> is this a good valley score for stock?


I got almost exactly the same values with mine when I tested stock.


----------



## SLK

Yep, mine were at the high end of 4400.


----------



## wermad

Congrats Feyris! We have a new op!


----------



## Mega Man

Maybe now you can be added.


----------



## wermad

Lol, and the recommended PSU list!


----------



## Elmy

The recommended PSU list for 2x 295X2 is two 1500W power supplies... lol


----------



## wermad

Quote:


> Originally Posted by *Elmy*
> 
> PSU recommended list for 2 X 295X2 is 2 1500 watt power supplies... LoL


http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4890#post_23654185

edit: we're trying to embed the sheet in the OP

edit 2: it's been done


----------



## Feyris

Updated the OP with the PSU list and a new member list with an apply link!

I will clean it up more in a little bit; I just wanted to get that done first. I will add legacy-list members back to the current list once I redo the topic.


----------



## wermad

edit: form submitted


----------



## Pandora's Box

Submitted also.


----------



## Feyris

Conversion has~ begun~


----------



## xer0h0ur

Why wipe the existing member list?


----------



## Sgt Bilko

Submitted as well, nice to finally be on the list


----------



## Feyris

Quote:


> Originally Posted by *xer0h0ur*
> 
> Why wipe the existing member list?


I can't access the original document, and it's manual-add, so I have zero control over that Google document. I'm adding everyone from the existing list into the new one, though (it was such a short list anyway).

The new one is set up for automatic submissions instead of having to deal with add requests.


----------



## xer0h0ur

Gotcha


----------



## Feyris

Quote:


> Originally Posted by *xer0h0ur*
> 
> Gotcha


Dont worry we still love you







you'll be back on it in a few hehe

Legacy now all listed


----------



## Alex132

New OP looking good


----------



## kayan

Looks good. When I get my rebuild done, I will be applying for this group. I've got everything, but forgot to order longer screws for my rads. Bah.


----------



## Mega Man

Most Ace Hardware stores have them in stock, FYI


----------



## kayan

Quote:


> Originally Posted by *Mega Man*
> 
> Most ace hardware had then in Stock fyi


Thanks for the tip. You'd think Home Depot or Lowe's would have them too, but noooo. It would be too easy, since one of my places of employ is Home Depot.


----------



## SLK

Does anyone on the stock cooler hear air in the pump at boot-up? On a cold boot, I mean.


----------



## Pandora's Box

For the life of me I can not get trifire working with my 295X2 and this 290X I recently bought. All the cards install just fine along with the drivers but catalyst control center disables the third adapter (290X). When I change the order the cards are installed in so the 290X is the top card and the 295 is the bottom, the driver disables crossfire completely and only the 290X works in games.


----------



## Elmy

With a new OP comes a new video from me 

You might have also seen my computer on the front cover and on pages 44-46 of last month's edition of CPU magazine.


----------



## xer0h0ur

Quote:


> Originally Posted by *Pandora's Box*
> 
> For the life of me I can not get trifire working with my 295X2 and this 290X I recently bought. All the cards install just fine along with the drivers but catalyst control center disables the third adapter (290X). When I change the order the cards are installed in so the 290X is the top card and the 295 is the bottom, the driver disables crossfire completely and only the 290X works in games.


That literally makes no sense to me. I have tri-fired both ways. With the 295X2 as the primary and with the 290X as the primary. I am currently running the 290X as my primary video card. I have not had problems with getting any driver to enable crossfire between them as its always been done automatically upon a driver installation.


----------



## Mega Man

On mobile have to ask does his mobo have pcie dip switches?


----------



## Pandora's Box

Quote:


> Originally Posted by *Mega Man*
> 
> On mobile have to ask does his mobo have pcie dip switches?


nope, no switches.

What's even weirder is that if I connect a second monitor to the second graphics card, CCC changes the adapter from disabled to enabled, but crossfire doesn't change from 2 GPUs (the 295X2) to 3 GPUs. As soon as I unplug the adapter, it switches back to disabled.


----------



## xer0h0ur

And the 295X2 alone is crossfiring showing both GPUs on its own right?


----------



## Pandora's Box

Yes. When I have the 295X2 in the top slot it crossfires just fine with itself; it just doesn't want to talk to the 290X, though. I've tested all the PCI Express ports on the board, and every single one of them works.


----------



## Pandora's Box

I honestly don't think it's the board as I have used it in the past for GTX 970 SLi and 7970 crossfire.


----------



## xer0h0ur

Out of curiosity what method do you use for driver installation or removal? Are you also disabling ULPS?


----------



## Pandora's Box

Quote:


> Originally Posted by *xer0h0ur*
> 
> Out of curiosity what method do you use for driver installation or removal? Are you also disabling ULPS?


I've tried simply uninstalling with the ATI uninstaller, tried using DDU, tried just installing over top of the current drivers, and even tried a fresh install of Windows. I have not tried disabling ULPS, as the second card's fan doesn't shut off, so I don't think it's kicking in.


----------



## xer0h0ur

I don't trust leaving ULPS on to be frank. It just seems like a broken feature.

I am at a loss, when I warned you about shenanigans with tri-fire I certainly wasn't talking about this. I expected it to be software issues only.


----------



## F4ze0ne

Is the .Net framework install ok? Maybe try installing it again?

Just a thought because mine was corrupted once and catalyst was totally broken. Crossfire not working, missing GPUs, Catalyst failed to start errors, etc...


----------



## xer0h0ur

Oh yeah, have you also verified you're using the most up to date motherboard BIOS?


----------



## Pandora's Box

I'm at a loss at this point. Think I am just going to return the 290X and stick with the 295X2. Like I said before, I've run 970 SLI (tri-SLI actually) with no issue, so I don't think it's the motherboard. The only thing I can think of is that this board uses a PLX chip to enable more PCI-E lane bandwidth; perhaps that's conflicting with the chip on the 295X2? No idea, but at this point I'm done messing with it. Returning the 290X to Amazon and just going to wait for the 390 series.

And yes, .NET Framework is updated and installed correctly.

Edit: yes latest mobo bios installed.


----------



## Alex132

Asus WS mobo support (BIOS) is awful, I wouldn't be surprised if it was that to be honest.


----------



## mojobear

Hey guys,

Not an R9 295x2 owner, but I do have 4 way R9 290s, so we are almost cousins









Was hoping to get some input from those with 2x R9 295X2 playing with Mantle, namely BF4 and Dragon Age: Inquisition, neither of which works in Mantle for me. DX11 works fine, but I miss my buttery smoothness.









I have tried driver reinstalls with DDU and upgrading from the 14.12 to the 15.3 drivers; no luck with either.

Specifically: with BF4 I'm stuck at logging in, and with Dragon Age: Inquisition I can't even get into a load screen.

Anyone else with this issue? Any workarounds? Any help/suggestions/comments would be appreciated.

Much thanks!


----------



## xer0h0ur

To be honest, the only game I played at all with Mantle was Thief, and I experienced zero performance advantage over DX.


----------



## F4ze0ne

I'm using Mantle on both of those games and I haven't run into any loading issues.

Have you tried testing different configurations with your cards like single, dual, triple?


----------



## xer0h0ur

Literally the only other time I have attempted to use Mantle was the Star Swarm benchmark, which crashes immediately and doesn't work at all for me in Mantle. Then the API benchmark within 3DMark also acts oddly in Mantle: the framerate seemingly slows to a crawl on the final test as soon as the camera tries to pan to the left.


----------



## mojobear

Quote:


> Originally Posted by *F4ze0ne*
> 
> I'm using Mantle on both of those games and I haven't run into any loading issues.
> 
> Have you tried testing different configurations with your cards like single, dual, triple?


Ah, I also forgot to say I'm running 7680x1440 at 80Hz.

Maybe it's a combo of quadfire + Eyefinity?


----------



## mojobear

Quote:


> Originally Posted by *xer0h0ur*
> 
> Literally the only other time I have attempted to use Mantle has been the Starswarm benchmark which crashes immediately and doesn't work at all for me in Mantle. Then the API benchmark within 3dmark also acts oddly in Mantle. The framerate seemingly slows to crawl on the final test as soon as the camera tries to pan to the left.


Blah... that sucks. I got Mantle to work in December, but recently it's been a pile of junk.


----------



## mojobear

Quote:


> Originally Posted by *xer0h0ur*
> 
> To be honest the only game I played at all with Mantle was Thief and I experience zero performance advantage over DX.


Yeah, agreed... Thief's Mantle works for me too. Maybe it's just something with the Frostbite engine (DA:I and BF4) and Mantle that is a bit wonky currently. Also, we have the rarest setups here, lol: 4-way R9 290s + 1440p Eyefinity... sigh.


----------



## xer0h0ur

Yeah I don't even own BF4 or DA:I so I never tried Mantle with it.


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Was hoping to get some input on those with 2 x R9 295x2 playing with mantle...namely BF4 and Dragons age inquisition....neither of which work in mantle for me. Dx11 works fine but I miss my buttery smoothness


The latest build of mantle is bugged in quadfire. BF4 and DA:I both use the latest build. PvZ however doesn't and it works fine in quad mantle.


----------



## SLK

Bleh, may have to RMA. The pumps seem to take in air for the first 5 minutes of operation on a cold boot. I hear water sloshing around, then it goes silent 5 minutes later. I wonder if it may be because the rad is on the bottom?


----------



## Mega Man

Yes, it is. The top of the rad acts as the "reservoir", which needs to be above the pumps for them to operate properly. When you shut off your system in any way that stops the pumps, the air travels up toward the pump.


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> The latest build of mantle is bugged in quadfire. BF4 and DA:I both use the latest build. PvZ however doesn't and it works fine in quad mantle.


Bleh, good to know. It's rare to find other people with the same issue; you need these forums to get the required info.

Hopefully Frostbite and AMD fix this issue. DA:I runs so butter-smooth with Mantle.


----------



## Pandora's Box

Woot! Got my issue fixed. Took a break from messing with this, as I was getting annoyed and frustrated with it. Decided, for ****s and giggles, to reset the motherboard BIOS. I'd been avoiding this because I didn't think it would do anything (both cards were being recognized by Windows; just crossfire wasn't working), and I didn't want to have to fiddle with overclocking the CPU again. What do you know? It worked.

Results:










Under load, all cards go to PCI-E 3.0 x16 (minus the second GPU on the 295X2, which is at x8).


----------



## xer0h0ur

I had a feeling it may have been the BIOS. Good to hear you're up and running fine now. Now let us know if you notice your GPU usage drop in games while tri-fired compared to crossfire. If not then I need to take a hard look again at my rig.


----------



## Mega Man

what do you guys think ?

http://www.3dmark.com/3dm/6528362


----------



## Pandora's Box

So far GPU Usage has been consistent in the games I've tested.

Tomb Raider on Ultimate settings at 3440x1440:










Shadow of Mordor Ultra preset 3440x1440










Metro Last Light Redux 3440x1440










Let me know if you want me to run any of them at a different resolution.


----------



## xer0h0ur

I don't have the overclocking headroom yet to even try to push my cards. The best I got tri-fired with mild overclocks is 7180, on the Omega driver I believe. Couldn't be bothered to bench this beta









Tomorrow most of my loop revamp parts arrive. I will only be waiting on my EK reservoir/pump, whenever this amazon.com seller decides to get off his @ss and ship it









Edit: Nope, that wasn't the Omega driver. I wonder if I ever even benched the Omega using Firestrike Ultra.


----------



## F4ze0ne

*AMD Radeon R9 295X2 versus GeForce GTX Titan X SLI at 6480x3840*

Quote:


> We had some reader feedback that wanted to see how the dual-GPU Radeon R9 295X2 would go in the ring with the GeForce GTX 980s in SLI and the Titan X in SLI at 6480x3840, and we're glad to report that for a card that is close to a year old, it did surprisingly well.


http://www.tweaktown.com/tweakipedia/80/amd-radeon-r9-295x2-versus-geforce-gtx-titan-sli-6480x3840/


----------



## Pandora's Box

Quote:


> Originally Posted by *F4ze0ne*
> 
> *AMD Radeon R9 295X2 versus GeForce GTX Titan X SLI at 6480x3840*
> http://www.tweaktown.com/tweakipedia/80/amd-radeon-r9-295x2-versus-geforce-gtx-titan-sli-6480x3840/


At that high of a resolution it's running out of VRAM. I would bet that is the only thing holding the card back. Pretty sad that Nvidia still can't beat it, the card has been out for a year now.


----------



## Mega Man

Figures, but come on: Titan X SLI but no 295X2 CFX? >.> That's shady, IMO.


----------



## xer0h0ur

I would lay down the duckets for a 16GB HBM 395X2. Seriously. I would.


----------



## F4ze0ne

Quote:


> Originally Posted by *Mega Man*
> 
> figures, but come on titan x sli but no 295x2 CFX >.> thats shady imo


^ This

Price to performance would be nice.

And the 295X2 hasn't been $999 for quite a while. No idea where they got that figure from...


----------



## Mega Man

Methinks this card will be my last dual-GPU card, unless I am getting one for an HTPC (the kick-bum kind). I'll stick with good ole quadfire. But I have to admit it is fun, although I want to test the Ares III.


----------



## xer0h0ur

It's been at $700-750 for weeks now. Dipped under $700 for a little bit too.


----------



## xer0h0ur

Quote:


> Originally Posted by *Mega Man*
> 
> me thinks this card will be my last dual gpu card, unless i am getting one for htpc ( kick bum kind ) ill stick with good ole quadfire, but i have to admit it is fun, although i want to test the ares III


I am a genuine sucker for dual-GPU video cards. Always have been, since the 3dfx Voodoo 5 5500 AGP card I put into the first rig I built. Back in the day of the Super Nintendo cartridge-style AMD Athlon processors, lol. 800MHz, wut wut!


----------



## Mega Man

Have to post this here; IDK, I think it's funny

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125727


----------



## xer0h0ur

"Free Acer 27" G-Sync Monitor with purchase, limited offer"









Gotta make it hurt a little less to pay $3K for that abomination. LOLOLOL. Those things aren't even full-coverage units, right? Only cooling the GPUs, I presume?


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am a genuine sucker for dual GPU video cards. Always have been since the 3DFX Voodoo 5 5500 AGP card I put into my first rig I built. Back in the day of the Super Nintendo cartridge style AMD Athlon processors lol. 800MHz wut wut!


I think I still have one of those lying around somewhere... 3dfx will always be my favorite brand, even though they went extinct and Nvidia bought the name, I guess.


----------



## Mega Man

" Dear Nvidia/AMD

please adopt this as the new standard for GPU i/o

it not only allows for single-slot solutions when possible (*watercooling*), but I mean, tell me this does not look better than the hodgepodge junk on the market now

yours truly....
everyone ! "

Quote:


> Originally Posted by *xer0h0ur*
> 
> "Free Acer 27" G-Sync Monitor with purchase, limited offer"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gotta make it hurt a little less to pay 3K for that abomination. LOLOLOL. Those things aren't even full coverage units right? Only cooling the GPUs I presume?


Correct; like the 295X2, pretty sure they are asecrap (Asetek).

Wish people would let the patent trolls go out of business and only use CoolIT

:/


----------



## xer0h0ur

I second that motion. If the 395X2 adopts that output setup, it's guaranteed to go into my rig, regardless of how much VRAM it gets.


----------



## Mega Man

Yeah, I don't get it. They have it on the 7750s, 7770s, and 7850s, but never on the 7970/280X/290/290X... why, why...


----------



## DividebyZERO

Quote:


> Originally Posted by *Mega Man*
> 
> yea, i dont get it, they have it on 7750s,7770s,7850s, but never on 7970/280x//290/290x/.... why why............


Mega, that Crysis 3 you requested: I have it. Forgive my total noob self with this camera; right now I'm trying to learn how to use it. Crysis 3 has major performance issues for me, and it's not restricted to resolution; I think the game has a bug or something. Either way, I did a short 4K video test for you. It's probably going to take YouTube a while to process it. I will do better videos as I get more into this and learn how to do things like get rid of the camera fisheye, etc. The 4K quality in low light is very bad, so either way, it's the best I can do for now.

Crysis 3, triple 4K. The settings are shown around 4 minutes in; I even tested Very High, without AA of course. The indoor lighting is bad, so you might want to skip ahead until outside.

It's going to take a while before the HD/4K versions become available.




Lords of the Fallen, triple-4K test video. I accidentally ran this one on tri-fire and didn't realize until later; however, it's just a test video. I also forgot to show the resolution in it, but it's definitely 4K x3. Also, I suck at this game, as the gameplay shows, lol.


----------



## ViRuS2k

Quote:


> Originally Posted by *Feyris*
> 
> Updated OP with PSU List & new member list with apply link!
> 
> I will be cleaning it up more in a little bit just wanted to get that done first, I will be adding legacy list members back to current list once I re-do topic.


Can you add my PSU to the list please

It's a Super Flower, a 2x R9 295X2 quadfire-certified beast: 2000W 8Pack edition, 80+ Platinum certified, model SF-2000F14HP. *UK*


----------



## Mega Man

As soon as I can find a retailer OK with shipping to the US (one that is NOT the crappy overclockers.co.uk, as they rip you off), and I have the funds (currently saving for meh baby), I will be getting 1-2 of the 2kW units.

This is kinda epic


----------



## Bludge

Pretty much the last place in Oz selling the XFX 295X2 cards just removed them from inventory.


----------



## wermad

It's been under $700 for a bit, and even the Devil went under $600. Not sure what the reviewer is thinking, but on most other sites the author quotes retailers' prices at the time of writing, or updates with pricing at the time of publication. Seems a bit biased, IMHO...

Edit: looks like no 3D fanboy competition this year, lads


----------



## Pandora's Box

That could have been a good review... if done by any site other than TweakTown. Worst reviews ever: running Windows 7, and they don't list what drivers were used...


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> " Dear Nvidia/AMD
> 
> please adopt this as the new standard for GPU i/o
> 
> it not only allows for single slot solutions when possible * watercooling* but i mean, tell me this does not look better then the hodge podge junk on the market now
> 
> yours truly....
> everyone ! "


Include a mDP -> DP cable in the box too.

No, I don't want a VGA adapter or a molex-to-PCIE one. Shocking, I know.


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> yea, i dont get it, they have it on 7750s,7770s,7850s, but never on 7970/280x//290/290x/.... why why............


Those 6-miniDP cards are not for enthusiasts but for day traders and their ilk. It won't happen until DP becomes the de facto standard, which isn't looking likely for another couple of years, though it's inevitable by the way these stupid FreeSync and gstring threads get populated by the shills. You'd think that syncing panels is the only thing that matters, lol.


----------



## Alex132

false
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> yea, i dont get it, they have it on 7750s,7770s,7850s, but never on 7970/280x//290/290x/.... why why............
> 
> 
> 
> Those 6 miniDP cards are not for enthusiasts but day traders and the ilk. It won't happen until DP becomes the defacto standard which isn't looking likely for another couple years, though it's inevitable by the way these stupid freesynch and gstring threads get populated by the shills. You'd think that synching panels is the only thing that matters lol.
Click to expand...

mDP only on cards + adapters.

ezpz


----------



## ViRuS2k

We need at least one HDMI 2.0-spec output on the card as well, for people with Samsung 4K TVs :/


----------



## Mega Man

Or the TV could just give up and accept DP, as it should.

Or... and get this: screw the proprietary BS and let's just go to Cat6/Cat6e.

More than enough bandwidth...

Either way, no more BS. DP or Cat6? Yes, please.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> or the tv could just give up and accept DP as they should
> 
> or... and get this, screw the proprietary bs and lets just go to cat6/cat6e,
> 
> *more then enough bandwidth ....
> *
> either way,. no more BS DP or Cat6 yes please


lol, no. Current DP has ~32Gb/s of bandwidth.

Cat6 just has 1000Mb/s of bandwidth.

We need fiber connections, but DP is fine for now.


----------



## Mega Man

Not for TV ( cat 6 comment was actually meant for tvs)


----------



## wermad

Is PI including 1.3?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Is PI including 1.3?


wut
Quote:


> Originally Posted by *Mega Man*
> 
> Not for TV ( cat 6 comment was actually meant for tvs)


still too slow


----------



## Mega Man

Pretty sure 4K at 30fps over Cat6e (which is 10Gb/s, btw) is fine. By my math (going from numbers in my head and from memory), at 30fps you would need approx 8Gb/s, assuming 32-bit color, which Cat6e exceeds at 10Gb/s.
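For what it's worth, the back-of-the-envelope math checks out. A quick sketch (assumes uncompressed frames at 32 bits per pixel and ignores blanking/encoding overhead, which a real link would add on top):

```python
def video_bandwidth_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed video bandwidth in Gb/s (no blanking or encoding overhead)."""
    return width * height * bits_per_pixel * fps / 1e9

# 4K (3840x2160) at 32 bpp:
print(f"30 fps: {video_bandwidth_gbps(3840, 2160, 32, 30):.2f} Gb/s")  # ~7.96 Gb/s
print(f"60 fps: {video_bandwidth_gbps(3840, 2160, 32, 60):.2f} Gb/s")  # ~15.93 Gb/s
```

So 4K30 squeaks under a 10Gb/s link, while 4K60 would need DP-class bandwidth.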


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> wut


PI = Pirate Islands, the next-gen AMD cards.

1.3 = DisplayPort 1.3, which has almost double the bandwidth of 1.2.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Pretty sure 30 fps at 4k cat6e ( Which is10gb/sec btw) Is fine by my math at 30 fps ( going from numbers all in my head and from memory ) you would need approx 8gb/s. Which cat6e exceeds at 10gb/sec ( assuming 32 bit color )


Yeah, at 10Gbit/s you'll be fine with 32-bit color (8-bit depth per channel) at 4K. Not at 1Gbit/s.

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> wut
> 
> 
> 
> PI = pirates island...next Gen amd cards.
> 
> 1.3 - displayport 1.3 which has almost double the bandwidth of 1.2.
Click to expand...

Oh, yes it does.


----------



## babeasaurousrex

Hi everyone. I was speaking with some of you the other day about my GPU freezing up, and you guys recommended I look into buying a new PSU, one with a max output of at least 70A. So I have done so, and I am still having the same issue.







Any other ideas? Or is there a way to be sure that it is "for sure" my GPU?


----------



## Alex132

Quote:


> Originally Posted by *babeasaurousrex*
> 
> hi every one, so i was speaking with some of you the other day about my GPU freezing up and you guys recommended i look into buying a new PSU one with a max output of at least 70a. soooo i have done so and am still having the same issue
> 
> 
> 
> 
> 
> 
> 
> any other idea? or is there a way to be sure that it is "for sure" my GPU?


What PSU do you have?

How is your "GPU freezing up"? Does Windows freeze and stop responding till a reboot?


----------



## babeasaurousrex

I bought this: an EVGA SuperNOVA 850 B2 power supply, 80 PLUS Bronze certified, 850W ATX.
And it's a full system freeze, totally unresponsive; I have to hard reboot.


----------



## Alex132

Quote:


> Originally Posted by *babeasaurousrex*
> 
> i bought this EVGA Supernova 850 B2 Power Supply 80PLUS Bronze Certified 850W ATX Power Supply
> and its a full system freeze totally unresponsive, i have to hard reboot


Not a PSU issue. I have PSU issues, and they cause an instant shutdown.

Sounds like bad software; a clean OS install would be the best way to make 100% sure no crap is causing this. I know it's a pain, but it's thorough.


----------



## wermad

Either way, her old unit was putting out 64 amps, which is not recommended for the 295X2. Might as well prevent now rather than fix later.

@babe, is your CPU stock or OC'd?
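For anyone wondering where the 70A guideline lands in watts, here's a rough sketch; the load figures below are illustrative ballparks, not official specs:

```python
def rail_watts(amps, volts=12.0):
    # Power available on a single 12V rail.
    return amps * volts

recommended = rail_watts(70)    # the 70A single-rail guideline -> 840 W
old_rail    = rail_watts(64.5)  # the old unit's 64.5A rail    -> 774 W

# Ballpark load estimates (assumptions for illustration only):
card_estimate = 500  # W, rough draw of a stock 295X2 under load
rest_of_rig   = 250  # W, rough allowance for CPU, fans, pumps, drives

print(recommended, old_rail, card_estimate + rest_of_rig)
```

With those ballparks, a 64.5A rail leaves very little headroom over the estimated load, which is why the 70A recommendation exists.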


----------



## babeasaurousrex

Quote:


> Originally Posted by *wermad*
> 
> Got it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, didn't see anything online, but the obvious is there on the pic you provided. Its got 64.5amps for your the single rail. Tbh, that's less then what the recommended is 70amps on a single rail for a single card. Also, are you using any adapters on the psu? I know these can cause issues.


----------



## babeasaurousrex

idk i ordered the whole build from cyber power pc i think they may have OC'ed


----------



## wermad

Do you have oc profiles in the bios? Try going back to stock on the cpu if possible. Also, make sure the card is stock for now (tried the 2nd bios on the card?)


----------



## babeasaurousrex

Quote:


> Originally Posted by *wermad*
> 
> Do you have oc profiles in the bios? Try going back to stock on the cpu if possible. Also, make sure the card is on stock (tried the 2nd bios on the card?)


ok... id love to do that







but i dont know how to do that


----------



## wermad

Hmmmm....well, before we continue, does your pc have any warranty?


----------



## babeasaurousrex

Probably, but I get the feeling they don't really care, lol. I contacted them about a week ago and still haven't heard anything back. Plus, I already put the new power supply in, so I think that would have voided it anyway.


----------



## babeasaurousrex

Also, it only freezes during graphically demanding gaming; I think that's why they were leaning toward the GPU.


----------



## wermad

How's benching?


----------



## babeasaurousrex

I haven't checked. What's good to use?


----------



## wermad

3DMark Fire Strike


----------



## WMW

Just picked this up. Hope I got a good price.

EBay


----------



## electro2u

Wish you would've bought mine. I'll do $560 shipped for anyone in this thread:
http://www.overclock.net/t/1548554/fs-msi-amd-r9-295x2/0_30#post_23762590
It's up in the marketplace.


----------



## WMW

I don't have any rep here, and I don't have eBay protection. Nothing personal against you.


----------



## electro2u

Quote:


> Originally Posted by *WMW*
> 
> I don't have any rep here, and I don't have eBay protection. Nothing personal against you.


You don't need rep to buy things here, and PayPal protection is pretty analogous to eBay protection. But my point was that your price seems reasonable.


----------



## tsm106

Quote:


> Originally Posted by *WMW*
> 
> I don't have any rep here, and I don't have eBay protection. Nothing personal against you.


I'd be more concerned about selling to someone new than buying from him.


----------



## remnant

Quote:


> Originally Posted by *electro2u*
> 
> Wish you woulda bought mine. Ill do 560 shipped for anyone in this thread:
> http://www.overclock.net/t/1548554/fs-msi-amd-r9-295x2/0_30#post_23762590
> up in marketplace.


You tempt me sir


----------



## xer0h0ur

Yeah, no offense, but if I were to buy it, I would have far more trust in electro2u than some rando on eBay. That is my opinion, despite the fact that I have sold thousands of shoes on eBay and continue to sell my stores' merchandise there.


----------



## doctakedooty

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah, no offense but if I were to buy it I would have far more trust in electro2u than some rando on eBay. That is my opinion despite the fact I have sold thousands of shoes on eBay and continue to sell my stores' merchandise on eBay.


I agree; I'd rather buy from a member with a good selling rep on here than from someone on eBay. You never know if the card is refurbished, etc., or what has really happened to it, whereas a member here would tell you.


----------



## wermad

Like me







........


----------



## Alex132

Having been scammed on OCN and eBay - it's hard to say









But yeah, trusted members on OCN = much better chance than eBay.


----------



## WMW

Okay you guys are right. I sent a message to the seller to try to cancel the order. /secondthoughts


----------



## electro2u

Do what? The price is fine, and if there is a problem you hit the send-it-back button... why give the seller a hard time?


----------



## wermad

Yeah, eBay will not side with you. Having "buyer protection" doesn't mean having a "buyer's remorse bail-out button". If the item is not as described, faulty, or wrong in any functional way, you may have to wait a bit, but there's a good chance they will side with you. Either way, keep in mind the seller is losing out on ~12% in eBay/PayPal fees, whereas OCN members just get hit by the ~2-3% PayPal fee (gift payments are not permitted due to having no PayPal protection). Keep it, IMHO.

I have seen cases where members holding mod status or high rep have failed to live up to their part of the bargain. Unfortunately, it leaves a bad taste, and they (as well as others) leave and avoid OCN. It's happened.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Yeah, ebay will not side with you. Having "buyer protection" doesn't mean "buyer's remorse bail out button". If the item is not as described, faulty, or wrong in any functional way, you may have a to wait a bit but there's a good chance they will side with you. Either way, keep in mind the seller is loosing out on 12% in ebay/paypal fee's where ocn members just get hit by the paypal ~2-3% fee (gift payments are not permitted due to no paypal protection). Keep it imho.
> 
> I have seen cases where members holding mod status or high reps have failed to live their part of the bargain. Unfortunately, it leaves a bad taste in many and they (as well as others) leave and avoid ocn. Its happened.


I got burned by someone who was a decently known member, albeit with less than 250 rep and less than 30 trader rating.
I have also heard of others being burned by people with over 50 trader rating.

IMO sometimes it is all down to bad-luck, so I don't hold OCN accountable as much as that specific person.


----------



## Ragingun

So I just tried to play FC4, which I haven't in a while. It would seem Xfire is no longer working with the 15.3 beta drivers.









I don't have any profiles set up, so it should default to Xfire, but it's not. Just plain frustrating. AC:U is doing the same thing. Both these games were up and running well before.


----------



## xer0h0ur

So use the other driver that was working for you before, if that is the case. Or did Ubi**** patch the game broken, as they have often done to me with AC:U?


----------



## veaseomat

Hey guys, do any of you have experience testing FreeSync with a 295X2? I have 295X2s in crossfire, and I just sent my FreeSync monitor to my base in Korea, where I will be when I get back from leave in a few days. From what I have read, it works best with single GPUs right now? I can't find much information on FreeSync yet because the monitors aren't available in the States yet. Any input is greatly appreciated.


----------



## joeh4384

I don't think they have crossfire drivers for FreeSync yet. Part of me is really tempted to pick up that 144Hz 1440p IPS Asus FreeSync monitor coming out.


----------



## wermad

What's up with Far Cry?!?! Is it that crappy on AMD? I have #3 but haven't played it yet. I tend to wait till games hit the bargain bin ($15 or under).


----------



## xer0h0ur

There is no Crossfire support for Freesync in the 15.3 Beta. That becomes a bit annoying for 295X2 users since that would mean you would have to manually create application profiles for every game to disable Crossfire, since AMD never gave 295X2 users the stupid simplicity of a check box on/off setting for the 295X2's Crossfire.

Or you can just wait till the WHQL driver comes out which presumably will give Crossfire support to Freesync along with other goodies we are expecting.


----------



## Mega Man

Or... or you could just not use it, as honestly, like G-Sync, it is just "fluff"


----------



## veaseomat

Quote:


> Originally Posted by *Mega Man*
> 
> or ..... or you could just not use it, as honestly like gsync it is just "fluff"


Sounds like I'll just be waiting for a driver or the 390x to release...


----------



## Mega Man

fyi, when i say "fluff" i mean "junk"

in other news
Quote:


> Originally Posted by *Mega Man*
> 
> anyone else find this ironic ??
> 
> http://www.ign.com/articles/2014/11/06/the-witcher-3-will-receive-16-dlc-packages-free-across-all-platforms
> 
> http://www.gog.com/game/the_witcher_3_wild_hunt_expansion_pass?utm_source=newsletter&utm_medium=email&utm_content=game_subject&utm_campaign=w3Expansion


----------



## remnant

Quote:


> Originally Posted by *Mega Man*
> 
> fyi, when i say "fluff" i mean "junk"
> 
> in other news


"Thank you" - "give us more money please"


----------



## cmoney408

can someone link an installation guide for the 295x2. i know the RAD should be above the the card, which it is, but i am unsure if the cables should be on top of the rad or bottom, or if it even matters.


----------



## Mega Man

tubes need to be on bottom or side


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> tubes need to be on bottom or side


^ agreed, tubes on the bottom gave me the best temps.


----------



## cmoney408

thanks for the info guys. so rad above card with tubes on bottom!


----------



## mah111

Quote:


> Originally Posted by *cmoney408*
> 
> thanks for the info guys. so rad above card with tubes on bottom!


Yup, same case for me. I got a satisfying result placing the rad on the front panel with the end-tank (the part where the tubes enter the radiator) at the bottom, blowing into the case.

Temps are about 6-7 degrees Celsius lower than with the radiator on top blowing hot air out.
From my experience, an extra fan on the bottom of the case blowing at the card, to get some fresh air for VRM cooling, isn't a bad idea either.


----------



## Medusa666

Just recieved my new R9 295X2 yesterday, it is now installed and working well.

For 700 USD it is a bargain!


----------



## Gdourado

Hello, how are you?
I am wondering about the correct installation of the 295x2.

From what AMD says, the radiator must be placed higher than the card and with the pipes facing down.

Now, my case is a HAF XB.
It has a rear 120mm fan mount but it is a case modeled like a test bench where the motherboard sits horizontal.

In this case, to install a 295x2, the radiator would be at the same height as the card and the radiator would have to be installed with the pipes facing the side.

Is this a big issue?
From what I read, the main problem is that it can cause air to enter the cooling circuit.

Any thoughts on this?

Cheers!


----------



## Medusa666

Quote:


> Originally Posted by *Gdourado*
> 
> Hello, how are you?
> I am wondering about the correct installation of the 295x2.
> 
> From what AMD says, the radiator must be placed higher than the card and with the pipes facing down.
> 
> Now, my case is a HAF XB.
> It has a rear 120mm fan mount but it is a case modeled like a test bench where the motherboard sits horizontal.
> 
> In this case, to install a 295x2, the radiator would be at the same height as the card and the radiator would have to be installed with the pipes facing the side.
> 
> Is this a big issue?
> From what I read, the main problem is that it can cause air to enter the cooling circuit.
> 
> Any thoughts on this?
> 
> Cheers!


Logically it is not a problem. Positioning above the card would be optimal, but I doubt that mounting it level with the card poses any real danger to the function of the product. A huge selling point of this card has been the potential for an SFF gaming system, and many of the chassis used mount the components horizontally.

It should be OK if you mount it anywhere in that case; I have had one, so I know how it is laid out.


----------



## Ragingun

Quote:


> Originally Posted by *xer0h0ur*
> 
> So use the other driver then that was working for you before if that is the case. Or did Ubi**** patch the game broken? As they often have done to me with AC:U.


I'm using the same driver. Either UBI***** did something or something random happened and there's no explanation for it.


----------



## Gdourado

Quote:


> Originally Posted by *Medusa666*
> 
> Logically it is not a problem. Given that a positioning above the card would be optimal, I doubt that mounting it in line with the card itself would pose any real danger to the function of the product. A huge selling point of this card has been the potential for a SFF gaming system, and many of the chassis used mounts the components horisontal.
> 
> It should be OK if you mount it anywhere in that case, I have had it so I know how it is outlined.


Thank you for helping out.
I just don't know. I figure that if there weren't a real possibility of issues, AMD wouldn't give such recommendations, and the GPU is too expensive to risk anything going wrong with it.

Cheers


----------



## xer0h0ur

Quote:


> Originally Posted by *Ragingun*
> 
> I'm using the same driver. Either UBI***** did something or something random happened and there's no explanation for it.


I was so pissed off the last time I installed the game after the 15.3 Beta was released. The stupid game now crashes immediately every time I open it if the Afterburner OSD is active. Or it may have been if I even had Afterburner open at all. I don't remember exactly anymore. Either way I punted that game again. I am literally not even going to bother with it for a long time. With GTA V coming out and the rest of the games I haven't played I have more than enough alternate entertainment that isn't a complete piece of poop.


----------



## chrisman2731

hello All! I was hoping for some assistance or possibly just confirmation of an issue with this video card.

Setup
1600Watt EVGA Plat power supply
Core i7 5960X
12GB DDR4 - Corsair Dominator
2 X 512GB Samsung EVO Pro SSD Raid 0
Asus 4K Monitor
2 X 295X2 (quad Crossfire)
Windows 8.1 64bit
Tried 14.12 and 15.3 Beta drivers

I mostly play Call of Duty Advanced Warfare - Exo Zombies

I have read back to November on this thread and applied a couple of fixes including one that allows the use of all of my video memory by modifying config_mp. I do have shadows disabled, I tried setting up a custom crossfire profile, I reloaded to Windows 7 64bit and back to Windows 8.1 64bit

My current issue is as follows
1: I have to lower all my settings to medium-low
2: The game has random very bad lag spikes for 4-6 seconds every 10mins or so
3: It takes me about 5 minutes to load in game, normally by round 4 on a fresh game
4: I get flickering and certain guns when upgraded to thermal, cause my video cards to display nothing but white and are unusable (started with a cod patch only days ago)

I have tried one video card at a time, removing the other one from the system and it had no real difference except load times are much much better - performance still sucks. I am currently running the Beta driver from 3/20. I have run 48 hours of memtest without issue, I have run intel burn in, I have run a couple of video tests like 3dmark and fur. All work fine and perform as expected.

Anyone know how to get this working better? Is this just the usual quad crossfire, amd sucks at drivers issue?


----------



## xer0h0ur

Have you attempted using the Omega 14.12 instead of the 15.3 Beta? I don't play that game so I can't give game specific suggestions.


----------



## SLK

Quote:


> Originally Posted by *Mega Man*
> 
> yes it is, the top of the rad is the "reservoir", which needs to be above the pumps to operate properly; when you shut your system down in any way that stops the pumps, the air travels up toward the pump,


Moved it to the rear exhaust above and still does it. Tubes are facing down on the reservoir. Slurps a little when turning on then goes away.

Temps are good, no coil whine, card runs great. It's a horrible Visiontek Brand so RMA might be painful.


----------



## Mega Man

Visiontek is actually a very reputable brand. I wonder if you just hear "water" noise and assume it is bad?


----------



## SLK

My H100i doesn't make water noise at start up.


----------



## TooManyAlpacas

My system makes water noises on start up and the temps are really good


----------



## Mega Man

( please note the following is not an insult to you or your rig- but to asetroll {asetech})

Yea well 1 junky pump vs 2 junky pumps.


----------



## SLK

Then I guess it is normal. Good to hear. My H100i is made by Coolit but I am sure the pumps aren't the greatest either.


----------



## Alex132

If it isn't a D5 or DDC, it's a junk pump.


----------



## Gdourado

Quote:


> Originally Posted by *SLK*
> 
> Moved it to the rear exhaust above and still does it. Tubes are facing down on the reservoir. Slurps a little when turning on then goes away.
> 
> Temps are good, no coil whine, card runs great. It's a horrible Visiontek Brand so RMA might be painful.


This is what is putting me off with the HAF XB.
Saw this review:
http://www.pcgameware.co.uk/reviews/graphics-cards/msi-radeon-r9-295-x2-graphics-card-review/

Seems to support my fears:


Spoiler: Warning: Spoiler!

The images above show the radiator in what is basically an invalid location, as (a) the radiator is NOT above the card (as per the install instructions (we all read those, right!?)) and (b) the inlet and outlet on the radiator end-tank are on the side. For optimum water flow (and to avoid picking up any air) the radiator should be mounted with the inlet and outlet at the base. This allows any air to be collected in the upper end-tank, avoiding any air in the system loop. This is strangely not mentioned in the install instructions, BUT IT IS VERY IMPORTANT!

Due to the design of our test case (Cooler Master HAF XB), the optimal setup was not possible, so a little imagination (ok, so not much but…) was required, see images below…



Cheers!


----------



## Gdourado

On another issue, any information about the 295x2 and the upcoming directx12?
Will dx12 be able to use the whole 8gb of memory from the 295x2?

I ask because I am currently debating to either get a 295x2 or save 120 euros and get a 980 G1 and also save myself from crossfire and multi GPU issues.


----------



## cmoney408

thats what i am hoping for. i bought the card already with the hopes it will only become more powerful!!!!!


----------



## Alex132

Quote:


> Originally Posted by *Gdourado*
> 
> On another issue, any information about the 295x2 and the upcoming directx12?
> Will dx12 be able to use the whole 8gb of memory from the 295x2?
> 
> I ask because I am currently debating to either get a 295x2 or save 120 euros and get a 980 G1 and also save myself from crossfire and multi GPU issues.


Maxwell is a joke.

Hawaii is a joke too.

What is your current GPU?


----------



## joeh4384

People are too caught up in the utilization of 8gb of ram via SFR in DX12 etc. I just hope future APIs and game engines support multi-gpu better.


----------



## Gdourado

Quote:


> Originally Posted by *Alex132*
> 
> Maxwell is a joke.
> 
> Hawaii is a joke too.
> 
> What is your current GPU?


Currently no dedicated GPU.
Just using the HD4600 for windows and Lightroom work. No gaming. Want to game again!


----------



## Alex132

Quote:


> Originally Posted by *Gdourado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Maxwell is a joke.
> 
> Hawaii is a joke too.
> 
> What is your current GPU?
> 
> 
> 
> Currently no dedicated GPU.
> Just using the HD4600 for windows and Lightroom work. No gaming. Want to game again!
Click to expand...

Price bracket, PSU, CPU, case?


----------



## Gdourado

Quote:


> Originally Posted by *Alex132*
> 
> Price bracket, PSU, CPU, case?


This is the rest of my system:
Case: Coolermaster HAF XB Evo
PSU: EVGA 1000G2
MB: Asrock Z97 Fatal1ty Professional
CPU: i5 4670K @ 4.8ghz
Memory: 4x 8gb 2400 CL10
SSD: Samsung Evo 840
Cooler: Silverstone HE01
Case Fans: 2x Silverstone AP141 as front intake

Price I'm not sure about; I don't know if it's worth spending much now, with new GPUs probably coming in the summer.

I also have another option.
I saw a new 290X by Asus. The price is 220 euros, which is really nice here in Europe.
The downside is that it is the reference cooler version.
I don't know if the stock cooler is that big of an issue, but for 220 I could just buy it to play until the summer, and then see if it's worth upgrading.

My monitor is an Asus PB278Q.
1440P 60hz.
When more options are available, I want to go 1440p 144hz variable refresh. Either gsync or freesync.

Cheers!


----------



## Alex132

Quote:


> Originally Posted by *Gdourado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Price bracket, PSU, CPU, case?
> 
> 
> 
> This is the rest of my system:
> Case: Coolermaster HAF XB Evo
> PSU: EVGA 1000G2
> MB: Asrock Z97 Fatal1ty Professional
> CPU: i5 4670K @ 4.8ghz
> Memory: 4x 8gb 2400 CL10
> SSD: Samsung Evo 840
> Cooler: Silverstone HE01
> Case Fans: 2x Silverstone AP141 as front intake
> 
> Price i don't know. If it's worth to spend much now with probably new gpus coming in the summer.
> 
> I also have another option.
> I saw a new 290X by Asus. The price is 220 euros which is really nice here in Europe.
> The bad is that it is the reference cooler version.
> I don't know if the stock cooler is that big of an issue. But for 220, I could just buy it to play until the summer and see if then it would be worth upgrading.
> 
> My monitor is an Asus PB278Q.
> 1440P 60hz.
> When more options are available, I want to go 1440p 144hz variable refresh. Either gsync or freesync.
> 
> Cheers!
Click to expand...

Ah like the previous guy said, XB would probably be a no-go for the 295X2's AIO.
On that 840, make sure you've applied that hot-fix to stop the read speed degradation hey









And yeah the 290X ref cooler is an absolute pig. Even aftermarket ones are hot and loud.
New cards should arrive August / September, so it's up to you if you can wait that long. If you can't then 980 might be a decent choice.


----------



## deep33

Quote:


> Originally Posted by *Pandora's Box*
> 
> I posted this in the HAF XB Case thread but figured you guys would like this:
> 
> Moved my PC over to my CoolerMaster HAF XB case. Not sure why I didn't use this case when I got the 295X2. I think I thought it wouldn't fit, nope, it fits


Hi guys,

I will probably have the same set up: R9 295x2 mounted in a HAF XB case as shown in the picture. The liquid CPU cooler will exhaust hot air into the case if mounted up front ( will take the place of one of the front intake fans). Is this going to cause a thermal issue? Because the GPU's exhaust fan will be trying to suck in the hot air dumped (into the case) by the CPU cooler to cool the GPU radiator.


----------



## Mega Man

Quote:


> Originally Posted by *SLK*
> 
> Then I guess it is normal. Good to hear. My H100i is made by Coolit but I am sure the pumps aren't the greatest either.


O sorry.

And good: the less we support the patent trolls, the better off we are.
Quote:


> Originally Posted by *Gdourado*
> 
> On another issue, any information about the 295x2 and the upcoming directx12?
> Will dx12 be able to use the whole 8gb of memory from the 295x2?
> 
> I ask because I am currently debating to either get a 295x2 or save 120 euros and get a 980 G1 and also save myself from crossfire and multi GPU issues.


No one knows yet


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SLK*
> 
> Then I guess it is normal. Good to hear. My H100i is made by Coolit but I am sure the pumps aren't the greatest either.
> 
> 
> 
> O sorry.
> 
> And good: the less we support the patent trolls, the better off we are.
> Quote:
> 
> 
> 
> Originally Posted by *Gdourado*
> 
> On another issue, any information about the 295x2 and the upcoming directx12?
> Will dx12 be able to use the whole 8gb of memory from the 295x2?
> 
> I ask because I am currently debating to either get a 295x2 or save 120 euros and get a 980 G1 and also save myself from crossfire and multi GPU issues.
> 
> Click to expand...
> 
> No one knows yet
Click to expand...

Speculation is yes.

My biggest question is how on earth are they going to deal with the increased latency. 295X2 is one thing across that single PLX chip on the same PCB. But when you have like 4-way Crossfire on 1 motherboard (like 4 290Xs)... that latency would be a loooot.
Not to mention each GPU 'talks' to the motherboard separately at each time, so how would VRAM sharing work then for dual-GPU graphics cards if at that instance it can't access the other GPU's VRAM?


----------



## Gdourado

Quote:


> Originally Posted by *deep33*
> 
> Hi guys,
> 
> I will probably have the same set up: R9 295x2 mounted in a HAF XB case as shown in the picture. The liquid CPU cooler will exhaust hot air into the case if mounted up front ( will take the place of one of the front intake fans). Is this going to cause a thermal issue? Because the GPU's exhaust fan will be trying to suck in the hot air dumped (into the case) by the CPU cooler to cool the GPU radiator.


From that build, I guess you can't fit the 295x2 in the HAF XB together with a 240 or 280 AIO CPU cooler, right?

Cheers


----------



## Feyris

If anyone has more ideas or contributions for the OP, PM me. I'll update Tuesday-ish; got sideswiped by a severe virus.


----------



## fishingfanatic

Hey, no sweat bud. Just get better !

FF


----------



## wermad

Threw in one more Monsta for some kicks. Count's up to four of 'em.


----------



## Mega Man

still need one more 480 to ketch meh !~


----------



## wermad

I have monstas mind you


----------



## Mega Man

3 of my 5 are
1 is a 60 and other is a 45

i think i still win XD


----------



## wermad

Case is smaller and cheaper! I win! Alright, alright, you can have the e-peen crown.

edit: Did you mount the fifth to the flexbays?


----------



## Mega Man

correct !~ no ped either XD


----------



## Alex132

No offense on the Monsta rads, but I don't like their looks.

2thick4me

Thickest I'd go is the XSPC RX V3 series. But I'd rather have thinner rads, and slightly more of them, than fewer thicker ones; the case feels clogged up with radiators that way.


----------



## Mega Man

you would like the size when you have a TH10, so much space, so many ideas !


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> you would like the size when you have a TH10, so much space, so many ideas !


All those $ though...

$900+ here


----------



## Mega Man

for the case or the rad ? you are talking about a 1500$ card !~


----------



## wermad

I think the TFC OG Monsta and Admiral are the thickest. I got three, and now a fourth, Monstas for less than a set of newer rads such as the RX V3 and HL GTX & SR2. I couldn't pass up the deal on these thick phat rads.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> for the case or the rad ? you are talking about a 1500$ card !~


$900+ for the Case, and it's been a long time since this card was $1500.....


----------



## deep33

Quote:


> Originally Posted by *Gdourado*
> 
> From that build, I guess you can't fit both the 295x2 on the HAF XB with a 240 or 280 AIO CPU cooler, right?
> 
> Cheers


Yes you can, you can stick a 240mm radiator/fan in the front and a 120 mm radiator/fan in the back


----------



## Gdourado

Quote:


> Originally Posted by *deep33*
> 
> Also, the
> Yes you can, you can stick a 240mm radiator/fan in the front and a 120 mm radiator/fan in the back


Isn't the 295x2 too long to clear a front radiator on the HAF XB?


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> People are too caught up in the utilization of 8gb of ram via SFR in DX12 etc. I just hope future APIs and game engines support multi-gpu better.


People need to realize that just because DX12 offers the option it doesn't mean that the API will automatically do this for you. Developers have already had this option since Mantle was released and no one bothered to do it. Granted DX is the more popular API with more widespread use so it helps the possibility of some developer out there taking up the challenge. Either way though they have made mention about how ridiculously tricky it would be to program a game to use such a widely varying amount of vRAM from one video card to multiple video cards each of which has a varying amount of vRAM. If AMD and Nvidia stuck to using 2/4/8/16GB then it would be one thing but since there are 1/2/3/4/6/8/12/16GB vRAM setups its a pain in the you know what to make a game like that.
Quote:


> Originally Posted by *Alex132*
> 
> Speculation is yes.
> 
> My biggest question is how on earth are they going to deal with the increased latency. 295X2 is one thing across that single PLX chip on the same PCB. But when you have like 4-way Crossfire on 1 motherboard (like 4 290Xs)... that latency would be a loooot.
> Not to mention each GPU 'talks' to the motherboard separately at each time, so how would VRAM sharing work then for dual-GPU graphics cards if at that instance it can't access the other GPU's VRAM?


DX12 is already working towards CPU multi-core load balancing and bringing latency down through asynchronous shaders.

http://wccftech.com/amd-improves-dx12-performance-45-gpu-asynchronous-compute-engines/

This seems to be why AMD is confident their upcoming video cards are going to kill it with DX12 and Vulkan. Their high amount of compute power in their stream processors are finally going to be leveraged. Say goodbye to the Intel/Nvidia/Microsoft circlejerk. DX12 is going to even the playing field and then some. Nvidia will no longer be able to get by with cutting compute power from their architecture.
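The memory point above (today's AFR mirroring every resource on each GPU, versus the DX12/SFR-style pooling people are hoping for) can be sketched as a toy model; the function and numbers below are purely illustrative, not anything from the actual APIs:

```python
def effective_vram(gpu_vram_gb, mode):
    """Toy model of usable VRAM across GPUs.

    AFR mirrors every texture/buffer on each GPU, so the usable pool is
    capped by the smallest card's VRAM. SFR-style pooling (what DX12's
    explicit multi-adapter could allow, if a developer does the work)
    would in principle expose the sum.
    """
    if mode == "AFR":
        return min(gpu_vram_gb)
    if mode == "SFR":
        return sum(gpu_vram_gb)
    raise ValueError(f"unknown mode: {mode}")

print(effective_vram([4, 4], "AFR"))  # 4 -> the 295X2 as games use it today
print(effective_vram([4, 4], "SFR"))  # 8 -> the hoped-for DX12 case
```

This also illustrates xer0h0ur's point about heterogeneous setups: with, say, a 4GB card next to an 8GB card, the AFR pool is stuck at the smaller number, and any pooled scheme has to cope with the mismatch.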


----------



## Gdourado

I have been searching more on the 295x2 and I found numerous cases online where the card would hit 75 degrees while gaming and then throttle down.
Is this a frequent issue with the card?

Cheers


----------



## xer0h0ur

My case is literally a worst case scenario with fairly sub-standard airflow. During the short amount of time I was using the 295X2 with the original Asetek cooler I was running into 74C temps. I don't think I ever saw it actually show 75C as it would throttle showing 74C. This was of course leaving it bone stock with the fan pushing air out of the case so it was using the case's air to cool the radiator. Once I added a 2nd fan for push/pull it backed it off 3-4 degrees. I believe it could have been even better if I had flipped the airflow so it would be drawing in fresh air from outside.


----------



## Gdourado

That's my fear. If I put the 295x2 in my HAF XB, the rad would have to go on the rear exhaust 120mm fan mount, and there it would be in the path of the hot air from my 4670K @ 4.8GHz. I fear I might face throttling.

Cheers!


----------



## tijmpie

Good day all

I have a problem with my new R9 295x2: I just can't get it to work properly. My fps in games is really bad and stays the same from low to ultra settings.
For example, BF4 gives 25-50 fps at 1080p, the same on low or ultra.
I tried all AMD driver versions with a clean install through Driver Sweeper, and even reinstalled my OS and formatted the drives. I can only think of a few things that could be wrong.
I have a feeling it might run on only one GPU, but even then fps should be better. Or my CPU is bottlenecking, but all 6 cores are around 50-85% while playing BF4, so it doesn't seem maxed out to me. Or it's a faulty card.
I hope you guys have some ideas left before I bring my PC to a shop to check the graphics card, because I don't know what to do anymore.
Thanks in advance.

Tijmen.

My spec are:

Msi 970 gaming mobo
Phenom x6 1090t black edit (oc 3.8)
Cpu watercooled h100i gtx
2x8gb corsair vengeance pro 1800mhz
Ax1200i psu
Samsung evo 840 Os drive
Hitachi 1tb hard drive
Windows 7 pro 64 bit


----------



## Sgt Bilko

Quote:


> Originally Posted by *tijmpie*
> 
> Good day all
> 
> I have a problem with my new club R9 295x2 i just cant Get it to work properly. My fps on games is realy bad and stays the same on low to ultra settings.
> For example bf 4 25-50 fps 1080p stays te same on low or ultra.
> I tried all amd driver versions with clean install thru driver sweeper even installed my os again and formatted drives. I can only think About a few things what could be wrong.
> I have a feeling it might run on only one gpu but even then fps should be better. Or my cpu is bottlenecking but al 6 cores are around 50-85 while playing bf4 so seems not Maxed out in my opinion. Or its a faulty card.
> I hope you guys have Some opinions left before im gonna bring my pc to a shop to check the graphics card cause i dont know what to do anymore.
> Thanks in advance.
> 
> Tijmen.
> 
> My spec are:
> 
> Msi 970 gaming mobo
> Phenom x6 1090t black edit (oc 3.8)
> Cpu watercooled h100i gtx
> 2x8gb corsair vengeance pro 1800mhz
> Ax1200i psu
> Samsung evo 840 Os drive
> Hitachi 1tb hard drive
> Windows 7 pro 64 bit


The rest of your rig looks like it should handle that fine, are you running at 1080p by chance?

What sort of temperatures are you getting on the 295x2, and do the core clocks stay consistent (1018MHz all the time)?


----------



## tijmpie

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The rest of your rig looks like it should handle that fine, are you running at 1080p by chance?
> 
> What sort of temperatures are you getting on the 295x2 and does the core clocks stay consistent (1018Mhz all the time? )


Thanks for the reply.

I'm playing at 1080p indeed. At idle the speeds fluctuate, but I think on load it's at 1018 all the time; I need to check that.
About the temps: when running Valley extreme or Fire Strike at 1080p it didn't get above 60 degrees Celsius.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tijmpie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> The rest of your rig looks like it should handle that fine, are you running at 1080p by chance?
> 
> What sort of temperatures are you getting on the 295x2 and does the core clocks stay consistent (1018Mhz all the time? )
> 
> 
> 
> Thanks for the reply
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im playing on 1080p indeed, and on idle the speeds fluctuate but i think on load its on 1018 all the time need to check that.
> About the temp when running valley extreme or fire strike 1080 it didnt get above 60 degree Celsius.
Click to expand...

Well it doesn't seem like you are temp throttling, have you tried disabling one of the cores and playing like that?

and with BF4, you should be using Mantle with that if possible......much better fps and usage


----------



## tijmpie

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well it doesn't seem like you are temp throttling, have you tried disabling one of the cores and playing like that?
> 
> and with BF4, you should be using Mantle with that if possible......much better fps and usage


I made a profile in CCC for BF4 disabling Crossfire and played it on Mantle: still 25-50 fps on both low and ultra. My old GTX 460 had a steady 80 fps on low, so there is really something fishy going on in my system somewhere.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tijmpie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well it doesn't seem like you are temp throttling, have you tried disabling one of the cores and playing like that?
> 
> and with BF4, you should be using Mantle with that if possible......much better fps and usage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I made a profile in ccc for bf4 disableing crossfire and played it on mantle still had 25-50 fps on low and ultra same my old gtx 460 had a steady 80 fps on low there is realy something fishy going on in my system somewhere.
Click to expand...

I've noticed that disabling Crossfire via CCC doesn't work for BF4 in Mantle; you'd need to disable the second display adapter in Control Panel or... run it in DX windowed mode so only one GPU gets used.


----------



## tijmpie

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've noticed that disabling crossfire via CCC doesn't work for BF4 in Mantle, you'd need to disable the second display adaptor in control panel or.......run it in DX windowed mode so only one GPU gets used


Thx, didn't know that. I will try that first thing, to see if I get some changes.


----------



## Alex132

It's a shame to use the R9 295X2 at only 1080p.

It really starts to stretch its legs at 1440p and beyond.


----------



## Gdourado

How is the issue with throttling?

Do you think that if I install a 295x2 on my build, it will throttle?
The AIO Rad would have to be where that red corsair SP120 is.
It would be in direct path of the CPU hot air with not much space from the CPU cooler.



Cheers!


----------



## tijmpie

Quote:


> Originally Posted by *Alex132*
> 
> It's a shame to use the R9 295X2 at only 1080
> 
> 
> 
> 
> 
> 
> 
> 
> It really can start to stretch its legs at 1440p and beyond.


Yeah, I know. I'm waiting till the Acer XB270HU becomes available in my country.


----------



## xer0h0ur

tijmpie, reset your BIOS if you don't mind having to re-do your settings later. I also assume you had already disabled ULPS?


----------



## tijmpie

Quote:


> Originally Posted by *xer0h0ur*
> 
> tijmpie, reset your BIOS if you don't mind having to re-do your settings later. I also assume you had already disabled ULPS?


Hey xero
Thx for your reply. Do you mean the BIOS of my mobo or the graphics card? I'm not that experienced with this; I don't know what ULPS means, so I think I don't have it disabled yet.


----------



## wermad

It's a power-saving feature on AMD cards. You can turn it off via the Sapphire TriXX or MSI Afterburner utilities.
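For what it's worth, those utilities reportedly just flip the `EnableUlps` value in the registry, so it can also be switched off by hand. A sketch of the relevant key is below; note the `0000` subkey index varies per system (multi-GPU setups have several), so find every display-class subkey that already contains an `EnableUlps` value rather than pasting this blindly, and back up the registry first.

```reg
Windows Registry Editor Version 5.00

; Illustrative only: the numeric subkey (0000, 0001, ...) differs per system.
; {4d36e968-...} is the Windows display-adapter device class.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot (or driver restart) is needed before the change takes effect.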


----------



## Feyris

Quote:


> Originally Posted by *tijmpie*
> 
> Hey xero
> Thx for youre reply do you mean the bios of my mobo or graphic card. And im not that experienced with this i dont know what ULPS means so i think i dont have it disabled yet.


Since you don't know what it is, just run this:


----------



## Synthaxx

Tijmen, I also had quite a bunch of problems when I first got my cards. My issue back then was the motherboard's BIOS (Rampage IV BE): flashing to the latest one solved the problems. Another thing to try is plugging your monitor into a different output; I can get stutter on one port and none on another.
BF4 on ultra at 4K (no AA) gives me between 80-100 fps.


----------



## tijmpie

Thx guys for all your input, gonna try it as soon as I get home. I'll keep you updated.


----------



## xer0h0ur

ULPS is a power-saving feature which sounds great in theory but doesn't always work in practice. It's possible for one or both GPUs to get stuck in a low-power state with low clock speeds while ULPS is enabled.


----------



## tijmpie

OK guys, it's getting grimmer by the minute.

I have now also flashed my mobo BIOS, installed the latest driver, disabled ULPS, and tried to run on one GPU.
That wasn't possible: when I disable either one of the GPUs in Device Manager and then start a BF4 game I get the error message:

"Failed to initialize display adapter. Please ensure your card is compatible and the drivers for it are installed. Usually this is a result of missing vendor-specific drivers."

Now the thing is, when I enable both GPUs, BF4 just starts, but still with max 40 fps (50 looking at the sky). I watched GPU-Z while playing: one core sits at 1018 MHz core / 1500 MHz memory the whole time, flicking between 0-8% and 100% load, while the other stays at 150 MHz core / 300 MHz memory and 0-2% load.
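GPU-Z can log its sensors to a CSV file, which makes this kind of "one GPU never wakes up" symptom easy to confirm over a whole gaming session. The sketch below is illustrative only: the column names and the 400 MHz idle threshold are assumptions, so match them to the headers in your actual log.

```python
import csv
import io

IDLE_MHZ = 400  # clocks that never exceed this under load suggest a GPU stuck idling (threshold is a guess)

def stuck_gpus(log_text, clock_cols):
    """Return the clock columns whose logged value never leaves the idle range."""
    rows = list(csv.DictReader(io.StringIO(log_text)))
    return [col for col in clock_cols
            if max(float(row[col]) for row in rows) < IDLE_MHZ]

# Illustrative log resembling the symptom above: GPU1 boosts, GPU2 sits at 150 MHz.
sample = """GPU1 Clock,GPU2 Clock
1018,150
1018,150
300,150
"""

print(stuck_gpus(sample, ["GPU1 Clock", "GPU2 Clock"]))  # ['GPU2 Clock']
```

If the second GPU shows up in a list like this across a full session, that points at either a Crossfire/ULPS issue or a faulty card rather than a game setting.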


----------



## xer0h0ur

No no no you never use the device manager to disable a GPU. With the 295X2 you would have to manually create an application profile within the catalyst control center for whatever given game you're trying to disable crossfire. Once you create the profile you click the profile and scroll to the bottom to set crossfire to disabled.

Have you not tried to use the video card in crossfire as it normally would function since updating the BIOS? Normally you don't want to make a boatload of changes all at once. You do them one by one to figure out what you did to fix it. Otherwise you're just left guessing which change fixed it for you.


----------



## wermad

What is gpuz showing for pcie speed? 2.0 8x or 2.0 16x?

Also, is your cpu maxed out?


----------



## tijmpie

Quote:


> Originally Posted by *xer0h0ur*
> 
> No no no you never use the device manager to disable a GPU. With the 295X2 you would have to manually create an application profile within the catalyst control center for whatever given game you're trying to disable crossfire. Once you create the profile you click the profile and scroll to the bottom to set crossfire to disabled.
> 
> Have you not tried to use the video card in crossfire as it normally would function since updating the BIOS? Normally you don't want to make a boatload of changes all at once. You do them one by one to figure out what you did to fix it. Otherwise you're just left guessing which change fixed it for you.


I was told earlier that making a profile doesn't work and it needs to be done from Device Manager (I had tried a profile in CCC earlier, no change).
And I did try the changes one by one to see if my frame rates would change. They just stay the same. I really think something is broken, because I've tried almost everything and my fps stays the same even on low settings; an R9 270 triples my fps.


----------



## xer0h0ur

I have never used the device manager to disable a GPU. You can use afterburner or another 3rd party GPU monitoring application to verify that one of the GPUs is being disabled with the application profile.

Perhaps you did get a defective 295X2. Have you tested with other games or just BF4?


----------



## tijmpie

Quote:


> Originally Posted by *wermad*
> 
> What is gpuz showing for pcie speed? 2.0 8x or 2.0 16x?
> 
> Also, is your cpu maxed out?


I'm not sure where to find that info; I can't see it anywhere in my GPU-Z.
And my six cores are switching around 50-85%, so I don't think that's maxed out, but I'm no expert.


----------



## wermad

It should be on the first tab in GPU-Z.
It will say "Bus Interface".

Are you using the top PCIe slot or the bottom?


----------



## tijmpie

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have never used the device manager to disable a GPU. You can use afterburner or another 3rd party GPU monitoring application to verify that one of the GPUs is being disabled with the application profile.
> 
> Perhaps you did get a defective 295X2. Have you tested with other games or just BF4?


I will try another program to disable one GPU then.
And I don't have many demanding games. I only tried Star Citizen and got about 50 fps on Very High; it was quite playable, but I thought it should be higher at 1080p.


----------



## xer0h0ur

It's a button in GPU-Z that runs a GPU load test so you can see your PCI-E setting under load. I don't remember exactly, but it may be the question-mark symbol directly next to the "Bus Interface" box.


----------



## tijmpie

Quote:


> Originally Posted by *wermad*
> 
> It should be in gpuz in the first tab.
> 
> Are using the top pcie slot or the bottom?


It's the top slot for sure, and I found it (it was right next to my glasses xD). It says: PCI-E 16x @ 2.0


----------



## wermad

AMD boards still run Gen 2.0 PCIe; I would make sure the card is reading 16x in GPU-Z (equivalent to 8x at 3.0).
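For what it's worth, the rough per-direction numbers behind that 16x Gen 2 vs 8x Gen 3 equivalence can be sketched quickly (the per-lane figures are approximate effective throughput after encoding overhead, not from this thread):

```python
# Approximate effective per-lane PCIe throughput in GB/s (per direction),
# after 8b/10b (Gen 2) and 128b/130b (Gen 3) encoding overhead.
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total per-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("2.0", 16))           # 8.0 GB/s
print(round(link_bandwidth("3.0", 8), 2))  # 7.88 GB/s, nearly identical
```

So a card that drops to 8x on a Gen 2 board halves its bandwidth, while at 16x it matches an 8x Gen 3 slot.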


----------



## cmoney408

Have any of you ever seen a red rash appear across the entire screen, then the screen go black and only flash on for a split second every few seconds?

This happens randomly when I bring my computer to my gf's house and use it on her 1080p TV, but it has never happened on my Vizio P65 4K TV.


----------



## xer0h0ur

tijmpie, have you tried by any chance using the BIOS switch on the 295X2 to use the other BIOS? I suppose it's worth a try.


----------



## drchoi21

My newest and most powerful setup, currently in build.


----------



## wermad

Quadfire + quad-Monsta's...


----------



## tijmpie

Quote:


> Originally Posted by *xer0h0ur*
> 
> tijmpie have you tried by any chance using the BIOS switch on the 295X2 to use the other BIOS? I suppose its worth a try.


No, I haven't tried that yet. Will do!


----------



## deep33

Quote:


> Originally Posted by *Gdourado*
> 
> Isn't the 295x2 too long to clear a front radiator on the HAF XB?


not too long. it fits (no worries)


----------



## veaseomat

9590 under phase, r9 295x2 crossfire. Here's my 3d mark score. http://www.3dmark.com/3dm/6622963


----------



## Mega Man

Quote:


> Originally Posted by *veaseomat*
> 
> 
> 
> 9590 under phase, r9 295x2 crossfire. Here's my 3d mark score.


no score


----------



## wermad

I see 13k in extreme.

Nice clock @5.5


----------



## cmoney408

GTA V!!!! I really thought it was going to be crappy, but it's SO smooth. Too bad I can't get 4K 60Hz to my HDMI 2.0 4K TV yet.


----------



## Mega Man

Yeah, he edited it in.


----------



## kayan

I just leak tested my new build, and bam got a great deal on a koolance block for my 295x2, haha. Can't wait!


----------



## Mega Man

Woo cg


----------



## y4h3ll

Can someone help me? I have an R9 295X2 and fullscreen isn't working properly in GTA 5, so only one GPU loads up. Any idea how to fix this? Specs are: Corsair 1300W, 32GB DDR3, 2x 250GB 840 EVO, R9 295X2 and an FX-9590 at 5GHz stable. I just found the first site that gives two
shizzes about ATI/AMD and would really appreciate the help. Maybe I'm not using the card correctly or something. Built my rig myself.


----------



## Feyris

Are you using the newest beta drivers for GTA V? If not, try that.


----------



## y4h3ll

Yes, I am using the beta drivers. I'm running Windows 8.1 and fullscreen just doesn't work; it runs windowed either way. Literally no activity on the 2nd GPU in GTA 5.


----------



## Feyris

Try disabling ULPS. The 2nd GPU should be lighting up with the beta driver AMD posted yesterday, since CF works...


----------



## Sgt Bilko

Quote:


> Originally Posted by *y4h3ll*
> 
> Yes I am using beta drivers, I'm running windows 8.1 and full screen just doesn't work it just runs in windowed either way. Litterally no activity for 2nd gpu on gta 5.


Try Alt + Enter if you can't set it to fullscreen in the options.

It loaded up windowed for me to start with, but I could change it to fullscreen via the options and by Alt + Enter.


----------



## y4h3ll

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Try Alt + Enter if you can't set it to fullscreen in the options
> 
> It loaded up for me in Windowed to start with but i could change it to fullscreen via the options and by Alt + Enter


I have done this as well; it seems to load windowed anyway... I'll mess with it when I get home in 6 hrs, then I'll let you guys know what's going on.


----------



## Alex132

Quote:


> Originally Posted by *y4h3ll*
> 
> Can someone help me, I have an r9 295x2 and full screen isn't working properly for gta 5, there for only one gpu loads up. Any idea how to fix this? Specs are "1300 corsair, 32gb ddr3, 2 evo 240 250gb r9 295x2 and fx 9690 5ghz stable. I just found the first site who gives 2
> Shizzes about ati|amd would really appreciate the help. Maybe I am not using the car correctly or something. Built my rig my self.


alt+enter? (if it's not full-screen)

if it is; disable ULPS.
Quote:


> Originally Posted by *Feyris*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tijmpie*
> 
> Hey xero
> Thx for youre reply do you mean the bios of my mobo or graphic card. And im not that experienced with this i dont know what ULPS means so i think i dont have it disabled yet.
> 
> 
> 
> Since you dont know what it is just run this
Click to expand...


----------



## y4h3ll

Quote:


> Originally Posted by *Alex132*
> 
> alt+enter? (if it's not full-screen)
> 
> if it is; disable ULPS.


Get MSI Afterburner; in its settings you can disable ULPS. Yes, I have done this too. I checked my registry and it's set. Maybe windowed mode then fullscreen? I'm running Windows 8.1; maybe you have Win 7 and fullscreen works for you?


----------



## Feyris

There's a small DOS tool from an OCN member for ULPS management that I've been using too (I posted a one-click disable exe for it earlier).

At least for those not wanting to mess with AB.


----------



## y4h3ll

Quote:


> Originally Posted by *Feyris*
> 
> theres a small dos tool from an ocn member for ulps management ive been using too (i posted 1click disable exe for it too earlier)
> 
> Atleast for those not wanting to mess with AB


I will try re-disabling it. I forgot that reinstalling or updating Catalyst re-enables ULPS; thanks for the reminder. I'll try that when I get home.


----------



## cmoney408

Does your GTA show an AMD CrossFireX logo/icon in the top-right corner while you play? Mine does, and it wouldn't go away.


----------



## y4h3ll

Quote:


> Originally Posted by *cmoney408*
> 
> does your GTA have a AMD Crossfire X logo/icon in the top right corner while you play the game. mine does, and it wouldnt go away.


It can be disabled and enabled from the AMD Catalyst icon by the clock if you right-click it. But no, it did not.


----------



## y4h3ll

Here is what happens when I apply fullscreen:
if you look at the top right, you will notice it isn't in fullscreen.

Also, here is a pic of what my GPUs are doing (R9 295X2).
Crossfire isn't working; I have the icons enabled as well. New beta driver and everything.


----------



## tsm106

Quote:


> Originally Posted by *y4h3ll*
> 
> Here is what happens when I apply full screen.
> if you look at the top right you will notice it isn't in full screen.
> 
> Also here is a pic of what my gpu are doing "r9-295x2"
> Xfire isn't working I have icons enabled aswell. New beta driver everything.


Set AB to show your GPU usage on both GPUs in the OSD. If fullscreen is not enabled, you will not see usage on both GPUs, and vice versa. Also, in the game's settings under Display, set up your safezone screen settings; that dictates where the safe display area sits on your physical screen, i.e. the safezone.


----------



## y4h3ll

Called AMD tech support and they told me to change the predefined crossfire setting to Optimize 1x1, and it finally worked. Anyone having the same issue: just change the crossfire setting to Optimize 1x1; that fixed my application issue. This is a known issue on Windows 8.1, which is why it wouldn't load fullscreen.


----------



## FiXi

Hi, I've seen GTA V benchmarks where the 295X2 gets ~100 fps @1080p, but I only get 50-60 fps in CF no matter the graphics settings or resolution (same fps at 1280x720), and when I disable CF I still get 50-60 fps. GPU usage is about 20-99% on both (jumping up and down rapidly), but never 100% on both at the same time. With VSR @3200x1800 the GPU usage is higher, almost 100%, but keeps dropping to 0-20%. With CF disabled I get 100% usage all the time (dropping to ~20% a few times, but rarely) and CPU usage is 70-80%.
I have disabled ULPS and increased the power limit (which helped in Crysis 3 and BF4), but it didn't help in GTA V. I'm also using the new 15.4 drivers, which I have clean-installed a few times.
Could it be that my CPU, a 3570K @4.0GHz, is bottlenecking? CPU usage is at 95-99% while playing.
Or could it have anything to do with low power? I have a Corsair TX850W V2 and I've calculated that my computer uses about 820W.

Specs:
i5 3570k @4,0Ghz
Thermalright HR-02 Macho Rev. B
Gigabyte Z77X-UD3H
16Gb 1600Mhz Vengeance
XFX 295x2
Samsung 840 pro
WD Green 2 TB SATAIII
Seagate Barracuda 3 TB 64 MB 7200 RPM 3.5" SATA III (GTA V installed here)
2 Monitors


----------



## Sgt Bilko

Quote:


> Originally Posted by *FiXi*
> 
> Hi, I saw GTA V benchmarks that 295x2 can get ~100fps @1080p, but i only get 50-60fps in CF, no matter the graphic settings or resolution (same fps on 1280x720) and when i disable CF i still get 50-60fps. Gpu usage is about 20%-99% on both (jumping up and down rapidly), but never 100% at the same time. VSR @3200x1800 the gpu usage is higher almost 100%, but keeps dropping all the time to 0-20%. CF disabled i get 100% usage all the time(dropping few times to ~20%, but rarely).
> I have disabled uart and increased power limit (helped in crysis 3 and bf4), but won't on GTA V. Also using the new 15.4 drivers and have clean installed them few times.
> Could it be that my CPU 3570k @4,0Ghz is bottlenecking? *Cpu usage is at 95%-99%*, while playing.
> OR could it have anything to do with low power? I have Corsair TX850W V2 and i have calculated that my computer is using about 820W.
> 
> Specs:
> i5 3570k @4,0Ghz
> Thermalright HR-02 Macho Rev. B
> Gigabyte Z77X-UD3H
> XFX 295x2
> Samsung 840 pro
> WD Green 2 TB SATAIII
> Seagate Barracuda 3 TB 64 MB 7200 RPM 3.5" SATA III (GTA V installed here)
> 2 Monitors


You answered your own question









Yeah, Cpu is holding you back there....try lowering the view distance sliders


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You answered your own question
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, Cpu is holding you back there....try lowering the view distance sliders


^This. OC the CPU higher, and check to make sure you do what y4 did in post #5608.


----------



## y4h3ll

Anyone having issues with random grass spawning in the air and on the ground with the R9 295X2? It only happens up north, like during the first heist. Changing grass to Normal fixes it. Just a heads-up for you guys.


----------



## Bfun

Would a PC Power and Cooling S75QB 750W with a single 12v 60A rail be sufficient to power a 295x2?


----------



## wermad

Use this list of recommended units:

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4890#post_23654185

Unfortunately, the PC Power & Cooling 750W does not meet AMD's amperage requirements. It may run, but something can go wrong, since it's not ideal for a 295X2 plus its system. AMD recommends 50 amps for the 295X2 itself (single rail) and at least 20 amps for the rest of your system and components; that's why the minimum recommended is 70 amps on a single rail. It can get complicated, but I took the time to look at several units and put them on a list to help with this very question.
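To put those amperage numbers in watts (a quick sketch; the 50A/20A split is AMD's guidance quoted above, and the wattage is just amps times the 12V rail voltage):

```python
# Convert AMD's single-rail +12V amperage guidance for the 295X2 into watts.
RAIL_VOLTAGE = 12.0

def rail_watts(amps: float) -> float:
    """Power on the +12V rail for a given amperage."""
    return amps * RAIL_VOLTAGE

card_amps = 50    # recommended for the 295X2 itself
system_amps = 20  # minimum headroom for CPU, board, drives, fans

total_amps = card_amps + system_amps
print(total_amps, rail_watts(total_amps))  # 70 840.0 -> 840W on the 12V rail alone
```

Which is why plenty of 750W units fall short on paper even when the total system wattage looks fine.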


----------



## Bfun

Ah, thanks. I had actually gone through your list but mistook it for a report of PSUs being used by members. Using a Kill-A-Watt I pull 134W in Prime95, 110W in BurnInTest, 430W in FurMark and 310W in Heaven. If those numbers are accurate, total wattage wouldn't seem to be a problem; it's that 60A rail that's questionable.


----------



## druout

Hey guys/gals, not sure if this is the right place for this (let me know if it's not), but I was looking to get two R9 295X2s. However, I'm seeing a lot of criticism lately, since the Titan X came out, that the 295X2 has poor performance with high frame variance: http://www.tomshardware.com/answers/id-2601829/amd-295x2-titan-superclocked.html. What are your experiences with the R9 295X2 in games? Currently I have 2 ROG Matrix 290Xs in crossfire, but I'd like to make the jump to dual 295X2s; I'm just concerned by the criticisms leveled at its performance. Any feedback or thoughts would be much appreciated.


----------



## veaseomat

Quote:


> Originally Posted by *druout*
> 
> Hey guys/gals, not sure if this is the right place for this, let me know if it's not, but I was looking to get two R9 295X2s; however I'm seeing a lot of criticism lately since the Titan X came out that the 295X2 has poor performance with high frame variance http://www.tomshardware.com/answers/id-2601829/amd-295x2-titan-superclocked.html. What are your experiences with the R9 295X2 in games? Currently I have 2 ROG Matrix 290Xs in crossfire, but I'd like to make the jump to the dual 295X2s; I'm just concerned by the criticisms leveled at performance. Any feedback or thoughts would be much appreciated.


I'm doing a lot of overclocking with mine in crossfire and my advice is... don't. I love my cards, but they don't OC high enough. Get a Titan or wait for the 390X. I switched from 3x Sapphire 290 (non-X) Tri-X OC to these when the prices dropped in December. Rumor has it there are 8GB Nvidia 980s in the works as well. Idk man, I would stick with your 290Xs for now; they probably OC higher too lol.


----------



## druout

Quote:


> Originally Posted by *veaseomat*
> 
> I'm doing lots of overclocking with mine in crossfire and my advice is... dont. I love my cards but they dont oc high enough. get a titan or wait for the 390x. I switched from 3x sapphire 290 (non x) trix oc to these when the prices dropped in december. rumor has it there are 8gb nvidia 980s in the works also. Idk man I would stick with your 290x's for now, they probably OC higher too lol.


No, my 290Xs don't overclock very high. I just want to hear from people who actually own these cards how they perform in games, vis-à-vis all the criticism they're getting now.


----------



## Feyris

Quadfire is always an iffy thing in gaming overall; unless you're going to be running high resolutions, it often won't scale well. I would keep your 290Xs and wait to grab 390X CF/tri-fire in July, or just grab a 395X2 when it hits.


----------



## silencespr

I have to hang my radiator out the window to be able to play BF4 at 4K settings; otherwise the card gets super hot and the PC freezes.


----------



## druout

Ok thanks guys, appreciate the info.


----------



## xer0h0ur

I avoided quadfire because of the issues I see with gaming quadfired. Be it reported by users or in the driver bug reports. Scaling also isn't favorable past tri-fire so it just seems like a waste of money that introduces issues.


----------



## wermad

Quote:


> Originally Posted by *Bfun*
> 
> Ah thanks. I had actually gone through your list but mistook it as a report of PSU that were being used by members. Using a Kill-A-Watt I pull 134w in Prime95, 110w in BurnIn Test, and 430w in FurMark and 310w in Heaven. If those numbers are accurate it would seem total wattage wouldn't be a problem. It's that 60A rail that's questionable.


Few benchmarks can give you ideal TDP readings. I used 3DMark11, as it drew up to 1200W at the Kill-A-Watt, more than any other benchmark. If you wanna try FurMark, that's a risk you take on your own accord, but it's probably the best way to gauge wattage. Don't skimp on the PSU imho, as a bad one may end up blowing everything; when you run high-end components, put a lot of emphasis on the PSU. The 295X2 was the first card I really had to get the PSU right for. Before, as long as I had enough wattage, it was go for it. I'm sure that with the more power-hungry next gen, amperage will continue to be crucial with AMD. When I was putting the list together, I was shocked and very surprised that a lot of great, quality units didn't make the cut, and that some 650-750W units did! It goes to show how differently PSUs can be built. The mighty ST1500 powered my quad GTX 480s OC'd to 965MHz (X79 setup, with full WC gear) with no issues, but it did not make the cut for even a single 295X2...


----------



## vmon2009

I have an old Corsair TX950W PSU: http://www.newegg.com/Product/Product.aspx?Item=N82E16817139013

Will this PSU power a 295x2?


----------



## TooManyAlpacas

Yes, that would be plenty for the card. I use a 1000W PSU and power has never been an issue, and that's only a 50W difference.


----------



## vmon2009

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> Yes that would be plenty for the card I use a 1000W PSU and power has never been an issue only a 50W difference


That's good to know. Many have been telling me since the beginning of time that I need a new PSU because of the 28A 8-pin thing, but I think someone else on here has the same PSU as I do along with a 295x2, and it works fine.


----------



## wermad

A 650W unit can do it if you run LGA1150 and everything stock. 750W would be the ideal minimum, and 850W the recommended starting wattage for a single card (imho). Mind you, these numbers mean nothing if the unit does not meet AMD's amperage requirements.


----------



## Mega Man

depends on cpu and cpu power consumption
Quote:


> Originally Posted by *Feyris*
> 
> Quadfire is always an iffy thing in gaming overall unless your going to be running high resolutions they wont scale so well a lot. I would keep your 290X and wait to grab 390X CF/Tri fire in july or just grab a 395x2 when it hits


Quote:


> Originally Posted by *xer0h0ur*
> 
> I avoided quadfire because of the issues I see with gaming quadfired. Be it reported by users or in the driver bug reports. Scaling also isn't favorable past tri-fire so it just seems like a waste of money that introduces issues.


Not really; I honestly have few issues.


----------



## deep33

Quote:


> Originally Posted by *Bfun*
> 
> Would a PC Power and Cooling S75QB 750W with a single 12v 60A rail be sufficient to power a 295x2?


Largely depends on what else you have in the case, but I'd go with a minimum of 1000W, 80 Plus Gold certified. Keep in mind you have to account for capacitor aging and so on. I'm the kind of guy who doesn't like to replace a PSU for at least 5 years; leave some legroom. 30 to 40 bucks more shouldn't hurt your pocket when you're considering a timespan of five to six years.

Go with

1) http://www.newegg.com/Product/Product.aspx?Item=N82E16817139057&cm_re=corsair_rm1000-_-17-139-057-_-Product

or

2) http://www.newegg.com/Product/Product.aspx?Item=9SIA24G28N5238&cm_re=evga_1300W-_-17-438-011-_-Product


----------



## Mega Man

Please do not buy a Corsair RM. They are not bad, but they are far from great, and the value (what you get for the price you pay) is extremely poor:

http://www.overclock.net/t/1455892/why-you-might-not-want-to-buy-a-corsair-rm-psu/0_100#post_21501656

And please do not buy a PSU based on 80+ anything:

http://www.overclock.net/t/711542/on-efficiency/0_100

If you need help, ask shilka or Phaedrus2129, or me if you want; although I don't claim to be a PSU guru, I can read reviews just fine.


----------



## Feyris

The RM1000 is CWT, so... a Leadex would be better, but I don't think he needs 1300W. When you're on a higher-end platform you need to worry less about getting near the listed wattage: my 1200W Platinum Leadex pushed two 295X2s a-ok going beyond spec, when a Thermaltake 1350W couldn't.


----------



## Mega Man

i would also advise against that

http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies/0_100
Quote:


> Originally Posted by *Phaedrus2129*
> 
> These enthusiast power supplies we all have (high-end Corsair, Antec, SeaSonic, Enermax, Silverstone, etc.) can take a lot more abuse than you think. I have tell people this a lot. These are heavy duty, precision engineered electron pushers, not tinker toys.
> 
> While you can't trust your average cheap or OEM power supply so much, and you can't trust a generic as far as you can throw it... A high-end enthusiast grade power supply is engineered with massive safety margins. Take the Corsair VX550. That's a CWT PSH, and Rocketfish/Best Buy took that same PSU (w/ minor modifications, nothing important) and rated it as a 700W _and it can hold that rating within ATX specs_.
> 
> _When you buy a high-end PSU you aren't just buying reliability and performance, you're buying your headroom right there. Some people say, "Well the TX750 can push 900W, so when I buy it I'm getting a 900W PSU". That defeats the purpose. The purpose is that you can treat it as a 750W PSU and draw that amount from it long-term, for extended periods, while a cheap 750W might eventually break. That's part of the reason for buying a high-end PSU, instead of something just adequate, like an OCZ ModXStream or a Rosewill RV2 or a CM Silent Pro._
> 
> When you buy a high end enthusiast power supply, especially one that I can vouch for, you should know that you're buying into more than just a name. You're buying a machine, and one that's a lot tougher than the typical dreck you might have used before. So don't be afraid to use it for what it's intended for. Forget about "extra headroom", forget about babying your PSU or keeping some massive unnecessary safety margin. Go ahead, get that second Fermi and go wild. Most of you have already paid for that ability, so make the most of it.


----------



## Feyris

Quote:


> Originally Posted by *Mega Man*
> 
> i would also advise against that
> 
> http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies/0_100


Against pushing a high-end PSU past spec? Well, yeah.

(But then there are times you just wanna bench short term; like in my case now, the 2nd 295 is in the mail to a certain African.)

It was pretty much to point out that 1300W is overkill for ONE 295.


----------



## wermad

Pick up a used AX1200 on eBay if you can find one for ~$100 USD. Though, mind the length...


----------



## PureBlackFire

Quote:


> Originally Posted by *deep33*
> 
> Largely depends on what else you have in the case....But i'd go with a minimum of 1000W 80 plus gold certified.....Keep in mind, you have to account for capacitor aging and so on. I'm the kinda guy who doesn't like replace a psu for atleast 5 years. Leave some legroom. 30 to 40 bucks more shouldn't hurt your pocket when you're considering a timespan of five to 6 years.
> 
> Go with
> 
> 1) http://www.newegg.com/Product/Product.aspx?Item=N82E16817139057&cm_re=corsair_rm1000-_-17-139-057-_-Product
> 
> or
> 
> 2) http://www.newegg.com/Product/Product.aspx?Item=9SIA24G28N5238&cm_re=evga_1300W-_-17-438-011-_-Product


Go with the EVGA 1050 GS; it's about $160.

Edit: it's $175 on Newegg and $160 on ncix.us.


----------



## cennis

Why is it that my GTA V runs at 120+ fps constantly with one GPU enabled (14.12 driver)?

1440p, AA off, everything else Ultra/Very High.

That's in the starting hostage scene; maybe it's much lower outdoors?


----------



## Alex132

Technically my HX850 could push the R9 295X2, but it failed to power my GTX 690 + 2500K properly, so I highly doubt it'd suddenly start working properly with an R9 295X2.


----------



## Feyris

Quote:


> Originally Posted by *Alex132*
> 
> Technically my HX850 could push the R9 295X2, but it failed to power my GTX690 + 2500k properly, so I highly doubt it'd start working properly with a R9 295X2.


You would start a 2nd fire


----------



## wermad

The GTX 690 sips power, which begs the question whether you were running something super power-hungry with that setup. I know the HX850 had a pretty good overhead (I think...).


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> GTX 690 sips power, so the begs the question if you were running something super power hungry with that setup. I know the Hx850 had a pretty good overhead (I think...
> 
> 
> 
> 
> 
> 
> 
> ).


~1000w overhead.

A ~5GHz 2500K is 300W max, plus 50W max for other crap (pump + 3 USB devices)... and a 690 even at stock is ~300W. That's only 650W...

It stopped when I used the modular cables rather than the hard-wired ones...


----------



## wermad

Really?! I don't recall my 5.0 2700K using that much power, and the 2700K uses a bit more with its eight (vs four) threads. Maybe your Kill-A-Watt became faulty? My first one did and gave me ~1500W for a triple-SLI 780 setup using that same 5.0 2700K. It was probably ~1000-1100W tbh.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> really?!?!?!?!?! I don't recall my 5.0 2700k using that much power and 2700k uses a bit more for the eight (vs four) threads. Maybe your kill-a-watt become faulty? My first one did and gave me ~1500w for a triple sli 780 setup using that same 5.0 2700k
> 
> 
> 
> 
> 
> 
> 
> . It was probably ~1000-1100w tbh.


Guess-a-watt, more like.

Also, 1.54V 5GHz + mobo + RAM was my guess for 300W peak; 50W peak was for pump + fans + HDDs + SSDs + USB. And actually, add on like 20W at most for the soundcard.

Yeah, it wasn't clear lol.


----------



## NBAasDOGG

Hello gentlemen (or women),

I recently bought myself a Club 3D R9 295X2 for just €650 and it's an absolutely fantastic card, but I do have a few questions after reading some pages of this forum. I've seen that some of you flashed your BIOSes; what is the purpose of that exactly? Does it remove the card's 74°C throttle or add any overclocking potential? Btw, is it possible to raise the card's voltage using Afterburner, or is it voltage-locked? And what about memory voltage? And a final question: how do I remove the 74°C throttle (I think it's impossible), although I never hit 62°C anyway.

Thanks already for helping


----------



## wermad

Hmmm... with that much voltage you may be pushing more than 200W. I was at 1.425V for 5.0 and got to 5.1 with 1.45V, but I'm sure I was under 200W. If that was X79, I would be more inclined to believe it tbh. A SB chip churning out 300W just doesn't seem right at only 5.0, unless you were pushing 5.5+ or got a pig of a CPU, which can happen unfortunately. Do you still have that chip?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> hmmm...what that much voltage you may be pushing more then 200w. I was @ 1.425v 5.0 and did get to 5.1 w/ 1.45v. But I'm sure I was under 200w. If that was X79, I would be more inclined to believe that tbh. A SB churning out 300w, unless you were pushing 5.5+, but just 5.0 its just does seem right unless you got a pig of a cpu. Which can happen unfortunately. You still have that chip?


Yes, sadly. But realistically, nothing at all has impressed me since SB came out. X99 was tempting, until I saw the price/performance increase for non-rendering applications (not to mention prices double because of where I live).

I will say that it's at 4.8GHz and 1.48V peak now (1.47V avg). It used to be able to do 5GHz on 1.45V, and now takes 1.54V for 5GHz.


----------



## wermad

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hello gentlemen (or woman),
> 
> I recently bought myself a Club 3D R9 295X2 for just €650 and it's an absolute fantastic card, but I do have a few question after reading some pages in this forum. I have seen that some of you flashed their bioses, but what is the purpose of that exactly? Does it remove the 74C throttle of the card or ads any overclocking potential? Btw, is it possible to higher to voltage of the card by using Afterburner, or is it voltage locked? And what about memory voltage? *And final question, how to remove the 74C throttle (I think it's impossible), although I never hit 62C*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank already for helping


Go custom water. I haven't dabbled in BIOS flashing since my 780s, but in theory, like with the 290/290X, it's there to give you more aggressive *stock* default BIOS parameters. I haven't heard of anything for the 295X2 (unless it's a custom card like the Devil or Ares 3).

As for temps, if you can keep the stock cooler under the 75°C throttle limit, you're doing a great job. Mine were in the 60s during benching and testing, but I'm sure that with two cards it was gonna hit the limit pretty quickly while gaming. I had plans for custom water all along though.

Quote:


> Originally Posted by *Alex132*
> 
> Yes, sadly. But realistically, nothing at all has impressed me since SB came out. X99 was tempting - until I saw the priceerformance increase for non-rending applications. (not to mention you have to double the prices because of where I live).
> 
> I will say that it's at 4.8Ghz and 1.48v peak now (1.47v avg). It used to be able to do 5Ghz on 1.45v. And takes now 1.54 for 5Ghz.


You do have PCIe 3.0 to look forward to. Even an LGA1150/1155 setup with 8x 3.0 would give you enough bandwidth for two of these puppies.


----------



## Alex132

Not gonna bother with two 295X2s, my god, I don't need a sauna.

PCIe 2.0 16x is more than enough for any GPU nowadays anyway.

Mostly the upgrade would be for the sake of getting something new.


----------



## F4ze0ne

Think the 295x2 will be able to run this?


----------



## Feyris

Quote:


> Originally Posted by *F4ze0ne*
> 
> Think the 295x2 will be able to run this?


It should be able to bend it over and....

_run it properly_


----------



## xer0h0ur

Finally got the water cooling loop done and re-vamped to handle the heat. I can now be satisfied with load temps in the high 40's to mid 50's. Now I just need to start overclocking and testing the limits of these cards.

http://www.3dmark.com/fs/4612590 with a super light overclock to 1060MHz GPU 1500MHz vRAM

I need to take pictures.


----------



## Amfamora

My current build in progress...

i7-4790k
Asus Maximus V2 Formula Mobo
Obsidian 900D Case
16 gig GSkill Ripjaws 2133
XFX R9 295x2
Ek Nickel blocks
XSPC 360mm and 280mm rads
Corsair AX1200i PSU

Just waiting on a Ek D5 pump and res to arrive


----------



## Orivaa

Quote:


> Originally Posted by *Amfamora*
> 
> 
> 
> My current build in progress...
> 
> i7-4790k
> Asus Maximus V2 Formula Mobo
> Obsidian 900D Case
> 16 gig GSkill Ripjaws 2133
> XFX R9 295x2
> Ek Nickel blocks
> XSPC 360mm and 280mm rads
> Corsair AX1200i PSU
> 
> Just waiting on a Ek D5 pump and res to arrive


Looking nice, dude.

I'm currently waiting on materials plus capital. I have a Rampage V Extreme, an i7-5960X, and a 32GB 2400 G.Skill kit on the way. When I get money at the end of the month, I'll be looking at buying a MAGNUM STH10 case from Caselabs, although it'll take at least 5-6 weeks to process, according to their website. =/


----------



## Orivaa

Quote:


> Originally Posted by *F4ze0ne*
> 
> Think the 295x2 will be able to run this?


Technically, that is in-engine footage, but that does not mean it's how the game will look, so there's no real way to tell yet.


----------



## caste1200

hey guys, anyone else having problems installing the 15.4 drivers?
I can't get it to work...
I uninstall the current drivers,
go into safe mode and clean all leftover AMD files with Driver Sweeper,
then install 15.4. By the end of the install my mouse was no longer working properly, but my keyboard was.
I restart the computer to finish the install, and back in Windows my keyboard and mouse don't work at all, but the system is running. So I restart to see if that helps, and
back in Windows the drivers are gone! My PC starts at an 800x600 res (something like that, but clearly not 1080...) and I get an error from CCC saying it cannot find the AMD drivers or card...
heeeeeeeeeelp
Windows 7 x64


----------



## Mega Man

Mine did that too. I think it is from a Windows update.

Reinstall both the GPU drivers and the chipset drivers.

Excluding the mouse problem, that is.


----------



## NBAasDOGG

I finally got my hands on the Club 3D R9 295X2 and installed it in my new PC. It looked fantastic for a few minutes before the AMD horror show started.

I installed the new 15.4 drivers and went straight into GTA and BF4, and guess what: GPU-usage stutter all over the place, with frame rates lower than my R9 280X crossfire. So I went into CCC and tried to disable crossfire, but the crossfire disable option had disappeared. GPU-Z showed crossfire enabled, yet I was getting something like single-GPU performance. From then on I started to lose hope, because I remembered the same horror from the HD 6990.
I changed GPU slot, tried different drivers, double-checked cables, reinstalled Windows, but none of that resolved the problem: still no crossfire disable option, and stutters from 0-100% usage all over the place. So I sent the GPU back to the store for a replacement.

I don't think the problem came from my system, because other cards run just fine. And my PSU is a Seasonic 1000W, so that's more than enough power for the GPU. Has anyone heard of this issue before, and what would you do?


----------



## Orivaa

The problem likely arose because you didn't uninstall the previous driver properly.


----------



## deesee76

I will definitely run Driver Sweeper tomorrow.

In the interim, I just want to share my experience. I poured out the cash and bought an XFX 295X2, a Gigabyte 290X OC and an AMD reference 290X. Lots of inconsistency in fps at 4K when I had the XFX 295X2 and Gigabyte 290X OC cards running together. Most games run in ultra mode without anti-aliasing.

I have not tried the 3 cards together, as my PSU would probably melt...

BF Hardline runs at 4K 2160p in ultra between 40-130fps in game depending on the section. Most games look absolutely amazing, super sharp and fluid, 50% of the time.

Here are my crossfire stats for the 2 cards - XFX 295X2 with Gigabyte 290X OC: http://www.3dmark.com/fs/4607028

The 3DMark stats look okay, but still no match for three Nvidia 980s in SLI.

Maybe AMD driver issues, or a CPU bottleneck, or maybe my PSU (Cooler Master V1000) does not have enough juice.

I'm considering adding waterblocks to the 290s and the 295X2, but I just can't be bothered with the process, and I'm bored of trying to figure out why certain games are not running smoothly. I'm running the latest beta as of today.

GTA V is great in certain sections but crashes in others. Forums say there are corrupt files, which again shows why I left PC gaming for low-resolution console gaming for the last 8 years.

I'm going to swap out the Gigabyte card and replace it with my reference 290X from AMD next weekend to see if the Gigabyte 290X is causing the performance issues.

All I want is smooth gameplay (e.g. 4K 2160p in ultra with some settings turned off).

So far my experience with such a setup has been less than satisfactory.

I'm considering ditching both the Gigabyte and AMD reference 290X cards and pairing the 295X2 with another XFX 295X2.

Stay tuned


----------



## wermad

Quote:


> Originally Posted by *NBAasDOGG*
> 
> I finally got my hands on the Club 3D R9 295X2 and installed it in my new PC. It looked fantastic for a few minutes before the AMD horror show started.
> 
> I installed the new 15.4 drivers and went straight into GTA and BF4, and guess what: GPU-usage stutter all over the place, with frame rates lower than my R9 280X crossfire. So I went into CCC and tried to disable crossfire, but the crossfire disable option had disappeared. GPU-Z showed crossfire enabled, yet I was getting something like single-GPU performance. From then on I started to lose hope, because I remembered the same horror from the HD 6990.
> I changed GPU slot, tried different drivers, double-checked cables, reinstalled Windows, but none of that resolved the problem: still no crossfire disable option, and stutters from 0-100% usage all over the place. So I sent the GPU back to the store for a replacement.
> 
> I don't think the problem came from my system, because other cards run just fine. And my PSU is a Seasonic 1000W, so that's more than enough power for the GPU. Has anyone heard of this issue before, and what would you do?

Guessing you're gaming at 1080? Btw, these are the recommended Seasonic units:

single Seasonic SS-850AM^2 BRONZE 70A
single SS-850HT SILVER 70A
single SS-850EM SILVER 70A
single SS-850KM^3 GOLD 70A
single SS-1050XM GOLD 87A
single SS-1250XM GOLD 104A
single SS-860XP PLATINUM 71A
single SS-1000XP PLATINUM 83A
single SS-1050XP^3 PLATINUM 87A
single SS-1200XP^3 PLATINUM 100A
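To put the amperage column above in context, here is a back-of-the-envelope +12V headroom check. The numbers are illustrative only: the ~50A figure is AMD's commonly cited +12V guidance for the 295X2 alone, and the rest-of-system wattage is a guess you'd adjust for your own build.

```python
# Rough +12V headroom check for an R9 295X2 build (illustrative, not official).
CARD_AMPS_12V = 50.0  # commonly cited +12V recommendation for the 295X2 alone

def psu_headroom(psu_amps_12v: float, rest_of_system_watts: float) -> float:
    """Return spare +12V amperage after the card and the rest of the system."""
    rest_amps = rest_of_system_watts / 12.0  # everything else also pulls from +12V
    return psu_amps_12v - CARD_AMPS_12V - rest_amps

# e.g. the SS-1000XP above (83A) with ~200W of CPU/board/drives to spare:
print(round(psu_headroom(83, 200), 1))  # ~16.3A of headroom
```

A negative result means the unit is undersized for the whole system even if it appears on the list, which is why the single-rail amperage matters more than the wattage on the sticker.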


----------



## NBAasDOGG

Quote:


> Originally Posted by *wermad*
> 
> Guessing you gaming in 1080? Btw, these are the recommended Seasonic units:
> 
> single Seasonic SS-850AM^2 BRONZE 70A
> single SS-850HT SILVER 70A
> single SS-850EM SILVER 70A
> single SS-850KM^3 GOLD 70A
> single SS-1050XM GOLD 87A
> single SS-1250XM GOLD 104A
> single SS-860XP PLATINUM 71A
> single SS-1000XP PLATINUM 83A
> single SS-1050XP^3 PLATINUM 87A
> single SS-1200XP^3 PLATINUM 100A


single Seasonic SS-850AM^2 BRONZE 70A
single SS-850HT SILVER 70A
single SS-850EM SILVER 70A
single SS-850KM^3 GOLD 70A
single SS-1050XM GOLD 87A
single SS-1250XM GOLD 104A
single SS-860XP PLATINUM 71A
single SS-1000XP PLATINUM 83A <<<<<<<<<<<<<<<<<<<<<<<<< I have this one, so the issue is not the PSU
single SS-1050XP^3 PLATINUM 87A
single SS-1200XP^3 PLATINUM 100A


----------



## rdr09

Quote:


> Originally Posted by *deesee76*
> 
> I will definitely run Driver Sweeper tomorrow.
> 
> In the interim, I just want to share my experience. I poured out the cash and bought an XFX 295X2, a Gigabyte 290X OC and an AMD reference 290X. Lots of inconsistency in fps at 4K when I had the XFX 295X2 and Gigabyte 290X OC cards running together. Most games run in ultra mode without anti-aliasing.
> 
> I have not tried the 3 cards together, as my PSU would probably melt...
> 
> BF Hardline runs at 4K 2160p in ultra between 40-130fps in game depending on the section. Most games look absolutely amazing, super sharp and fluid, 50% of the time.
> 
> Here are my crossfire stats for the 2 cards - XFX 295X2 with Gigabyte 290X OC: http://www.3dmark.com/fs/4607028
> 
> The 3DMark stats look okay, but still no match for three Nvidia 980s in SLI.
> 
> Maybe AMD driver issues, or a CPU bottleneck, or maybe my PSU (Cooler Master V1000) does not have enough juice.
> 
> I'm considering adding waterblocks to the 290s and the 295X2, but I just can't be bothered with the process, and I'm bored of trying to figure out why certain games are not running smoothly. I'm running the latest beta as of today.
> 
> GTA V is great in certain sections but crashes in others. Forums say there are corrupt files, which again shows why I left PC gaming for low-resolution console gaming for the last 8 years.
> 
> I'm going to swap out the Gigabyte card and replace it with my reference 290X from AMD next weekend to see if the Gigabyte 290X is causing the performance issues.
> 
> All I want is smooth gameplay (e.g. 4K 2160p in ultra with some settings turned off).
> 
> So far my experience with such a setup has been less than satisfactory.
> 
> I'm considering ditching both the Gigabyte and AMD reference 290X cards and pairing the 295X2 with another XFX 295X2.
> 
> Stay tuned


Prolly a combination of things, but one of them is certainly a CPU bottleneck. Here's mine with two 290s . . .

http://www.3dmark.com/3dm/4644282?

In my experience, my i7 SB has hit its limits with just two 290s. Not gonna even attempt to crossfire the 300 series with this rig, thus I went ahead with X99.


----------



## wermad

Quote:


> Originally Posted by *NBAasDOGG*
> 
> single Seasonic SS-850AM^2 BRONZE 70A
> single SS-850HT SILVER 70A
> single SS-850EM SILVER 70A
> single SS-850KM^3 GOLD 70A
> single SS-1050XM GOLD 87A
> single SS-1250XM GOLD 104A
> single SS-860XP PLATINUM 71A
> single SS-1000XP PLATINUM 83A <<<<<<<<<<<<<<<<<<<<<<<<< I have this one, so the issue is not from PSU
> single SS-1050XP^3 PLATINUM 87A
> single SS-1200XP^3 PLATINUM 100A


Never said the PSU was an issue. You never specified which model, nor did I see your rig specs. I posted the recommended list because I like to ensure owners are well informed on the 295X2's requirements. If you're asking for help, it helps us to know what your specifications are. Post your specs or fill out your rig builder under your settings.


----------



## xer0h0ur

Your card was not defective, as far as the crossfire setting goes. The 295X2 does NOT give you the crossfire on/off setting in the CCC. The only way to disable crossfire is to manually create an application profile within the CCC for whichever application/game you want to disable it for, and set crossfire to disabled within that profile.


----------



## xer0h0ur

There isn't a chance in hell I'm attempting this madness within a micro-ATX case ever again, but it's done.


----------



## wermad

CL time


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> The 295X2 does NOT give you the crossfire on/off setting in the CCC.


That's seriously stupid. Well done AMD.


----------



## Mega Man

Seriously, I don't have problems, so I don't need it.

As to the mATX, I like it. But why not try again?

The S5.... limitless.

My S3 has a 240 Monsta, 240 UT60, and 240 XT45.
But that is a Dell case, so yeah.


----------



## wermad

The S5 is a bit too long imho. The S3 is the right length, but it's ITX.


----------



## xer0h0ur

Spacing, man. Spacing is my entire headache in a micro-ATX form factor, but this case was exceedingly cramped on top of the spacing issues.


----------



## wermad

TT X9, ~170. Up to four 480mm rads. Quality is medium, but you get tons of space.


----------



## caste1200

Quote:


> Originally Posted by *Alex132*
> 
> That's seriously stupid. Well done AMD.


You can disable it in the 3D application settings; you need to add the .exe you want to run with XF disabled.


----------



## caste1200

Quote:


> Originally Posted by *Mega Man*
> 
> Mine did that to. I think it is from a windows update.
> 
> Reinstall both drivers and chipset drivers
> 
> Excluding the mouse problem


Yeah, but you have an AMD system and I'm on Intel. Did you reinstall only your chipset and gfx drivers, or the keyboard and mouse ones as well?


----------



## Alex132

Quote:


> Originally Posted by *caste1200*
> 
> yeah but you have an amd system, im on intel, you only reinstall your chipset drivers and gfx drivers or also keyboard and mouse?


huh

...I have an Intel CPU

Driver-wise (currently) I have Nvidia, soundcard and logitech mouse drivers. That's it.


----------



## caste1200

Quote:


> Originally Posted by *Alex132*
> 
> huh
> 
> ...I have an Intel CPU
> 
> Driver-wise (currently) I have Nvidia, soundcard and logitech mouse drivers. That's it.


ohh sorry wrong quote!


----------



## caste1200

So many problems updating drivers with AMD lately.... I might go for Nvidia next time, once I get my custom loop in my case!


----------



## Mega Man

Quote:


> Originally Posted by *caste1200*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Mine did that to. I think it is from a windows update.
> 
> Reinstall both drivers and chipset drivers
> 
> Excluding the mouse problem
> 
> 
> 
> yeah but you have an amd system, im on intel, you only reinstall your chipset drivers and gfx drivers or also keyboard and mouse?
Click to expand...

Not true.

I also have a 4790 and a 3930.

I was speaking of my 3930.

All I did was the chipset and GPU drivers.


----------



## rdr09

Quote:


> Originally Posted by *caste1200*
> 
> so much problems to update drivers with amd lately.... i might go for nvidia next time once I get my custom loop on my case!


I know. It took me 10 mins to install this latest beta. It used to be just 8 minutes.


----------



## Orivaa

Quote:


> Originally Posted by *rdr09*
> 
> i know. it took me 10 mins to install this latest beta. used to be just 8 minutes.


Unacceptable!


----------



## rdr09

Quote:


> Originally Posted by *Orivaa*
> 
> Unacceptable!


It used to be 8 minutes with Omega and the beta after that . . .

This last beta took longer, prolly because it's a bigger file. That's using an HDD; I just install over the last driver.

Much older drivers had an uninstall feature, which AMD took away from Omega onwards. Using the uninstall feature, it was taking me like 12 minutes.

I don't use DDU.


----------



## xer0h0ur

Potato gotta potato


----------



## blarty

Hi,
Just want to see if anyone has seen the same things I'm seeing with my R9 295X2. Running at stock speeds, I see a lot of GPU clock fluctuation in games, anywhere between 845 and 1018MHz. It's not hitting thermal limits (I only fired up ESO for a couple of minutes after a reboot just to check, and the GPU temp didn't break 50°C in Afterburner). Now, I am running an i7 2700K, so it could well be that. I just wanted to know if these clock fluctuations are normal and/or anything to be worried about?

I'm running Eyefinity across three monitors, and my GPU usage is often 100%, on the latest beta drivers. I've disabled ULPS and turned off PowerPlay support, and it doesn't seem to have made much difference. My PSU is a be quiet! 1000W PowerZone model that the manufacturer has tested and certified with the card. I don't see any artifacts or really any slowdown in game; it's just the clock graph fluctuating quite a lot that's making me wonder.
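For anyone else wondering how the ULPS toggle mentioned above works: it's a per-GPU registry value (`EnableUlps`). A minimal sketch that only builds the `reg.exe` commands rather than touching the registry directly; the display-class GUID is the standard Windows one, but the instance keys (`0000`, `0001`, ...) are assumptions you should verify in regedit before running anything, and back up the registry first:

```python
# Sketch: build reg.exe commands that set EnableUlps=0 for each GPU instance.
# {4d36e968-...} is the standard Windows display-adapter class GUID; the
# instance subkeys (0000, 0001, ...) vary per system, so check regedit first.
DISPLAY_CLASS = (
    r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
    r"\{4d36e968-e325-11ce-bfc1-08002be10318}"
)

def disable_ulps_cmd(instance: str) -> str:
    """Return the reg.exe command that disables ULPS for one GPU instance key."""
    return rf'reg add "{DISPLAY_CLASS}\{instance}" /v EnableUlps /t REG_DWORD /d 0 /f'

# One command per GPU instance (run from an elevated prompt, then reboot):
for inst in ("0000", "0001"):
    print(disable_ulps_cmd(inst))
```

Printing the commands instead of executing them lets you eyeball each key path before committing to the change.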

Cheers


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Potato gotta potato


Really glad you got the trifire setup running, man. I think you did it perfectly. I bet a lot of people didn't notice the top card since it's so close to the giant Monsta. Very ballsy putting 3 GPUs in mATX, and I suspect you're one of a handful of people who've managed it. How is the in-game performance? Does the bigger rad hang onto the case, or is it set up by itself somehow?

I finally sold my 295X2, so I'll be leaving the club. It's a difficult card to work with sometimes, but I still thought it was a beast during final testing after reinstalling the stock cooler.


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> Really glad you got the trifire setup running, man. I think you did it perfectly. I bet a lot of people didn't notice the top card since it's so close to the giant Monsta. Very ballsy putting 3 GPUs in mATX, and I suspect you're one of a handful of people who've managed it. How is the in-game performance? Does the bigger rad hang onto the case, or is it set up by itself somehow?
> 
> I finally sold my 295X2, so I'll be leaving the club. It's a difficult card to work with sometimes, but I still thought it was a beast during final testing after reinstalling the stock cooler.


Thanks for the kind words. The inner radiator has a fan that I scavenged off the original Alienware/Asetek CPU liquid cooler, and that is what is holding the inner 120mm radiator, with the screws for the external Koolance radiator mount. I was literally scratching my head for a while there trying to figure out how the hell I was going to maintain the inner radiator, since I didn't want to have only the 360mm rad. I would have loved an all-fittings C-loop connecting the cards together, but frustration won out on that one, so I said screw it and just connected it whichever way I could with what I had. I am not going to win any awards or anything like that, but it's at least aesthetically pleasing enough to open the side panel and show friends.

As for game performance, I still get games that don't want to cooperate in tri-fire, giving me BSODs trying to load the game, or poor performance. Mostly I would say it's driver related, as I am using the 15.4 beta driver. However, I was playing CS:GO last night with a friend and the performance was flawless. Skyrim is another game I play a lot of, and it works perfectly fine as well. As a matter of fact, the only games giving me problems right now are Dying Light and Tomb Raider. Everything else seems to be okay. I still need to toy around with application profiles for the games that give me problems to see what's up.


----------



## NBAasDOGG

Quote:


> Originally Posted by *xer0h0ur*
> 
> There isn't a chance in hell I am attempting this madness again within a micro ATX case ever again but its done.


WHAT IS THAT.....? WOW, those rads are bigger than your case. How are the temps of your setup, especially with those Noctua Industrial 3000rpm fans?









Btw guys, I think I found a way to raise the target temp (throttle temp) above 75°C using AMD CCC and ArtMoney 7.39, but I'm experimenting to make sure it's safe before I post a little tutorial.


----------



## xer0h0ur

I actually went with the 2000 RPM flavor instead of the 3000 RPM. I knew there wasn't a chance in hell I would even want to run the fans at anywhere past 2000 RPM so it wasn't necessary to get the 3000 RPM units. I am currently running them at only 1300-1400 RPM for the sake of keeping them relatively silent. Idle temps are 37C CPU and 31C across the three GPUs + or - 1C from one to the next. At load CPU temp varies depending on how CPU intensive the game/application is while the GPUs are anywhere from the high 40's to mid 50's. They rarely go past 56C and that is usually the 3rd GPU (290X).

Those temps are running 4.2GHz CPU, 1060MHz GPUs 1500MHz vRAM. Monday I am off so I will finally have all day to toy around with overclocking these suckers further and finding a happy medium between higher overclocks and respectable load temps.


----------



## Alex132

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Btw guys, I think I found a way to raise the target temp (throttle temp) above 75C using AMD CCC and Artmoney 7.39, but I'm experimenting to make sure it's save before I post a little tutorial


Don't do this. Maybe 76-78°C would be okay... but the thermal limit is there to ensure the longevity of the CLC. It's for protection.


----------



## NBAasDOGG

Quote:


> Originally Posted by *Alex132*
> 
> Don't do this. Maybe 76-78'c would be okay... but the thermal limit is there to ensure the longevity of the CLC. It's for protection.


But what about the guys with a custom loop?


----------



## electro2u

Quote:


> Originally Posted by *NBAasDOGG*
> 
> But what about the guys with a custom loop?


Won't be getting anywhere near those temps.


----------



## Synthaxx

Hey,

I'm currently running into a small but annoying issue when playing GTA 5. After a long time playing, the cards suddenly begin to throttle: instead of a fixed 1100/1600, they downclock under full load to a max of 1090MHz, and sometimes even as low as 500MHz. The temperatures are below 50 degrees Celsius, so I know for sure the cards do not have a temperature issue at all. I can actually replicate this issue by just alt-tabbing out of the game and coming back in; then the cards won't run at the stable clocks anymore, as shown by MSI Afterburner. Higher fan RPM doesn't make a difference.
The only way to get back to stable clocks is leaving GTA and starting it up again.

Aside from this issue, playing GTA 5 online at 60fps in 4K is really awesome, especially since I've seen the PS3 GTA 5.


----------



## Alex132

Quote:


> Originally Posted by *Synthaxx*
> 
> Hey,
> 
> I'm currently running into a small but annoying issue when playing GTA 5. After a long time playing, the cards suddenly begin to throttle: instead of a fixed 1100/1600, they downclock under full load to a max of 1090MHz, and sometimes even as low as 500MHz. The temperatures are below 50 degrees Celsius, so I know for sure the cards do not have a temperature issue at all. I can actually replicate this issue by just alt-tabbing out of the game and coming back in; then the cards won't run at the stable clocks anymore, as shown by MSI Afterburner. Higher fan RPM doesn't make a difference.
> The only way to get back to stable clocks is leaving GTA and starting it up again.
> 
> Aside from this issue, playing GTA 5 online at 60fps in 4K is really awesome, especially since I've seen the PS3 GTA 5.


Make sure your waterblock is making proper contact with the VRMs. Try feeling the back of the card to see if they are getting super hot, too.


----------



## Synthaxx

Quote:


> Originally Posted by *Alex132*
> 
> Make sure your waterblock is making proper contact with the VRMs. Try feel the back of the card to see if they are getting super hot too.


The WHOLE card is under Fujipoly Ultra Extreme and the cores are under PK-3. I do have to say that the backplate gets warm, but not scorching hot. I only have this issue in GTA though, and that also doesn't explain the throttling right off the bat if I alt-tab out of the game for a sec.

EDIT: I should also add that I ran 3DMark for a few hours with an OC of 1250/1600 on both GPUs. No throttling there at all :S


----------



## Orivaa

Quote:


> Originally Posted by *Synthaxx*
> 
> The WHOLE card is under fujipoly ultra extreme and the cores under PK-3. I do have to say that the backplate gets warm but not scorching hot. I only have this issue in GTA though, and also that doesn't explain the throttling right of the bat if I alt-tab out of the game for a sec


GTA V is not good at alt-tabbing, as noted by Totalbiscuit. I don't think it matters what setup you use, whether CF or SLI.


----------



## Alex132

Quote:


> Originally Posted by *Synthaxx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Make sure your waterblock is making proper contact with the VRMs. Try feel the back of the card to see if they are getting super hot too.
> 
> 
> 
> The WHOLE card is under fujipoly ultra extreme and the cores under PK-3. I do have to say that the backplate gets warm but not scorching hot. I only have this issue in GTA though, and also that doesn't explain the throttling right of the bat if I alt-tab out of the game for a sec
> 
> EDIT: I should also add that i ran 3Dmark for a few hours with an OC of 1250/1600 on 2gpus. No throttling there at all :S
Click to expand...

Probably just the game then; many people are reporting sporadic errors about random things.


----------



## Feyris

Quote:


> Originally Posted by *Orivaa*
> 
> GTA V is not good at alt-tabbing, as noted by Totalbiscuit. I don't think it matters what setup you use, whether CF or SLI.


100x this. For some reason it throws everything out of whack.


----------



## DanWoodsPcMods

Hey guys, I'm wondering if anyone else has an Ares III and has serious problems with GPU sag, it being a single-slot card and crazy heavy.


----------



## kayan

I bought a Koolance block used on eBay. It came without thermal pads, so what should I get?


----------



## NBAasDOGG

Quote:


> Originally Posted by *DanWoodsPcMods*
> 
> Hey guys I am wondering if anyone else has an ares 3 and has serious problems with gpu sag, being a single slot card and crazy heavy.


One of my friends owns the Ares III, and he indeed complained about his ''single-slot, sagging, super duper heavy GPU''.
His GPU is working perfectly, and a little bit of sag is not a big deal. So no worries.


----------



## Alex132

Quote:


> Originally Posted by *kayan*
> 
> I bought a Koolance block used on eBay. It came without thermal pads, so what should I get?


Anything that isn't awful.

Fujipoly if you have no budget limit; otherwise anything with decent reviews rated around 7 W/mK (or whatever the rating is; 7 is generally high-end enough).


----------



## kayan

Quote:


> Originally Posted by *Alex132*
> 
> Anything that isn't awful.
> 
> Fujipoly if you have no budget limit, otherwise anything with decent reviews and like 7w/mk, or whatever the rating is, 7 is generally high-end enough.


I've got roughly 30 bucks for it. Plan on buying from PPCs.

I was looking at the Fujipoly; lots of choices, but what thickness should I get?


----------



## ramos29

It's been a long time since I last posted here; hope you are all OK.
I wanted to ask about your CPU and GPU temps while playing GTA 5. My GPU (stock clock) never exceeded 50, but in GTA 5 I am getting 61.
My CPU is between 45-50.
I also wanted to ask about Titanfall: I used to run 4K with some anti-aliasing and keep a steady 60fps, but yesterday at 4K without AA I experienced multiple fps drops. For those who own Titanfall, what are your settings and fps?
Many thanks


----------



## xer0h0ur

Quote:


> Originally Posted by *kayan*
> 
> I've got roughly 30 bucks for it. Plan on buying from ppcs.
> 
> I was looking at the fujipoly, lots of choices, but what thickness should I get?


Download the installation manual. Not all blocks use the same pads. My EK blocks used 0.5mm on the vRAM and 1mm / 1.5mm pads on the mosfets and VRMs.
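For context on why pad thickness matters as much as the W/mK rating: the steady-state temperature rise across a pad grows with thickness and shrinks with conductivity and contact area. A toy calculation (all numbers illustrative, not from any block manual):

```python
# Temperature rise across a thermal pad: dT = P * t / (k * A)
# P = heat through the pad (W), t = pad thickness (m),
# k = thermal conductivity (W/mK), A = contact area (m^2).
def pad_delta_t(power_w: float, thickness_m: float,
                k_w_mk: float, area_m2: float) -> float:
    """Steady-state temperature drop across the pad, in kelvin."""
    return power_w * thickness_m / (k_w_mk * area_m2)

# e.g. ~10 W through a 1.0 mm, 7 W/mK pad over ~4 cm^2 of VRM area:
print(round(pad_delta_t(10, 1.0e-3, 7.0, 4.0e-4), 2))  # ~3.57 K
```

Halving the thickness halves the rise, which is why using a thicker pad than the block was designed for costs you a few degrees even at the same W/mK rating.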


----------



## wermad

This. I have the koolance diagram if you need it for this block.


----------



## PriestOfSin

Is the 295X2 worth getting over a 980? Both are similar in price now, and I'm tempted. Are AMD drivers still garbage? I haven't done AMD since I tried 5770 crossfire and was incredibly disappointed.


----------



## wermad

Nvidia drivers haven't been stellar recently, so I've actually switched to AMD. I've had zero driver issues with AMD since 2013 (coming from triple 780s).

To call AMD drivers garbage is borderline fanboy, as Nvidia has had crap drivers too. In my personal experience, both sides may need time for drivers to mature and adapt to new games.

If you plan to game at WQHD/4K/multi-monitor, go with the 295X2. If you plan to game at 1080-1600, 120Hz, 3D, etc., go with Nvidia. I would wait on the 980, as a Ti model or the 390X may cause a price drop.


----------



## xer0h0ur

Seems like the 395X2 is going to drop soon after the 390X if not at the same time. They are going to be pushing VR hard with the 395X2 this go round.


----------



## PriestOfSin

Quote:


> Originally Posted by *wermad*
> 
> Nvidia drivers haven't been stellar recently so I've actually switched to amd. I've had zero driver issues since 2013 (coming from triple 780s) with amd.
> 
> To call amd drivers garbage is borderline fanboy as Nvidia has had crap drivers too. In my personal experience, both sides may need time for drivers to mature and adapt to new games.
> 
> If you plan to game in wqhd/4k/mmg, go with the 295x2. If you plan to game in 1080-1600, 120hz, 3d, etc. go w/ Nvidia. I would wait on the 980 as a ti model or 390x may cause a price drop.


Didn't mean to offend. Basically, I'm trying to put together the most future-proof rig that I can for 1920x1080, since I'm putting my wife through her master's program in the fall. That means no upgrades for 2-3 years.

The 980 Ti will apparently retail for $800 or so, and I'd rather not pay that much. Would you still say to wait for the 980 Ti launch?


----------



## wermad

None taken, I just like to dispel misconceptions.

The launch of the Ti and Pirate Islands may cause the 980 price to drop. It happened when the 780 Ti came out: it caused the 780 to go down. Sounds like the 980 is the better fit for you, but wait for a possible price drop. A couple of 970s is a good choice too. The 290X/290 is another choice for a single card, but get ones with non-reference coolers to better manage the heat.


----------



## deep33

Quote:


> Originally Posted by *PriestOfSin*
> 
> Didn't mean to offend. Basically, I'm trying to put together the most future proofed rig that i can for 1920x1080, since I'm putting my wife through her masters program in the fall. Means no upgrades for 2-3 years.
> 
> The 980ti will apparently retail for $800 or so, and I'd rather not pay that much. Would you still say to wait for the 980ti launch?


Dude... it's a freaking no-brainer. If you want the most "FUTURE PROOFED RIG", you need the R9 295X2, selling for about 660 measly bucks. I'm gonna be watching my card rip through the Titan Z (3000 bucks) and laugh all the way from here to Chinatown while the Nvidia clown club chatters away about bad drivers.


----------



## kayan

Quote:


> Originally Posted by *PriestOfSin*
> 
> Didn't mean to offend. Basically, I'm trying to put together the most future proofed rig that i can for 1920x1080, since I'm putting my wife through her masters program in the fall. Means no upgrades for 2-3 years.
> 
> The 980ti will apparently retail for $800 or so, and I'd rather not pay that much. Would you still say to wait for the 980ti launch?


At 1080, you probably don't need a 295X2; a 290X or 970 would probably be enough. But for the power, the 295X2 is awesome.

What games are you playing/planning to get in the next year or so? That will likely greatly influence your options.

I'm in the same boat as you, with no upgrades for 2-ish years. My wife and I are having our first in October, and I chose the 295X2 as my card.


----------



## deep33

And lo and behold, the Nvidia clown club has paid GPU Boss to post these absolutely FAKE benchmarks.

http://gpuboss.com/graphics-card/Radeon-R9-295X2

No wonder so many clowns exist in the clown club, ready to shell out 3000 bucks for a Titan Z!


----------



## wermad

For some reason, Hawaii, and especially the 295X2, just doesn't seem to perform well at 1080. It really begs to go higher. That's why I'm more inclined to recommend paying the extra dough for Nvidia, as it will do better now and probably later at 1080. If you're going with extreme resolutions (WQHD 1440/1600, 4K, 5K, Eyefinity 3x1/5x1/3x2), Hawaii does a better job now and possibly later. You really have to delve into GM200 for a much better experience vs GM204 at larger resolutions imho.

The only problem I see is that the GTX 980 may come down in price if the Pirate Islands R9 390 is priced the same or cheaper and outperforms it. The natural response from Nvidia would be to drop the price of the 980 somewhere between the $300 290X retail price and the R9 390. I'm guesstimating from here on, but the R9 390X seems like a shoo-in for $650-700, right where the Ti would land (closer to $750).

Edit: just an FYI, if you're going with G-Sync or FreeSync, you'll be locked into the corresponding card to get that benefit from the monitor.


----------



## xer0h0ur

Quote:


> Originally Posted by *wermad*
> 
> None taken, just like to dispell misconceptions.
> 
> The launch of ti and pi may cause the 980 price to drop. It happened when the 780 ti came out, caused the 780 to go down. Sounds like the 980 is the better fit for you but wait for a possible price drop. A couple of 970s is a good choice too. The 290x/290 is a another choice for a single card but get non reference cooler ones to better manage the heat.


IMO the launch of AMD's lineup will cause far more of a shakeup in terms of Nvidia's pricing than any other products Nvidia may release.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> IMO the launch of AMD's lineup will cause far more of a shakeup in terms of Nvidia's pricing than any other products Nvidia may release.


Same old tactics. I got back in the game in 2009 and not much is different. Nvidia always holds the upper hand but AMD proves the better value. Ultimately, drivers were the deciding factor, and up through the HD 6000 series AMD did turn things around while Nvidia slipped (in terms of drivers for both). Now it's down to the horsepower wars and bang per buck. AMD cannot do a Titan X killer. They don't have the time, nor do they want to waste time on a smaller segment knowing how financially fragile the corporation is. AMD has to play it safe, and the same formula works: price your card a bit lower than the competition while being close and sometimes a bit faster. The 390X will be faster than the 980 but not as fast as the Titan X. A 980 Ti will fill the slot between the 980 and the X, where the 390X will swim for a bit. One area AMD really has done great vs Nvidia is dual-GPU cards. The 690 was pretty decent but was not stellar in quad (well, Kepler wasn't that great beyond three cores tbh) and the Titan Z was a disaster. Let's not forget the exploding 590s (I had two of 'em). The 6990 was fast but way too hot. The 7990 got it right but came in very late. The 295X2 was spot on, but AMD was pushing the price (and power) limits imho. A 395X2 may not be as brave as the 295X2 (coming from the 7990) when it comes to pricing, but it may be enough to outshine a 990 (GM202) and any possible but unlikely Z 2.0 (dual GM200). I'm a little uneasy with all the AMD financial issues; discrete GPUs may be something on the fat list to trim off if the company needs to shed some excess to stay afloat. PI may be the last of the AMD cards and we'll have to jump to the green team.


----------



## NBAasDOGG

Quote:


> Originally Posted by *deep33*
> 
> And lo and behold, the Nvidia clown club has paid GPU boss to post these absolutely FAKE benchmarks.
> 
> http://gpuboss.com/graphics-card/Radeon-R9-295X2
> 
> No wonder so many clowns exist in the clown club ready to shell out 3000 bucks for a Titan Z!


Isn't that illegal? In my country you can lose a lot of money by posting fake things about high-end, expensive products.
You'd win the lawsuit within seconds, which means "bye bye NVIDIA and GPUBoss"


----------



## TooManyAlpacas

Jesus, how could those results be true? Maybe if you have two Titan X's, but definitely not one.


----------



## Sgt Bilko

Quote:


> Originally Posted by *deep33*
> 
> And lo and behold, the Nvidia clown club has paid GPU boss to post these absolutely FAKE benchmarks.
> 
> http://gpuboss.com/graphics-card/Radeon-R9-295X2
> 
> No wonder so many clowns exist in the clown club ready to shell out 3000 bucks for a Titan Z!


Looks like they only ran one GPU core tbh.

Rookie mistake.


----------



## xer0h0ur

Quote:


> Originally Posted by *wermad*
> 
> Same old tactics. I got back in the game in 2009 and not much is different. Nvidia always holds the upper hand but AMD proves the better value. Ultimately, drivers were the deciding factor, and up through the HD 6000 series AMD did turn things around while Nvidia slipped (in terms of drivers for both). Now it's down to the horsepower wars and bang per buck. AMD cannot do a Titan X killer. They don't have the time, nor do they want to waste time on a smaller segment knowing how financially fragile the corporation is. AMD has to play it safe, and the same formula works: price your card a bit lower than the competition while being close and sometimes a bit faster. The 390X will be faster than the 980 but not as fast as the Titan X. A 980 Ti will fill the slot between the 980 and the X, where the 390X will swim for a bit. One area AMD really has done great vs Nvidia is dual-GPU cards. The 690 was pretty decent but was not stellar in quad (well, Kepler wasn't that great beyond three cores tbh) and the Titan Z was a disaster. Let's not forget the exploding 590s (I had two of 'em). The 6990 was fast but way too hot. The 7990 got it right but came in very late. The 295X2 was spot on, but AMD was pushing the price (and power) limits imho. A 395X2 may not be as brave as the 295X2 (coming from the 7990) when it comes to pricing, but it may be enough to outshine a 990 (GM202) and any possible but unlikely Z 2.0 (dual GM200). I'm a little uneasy with all the AMD financial issues; discrete GPUs may be something on the fat list to trim off if the company needs to shed some excess to stay afloat. PI may be the last of the AMD cards and we'll have to jump to the green team.


You're ignoring every single sign if you think the 390X's target competition is a piddly 980. The 380X should be able to trade blows with a 980 and beat it. The 390X is without a doubt setting its sights on Titan X territory. Nvidia is currently holding back the 980 Ti and the possibility of a water-cooled Titan X because they have no idea what to expect from AMD's lineup. The fact that reports peg the 980 Ti as the same full-fat chip already in the Titan X proves as much.


----------



## PriestOfSin

Quote:


> Originally Posted by *kayan*
> 
> At 1080, you probably don't need a 295x2, and a 290x or 970 would probably be enough, but for the power the 295x2 is awesome.
> 
> What games are you playing/planning to get in the next year or so? As this will likely greatly influence your options.
> 
> I'm in the same boat as you with no upgrades for 2ish years. My wife and I are having our first in October, and I chose the 295x2 as my card.


I'm looking forward to Killing Floor 2, GTA V, Witcher 3, and Star Citizen.

I can't do a 970; the memory issue is too huge. Games are memory hungry it seems, so anything less than 4GB is unacceptable.


----------



## DividebyZERO

I wonder if Nvidia could do this?


Spoiler: Warning: Spoiler!



So I'd like to share this about the Crossover 44K, although it's potentially specific to AMD GPUs. I decided to play with the PBP 4-way option on the 44K, and hooked up 4 inputs from my 290X quadfire. Thanks to AMD's VSR I was able to up the resolution per input and achieved a 2x2 Eyefinity group above 4K. I can also do 3840x2160 or a wide range of resolutions if I so choose. While it's super demanding on games at that resolution, the scaling is actually not too bad considering what I'm pushing through the Crossover.

I'll post some screenshots to show what I mean exactly. I can also post some pictures from my phone, but it's not the best for photos. Anyways, enjoy.

Game screenshot tests (you can right click and open in new tab for full size.)

Cell phone pics (obviously a cell phone doesn't do it justice, but it's all I got)

As far as screen sync goes, it's perfectly seamless if vsync is on. With no vsync and high FPS there is a slight sync line variance from the top half to the bottom half, middle of the screen - if FPS is over 60 with no vsync, that is.

I might just use this instead of my other Eyefinity setup. I do lose a lot of screen size compared to my 3x 39-inch Seikis, but I get almost triple the 4K resolution on a single 40" 4K@60Hz panel. And as for gameplay it seems really good, although I'll need to test it more.

p.s. I forgot to say I think this will be worth checking out in Windows 10, as Win7 has crappy UI scaling at super high resolutions.


----------



## wermad

Good lord, the huge crosshair of bezels in the middle. Debezel, and there goes your warranty. I would rather get 5x1 WQHD U27s. I miss my old 5x1, very immersive, but you gotta sit back to get all the monitors in your fov.

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're ignoring every single sign if you think the 390X's target competition is a piddly 980. The 380X should be able to trade blows with a 980 and beat it. The 390X is without a doubt setting its sights on Titan X territory. Nvidia is currently holding back the 980 Ti and the possibility of a water cooled Titan X because they have no idea what to expect from AMD's lineup. The fact reports of the 980 Ti being the full fat chip already in the Titan X proves as much.


Knowing what AMD has done in the past to counter Nvidia, and vice versa, and the fact the company is really out on a limb, they're not going to risk money catering to the egos of a few fans, so calm down. There's no Titan X killa from AMD coming this gen. Probably next gen after PI it may wipe the floor with it (à la 290X vs the OG Titan), but that's natural progression. And more than likely there will be a new Titan 3.0 by then. AMD needs products that can sell now. They're not going to build a 390X e-peen edition because they don't need to. They're really pushing the pricing scheme with this 390X w/ AIO, but they had success with the 295X2. Though that success really comes from the little to no competition from Nvidia tbh, and the necessity of liquid cooling vs a massive air system. The last "uber" single-core flagship card was the 4890; there was no 5xxx, 6xxx, 7xxx, or 2xx series of that model. Also, look back at my post and see how I said 390 and not 390X. As I've said a few times, the 390 is probably the direct competitor of the 980 while the 390X will slot between the 980 and Titan X. A Ti would really come in if and only if the 390X does pose a big threat to the Titan X with the typical AMD price advantage. If there's no uber card again this time, we can always look to the AIBs doing some special cards that can do the job (Lightning 390X w/ six DP 1.3).

I wouldn't call the 980 "piddly," as Nvidia really has proven some remarkable things, such as a 256-bit bus handling 4K, putting to rest the dead horse of needing a bigger bus. It seems like a great card, though I'm not happy with the Nvidia price surcharge. Also, Nvidia can magically change numbers to partial numbers.


----------



## ViRuS2k

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Btw guys, I think I found a way to raise the target temp (throttle temp) above 75C using AMD CCC and Artmoney 7.39, but I'm experimenting to make sure it's safe before I post a little tutorial


Please do post the guide to remove the 75C throttle limit; I would love it, and so would custom-cooled cards.
Though that program ArtMoney can only alter games; you would need to alter the card's BIOS and reflash it for the change to stick, otherwise there would be no point.


----------



## DividebyZERO

Quote:


> Originally Posted by *wermad*
> 
> Good lord, the huge crosshair of bezels in the middle. Debezel, and there goes your warranty. I would rather get 5x1 WQHD U27s. I miss my old 5x1, very immersive, but you gotta sit back to get all the monitors in your fov.
> 
> Knowing what AMD has done in the past to counter Nvidia, and vice versa, and the fact the company is really out on a limb, they're not going to risk money catering to the egos of a few fans, so calm down. There's no Titan X killa from AMD coming this gen. Probably next gen after PI it may wipe the floor with it (à la 290X vs the OG Titan), but that's natural progression. And more than likely there will be a new Titan 3.0 by then. AMD needs products that can sell now. They're not going to build a 390X e-peen edition because they don't need to. They're really pushing the pricing scheme with this 390X w/ AIO, but they had success with the 295X2. Though that success really comes from the little to no competition from Nvidia tbh, and the necessity of liquid cooling vs a massive air system. The last "uber" single-core flagship card was the 4890; there was no 5xxx, 6xxx, 7xxx, or 2xx series of that model. Also, look back at my post and see how I said 390 and not 390X. As I've said a few times, the 390 is probably the direct competitor of the 980 while the 390X will slot between the 980 and Titan X. A Ti would really come in if and only if the 390X does pose a big threat to the Titan X with the typical AMD price advantage. If there's no uber card again this time, we can always look to the AIBs doing some special cards that can do the job (Lightning 390X w/ six DP 1.3).
> 
> I wouldn't call the 980 "piddly," as Nvidia really has proven some remarkable things, such as a 256-bit bus handling 4K, putting to rest the dead horse of needing a bigger bus. It seems like a great card, though I'm not happy with the Nvidia price surcharge. Also, Nvidia can magically change numbers to partial numbers.


Are you talking about bezels in regards to my post?

If so, look again: there are no bezels. It's one 40-inch 4K monitor with picture-by-picture x4.


----------



## wermad

In 2x2 monitors? Yeah, I wouldn't want that tbh. I can tolerate them in 3x1 and 5x1, but can't really stand it if it's getting in the way of an fps crosshair.


----------



## xer0h0ur

Honestly, I believe you're far off; the 390X/395X2 are aimed squarely at the highest-end enthusiast market, which is why the 390X is going to come both air cooled and water cooled. The 395X2 will presumably only be water cooled like the 295X2 was. You keep making mention of AMD's current state, yet you're ignoring the fact that everything they create has been in the pipeline for years of development. Their current state doesn't affect something already freakin' made... The core being used in the 380/380X is presumably what was meant to be Hawaii XTX, which was never used. That chip alone should be fairly on par with a stock GTX 980.


----------



## DividebyZERO

Quote:


> Originally Posted by *wermad*
> 
> In 2x2 monitors? yeah, I wouldnt want that tbh. I can tolerate them in 3x1 and 5x1, but can really stand it if its getting in the way of a fps crosshair


It's one 40-inch monitor... with 4 inputs that segment into 4 sections on the panel, with no bezels whatsoever.


----------



## wermad

Companies will cancel initiatives no matter how far along they are in development if necessary, simply to avoid a bigger future loss. Take the small loss now or a big one later? It's a difficult business decision that gets made, especially when a company is facing financial hardship.

The fact the 390X is speculated to be ~$600-750 is more than likely due to the AIO cooler imho. Considering the 295X2 is two 290Xs, using the 7990-from-7970s formula, technically the 295X2 should have been $1100. Why charge $400 for an AIO? Development cost? Licensing? Low-volume surcharge? Probably all of the above. So that's about a 35% increase, and if we apply that to the ~$550 range of AMD's alpha single-core card, we get the rumored $600-750 390X AIO. Tbh, there's too much risk during these times for the company, I say. Play it safe, AMD, to live another day. I wouldn't be crying if there was no 390X e-peen edition to slam the Titan X. I'd rather see AMD continue with discrete GPUs in the future and offer things like a 490X and 495X2.
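The back-of-the-envelope premium math above can be sanity-checked (a sketch; the $1,500 295X2 launch price and the ~$550 single-GPU baseline are taken from the discussion, not official AMD figures):

```python
# Rough check of the AIO "surcharge" argument for 295X2 / 390X pricing.
single_gpu_price = 550.0                   # ballpark 290X-class launch price
naive_dual_price = 2 * single_gpu_price    # the "7990 from two 7970s" formula: $1100
actual_295x2_price = 1500.0                # 295X2 launch MSRP

premium = actual_295x2_price / naive_dual_price - 1.0
print(f"dual-GPU AIO premium: {premium:.0%}")  # ~36%

# Applying the same premium to a ~$550 single-GPU flagship:
implied_390x_price = single_gpu_price * (1.0 + premium)
print(f"implied AIO 390X price: ${implied_390x_price:.0f}")  # ~$750
```

The result lands right at the top of the rumored $600-750 band, which is why the AIO-surcharge reading of the rumor is at least self-consistent.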

Quote:


> Originally Posted by *DividebyZERO*
> 
> its one 40 inch monitor... with 4 inputs that segment into 4 sections on the panel with No bezels whatsoever


Interesting. So is this emulated resolution? edit: reading up on VSR


----------



## deep33

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Isn't that illegal? In my country you may loose a lot of money by posting fake things about high end expensive products.
> You'll win the lawsuit within seconds, which means "bye bye NVIDIA and GPUBOSS"


Yes, definitely deserves a lawsuit!!


----------



## deep33

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Looks like they only ran one GPU core tbh.
> 
> Rookie mistake.


Rookie mistake?? "GPUBoss" made a "rookie" mistake? Don't you think it sounds very odd if you listen to yourself again? And even if they did, it still hasn't occurred to them in all this time to correct it? If someone at GPUBoss knew so li'l about GPU hardware in the first place, he should be fired and start flippin' burgers at Wendy's.

Nothing here looks like rookie innocence. GPUBoss = paid con-men.


----------



## Sgt Bilko

Quote:


> Originally Posted by *deep33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Looks like they only ran one GPU core tbh.
> 
> Rookie mistake.
> 
> 
> 
> Rookie mistake?? "GPUBoss" made a "rookie" mistake? Don't you think it sounds very odd if you listen to yourself again? And even if they did, it still hasn't occurred to them in all this time to correct it? If someone at GPUBoss knew so li'l about GPU hardware in the first place, he should be fired and start flippin' burgers at Wendy's.
> 
> Nothing here looks like rookie innocence. GPUBoss = paid con-men.
Click to expand...

It's still possible it was a mistake that no-one looked at.

I agree it's odd but not out of the realm of possibility.


----------



## kayan

Quote:


> Originally Posted by *PriestOfSin*
> 
> I'm looking forward to killing floor 2, GTAV, Witcher 3, and star citizen.
> 
> I can't do a 970, the memory issue is too huge. Games are memory hungry it seems, so anything less than 4 Is unacceptable.


Well, if you haven't pre-ordered The Witcher 3, then the 980 could be the way to go. Figure, if you're buying the game anyway, that's essentially 60 bucks off the price of the card, since it comes as a free download. So a 290X or 980 sounds like the way to go for you, unless you can hold off a couple/few months and see what the new cards are going to be like.


----------



## wermad

@Priest
970 ain't bad for 1080p. Watched a video of a guy running SoM and memory wasn't an issue (raw power is, though). If you can't really spend too much, the 290 w/ a different cooler like the Tri-X is a good choice. I'm sure you can find one @ $200 or under. It's funny how the 970 vram-gate caused used 290 prices to go up. Avoid the stock 290/290X as it's hella hot and hella noisy. If you want something cheaper, look for a good ol' 7970 or 280X.


----------



## NBAasDOGG

Quote:


> Originally Posted by *ViRuS2k*
> 
> Please do post the guide to remove the 75C throttle limit; I would love it, and so would custom-cooled cards.
> Though that program ArtMoney can only alter games; you would need to alter the card's BIOS and reflash it for it to stick, otherwise there would be no point.


ArtMoney also works with other software, including CCC, MSI Afterburner, or even Internet Explorer.
I will post a tutorial very soon.


----------



## DividebyZERO

Quote:


> Originally Posted by *wermad*
> 
> Companies will cancel initiatives no matter how far along they are in development if necessary, simply to avoid a bigger future loss. Take the small loss now or a big one later? It's a difficult business decision that gets made, especially when a company is facing financial hardship.
> 
> The fact the 390X is speculated to be ~$600-750 is more than likely due to the AIO cooler imho. Considering the 295X2 is two 290Xs, using the 7990-from-7970s formula, technically the 295X2 should have been $1100. Why charge $400 for an AIO? Development cost? Licensing? Low-volume surcharge? Probably all of the above. So that's about a 35% increase, and if we apply that to the ~$550 range of AMD's alpha single-core card, we get the rumored $600-750 390X AIO. Tbh, there's too much risk during these times for the company, I say. Play it safe, AMD, to live another day. I wouldn't be crying if there was no 390X e-peen edition to slam the Titan X. I'd rather see AMD continue with discrete GPUs in the future and offer things like a 490X and 495X2.
> 
> Interesting. So is this emulated resolution? edit: reading up on VSR


Maybe this will help as well, i forgot to attach these pics originally and somehow missed it.

http://www.overclock.net/t/1549360/crossover-44k-uhd-led-40-inch-monitor/420#post_23818932


----------



## xer0h0ur

Gentlemen, just as a warning to you guys about raising the temperature limit on the 295X2, you should know that the reason the temp limit is 75C is not because of what the cores themselves can handle so much as the temperature the Asetek cooler can handle. Going above 75C over prolonged use is going to kill those pumps.

And on that note, if someone does murder their pumps I have a spare unit here collecting dust


----------



## Feyris

Quote:


> Originally Posted by *xer0h0ur*
> 
> Gentlemen, just as a warning to you guys about raising the temperature limit on the 295X2, you should know that the reason the temp limit is 75C is not because of what the cores themselves can handle so much as the temperature the Asetek cooler can handle. Going above 75C over prolonged use is going to kill those pumps.


Not only that, but if you're reaching 75C on a custom loop, you have other problems. And there's no need to raise the limit, since you could just go push/pull, or run a single fan that's better than the stock fan if you're sticking with the CLC.


----------



## xer0h0ur

Quote:


> Originally Posted by *Feyris*
> 
> Not only that, but if you're reaching 75C on a custom loop, you have other problems. And there's no need to raise the limit, since you could just go push/pull, or run a single fan that's better than the stock fan if you're sticking with the CLC.


Agreed on the custom loop comment. That's why I ended up upgrading and revamping my loop upon adding a block to the 290X, since I already knew I needed more cooling capacity.


----------



## wermad

Isn't the pump similar to the AIOs that are IB and Haswell ready? Those non-delidded chips can easily go over 75C once you crank up the clocks.


----------



## xer0h0ur

The problem is the water temp. You must remember that this is 500W of heat.
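To put "500W of heat" in perspective, here's a first-order estimate of the loop's behavior (a sketch; the ~1 L/min pump flow is a typical AIO figure I'm assuming, not a measured 295X2 spec):

```python
# Steady-state coolant temperature rise across the cold plates for ~500W.
heat_w = 500.0            # combined GPU heat dumped into the loop, watts
flow_l_per_min = 1.0      # typical AIO pump flow rate (assumed)
cp_water = 4186.0         # specific heat of water, J/(kg*K)

mass_flow = flow_l_per_min / 60.0   # kg/s (1 L of water ~= 1 kg)
delta_t = heat_w / (mass_flow * cp_water)
print(f"water heats up ~{delta_t:.1f} C per pass through the blocks")  # ~7.2 C
```

And that's just the per-pass rise; the single 120mm radiator still has to reject the full 500W to air, so the bulk water temperature floats well above ambient under load. That's why the 75C limit is about the cooler, not the silicon.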


----------



## deep33

Quote:


> Originally Posted by *kayan*
> 
> At 1080, you probably don't need a 295x2, and a 290x or 970 would probably be enough, but for the power the 295x2 is awesome.
> 
> What games are you playing/planning to get in the next year or so? As this will likely greatly influence your options.
> 
> I'm in the same boat as you with no upgrades for 2ish years. My wife and I are having our first in October, and I chose the 295x2 as my card.


Have you ever played at 4K? It's an incredible letdown going back to 1080p after playing at 4K. Hence I suggest the 295X2 (perfect for 4K), the perfect card within affordability for future-proofing. In the near future, everything is going to head toward the 4K realm. Considering that, why would a guy still dump more than what the 295X2 costs on a GTX 980?

I was able to pick up a Samsung 4K monitor on sale for $450 and have been playing The Witcher 2, Skyrim and Mass Effect 3 with my 295X2. Never going back to 1080p. It is truly glorious.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> The problem is the water temp. You must remember that this is 500W of heat.


Mostly bingo here. I suspect it's more a concern about pressure, which rises with the temperature of the coolant. Tygon tubing can go to 135C, but that would drive the pressure high enough to blow the seals. This happens to people with AIOs sometimes.


----------



## Roxycon

So, how's your experience with the 295X2 at 4K, guys?

I want to swap my main monitor (VA, 1080p/60Hz) for a 40-inch 4K/60Hz monitor, as gaming is one of my last concerns as of late, and when I do game it's just some casual Borderlands or a car game. The AMD drivers are a bit crap with a triple-monitor setup and Eyefinity; if they weren't, I would not consider the 4K, but..


----------



## xer0h0ur

Been playing Skyrim, CS:GO, and Postal 2: Paradise Lost lately and my 4K performance doesn't leave anything to be desired. Everything cranked up high as it can go and still have horsepower left over. I am waiting it out to play Dying Light and GTA V. I prefer to let the drivers mature and developers patch their games since nothing is ever launched bug free anymore.


----------



## dndfm

Hi, I am having issues understanding the VRAM situation on my 295X2. Let me explain:
all the people I know say that you won't double the VRAM with SLI/Crossfire *even* if both GPUs are on the same PCB,
yet GTA V and other games and benchmarks tell me that I have 8GB of memory. So does the card trick the benchmarks,
or am I surrounded by idiots?


----------



## Roxycon

So you would say it is a-okay?

The monitor setup will be like this; I think it's the most optimal setup I can get considering I'm getting a job in web development. Fell in love with horizontal screens at the last LAN I attended.


----------



## xer0h0ur

Quote:


> Originally Posted by *dndfm*
> 
> Hi, I am having issues understanding the VRAM situation on my 295X2. Let me explain:
> all the people I know say that you won't double the VRAM with SLI/Crossfire *even* if both GPUs are on the same PCB,
> yet GTA V and other games and benchmarks tell me that I have 8GB of memory. So does the card trick the benchmarks,
> or am I surrounded by idiots?


If you are viewing it through Afterburner's OSD, it's showing you the unified usage. That does not mean your vRAM is stacking. vRAM in Crossfire or SLI is mirrored, so each GPU is using its own half of the total amount.

For a game to use the full amount of vRAM without mirroring it, the game needs to be programmed to do so using SFR (split frame rendering), and no developer has tackled this yet.
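The mirroring described above can be sketched in a few lines (a toy model of AFR mirroring, not AMD's actual allocator):

```python
# Toy model of AFR vRAM mirroring on a dual-GPU card like the 295X2.
NUM_GPUS = 2
VRAM_PER_GPU_GB = 4.0

def report_vram(unified_osd: bool) -> float:
    """What a monitoring tool may display vs what a frame can actually address."""
    if unified_osd:
        # Tools that sum per-GPU usage can show up to 2 x 4 = 8 GB "in use".
        return NUM_GPUS * VRAM_PER_GPU_GB
    # With AFR, every GPU holds a full mirror of the working set,
    # so the usable pool per frame is just one GPU's memory.
    return VRAM_PER_GPU_GB

print(report_vram(unified_osd=True))   # 8.0 -> what GTA V's meter shows
print(report_vram(unified_osd=False))  # 4.0 -> what one frame can use
```

Same card, same memory chips; only the bookkeeping differs, which is why in-game meters and benchmarks "see" 8GB.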


----------



## xer0h0ur

Quote:


> Originally Posted by *Roxycon*
> 
> So you would say it is a-okay?
> 
> The monitor setup will be like this; I think it's the most optimal setup I can get considering I'm getting a job in web development. Fell in love with horizontal screens at the last LAN I attended.


That changes everything. I can not give you any performance opinion on multi-monitor gaming as I am using a single 28" 4K monitor.


----------



## Roxycon

Quote:


> Originally Posted by *xer0h0ur*
> 
> That changes everything. I can not give you any performance opinion on multi-monitor gaming as I am using a single 28" 4K monitor.


I'm only gaming on the middle monitor; the three others have been peripheral monitors ever since I changed to the red team, because of the drivers, as I said.


----------



## wermad

Quote:


> Originally Posted by *Roxycon*
> 
> So you would say it is a-okay?
> 
> The monitor setup will be like this; I think it's the most optimal setup I can get considering I'm getting a job in web development. Fell in love with horizontal screens at the last LAN I attended.


3+1 setup in Eyefinity for gaming? DP is your friend with Eyefinity.


----------



## Roxycon

Quote:


> Originally Posted by *wermad*
> 
> 3+1 setup in Eyefinity for gaming? DP is your friend with Eyefinity.


Haha, DP and mDP are a curse of a standard when you have aging screens: hard/impossible to troubleshoot, adapters are pricey in Norway, boot screen on only two monitors, plus the bulkiness of said adapters..

Pre-RMA, my 295X2 registered every HDMI-connected screen as a TV and not a monitor, taking away about 20% of the usable pixels too.


----------



## xer0h0ur

Quote:


> Originally Posted by *Roxycon*
> 
> Haha, DP and mDP are a curse of a standard when you have aging screens: hard/impossible to troubleshoot, adapters are pricey in Norway, boot screen on only two monitors, plus the bulkiness of said adapters..
> 
> Pre-RMA, my 295X2 registered every HDMI-connected screen as a TV and not a monitor, *taking away about 20% of the usable pixels too*


You know, I encountered this exact same thing when I put together my cousin's six-monitor 1080p trading-station setup. The only screens using the full panel were the ones connected: Monitor ---> HDMI cable ---> Active DisplayPort adapter ---> DisplayPort on his 290X and 270. The other four screens, connected HDMI-to-HDMI or HDMI-to-DVI, had a black border making the effective screen size smaller. We had to change the monitors' resolutions through CCC to resolve this.
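Roxycon's "about 20% of the usable pixels" figure lines up with a typical TV overscan of roughly 5% per edge, which is what drivers apply when they misdetect a monitor as a TV (a rough sketch; the exact overscan percentage varies by mode):

```python
# Estimate pixel loss when a GPU applies TV-style overscan to a monitor.
width, height = 1920, 1080
overscan_per_edge = 0.05   # 5% cropped from each edge (assumed typical)

visible_w = width * (1 - 2 * overscan_per_edge)
visible_h = height * (1 - 2 * overscan_per_edge)
loss = 1 - (visible_w * visible_h) / (width * height)
print(f"usable area lost to overscan: {loss:.0%}")  # ~19%
```

Because the shrink applies to both axes, a modest 5% crop per edge compounds to nearly a fifth of the screen area, which matches the black border both of us saw.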


----------



## Roxycon

Quote:


> Originally Posted by *xer0h0ur*
> 
> You know I encountered this exact same thing when I put together my cousin's 6-1080p monitor trading station setup. The only screens that were using the full screen were the screens connected: Monitor ---> HDMI cable ---> Active DisplayPort adapter ---> DisplayPort port on his 290X and 270. The other four screens connected through HDMI to HDMI or HDMI to DVI had a black border making the effective screen size smaller. We had to change the resolutions of the monitors through the CCC to resolve this.


I had to RMA the card; it turned out to be some faulty VRAM. The HDMI was the least of my issues though: it was BSODing almost once an hour for about two weeks, with the BSODs pointing me at multiple completely different parts of the system, both software and hardware.

AMD has given me a long and hard welcome.


----------



## xer0h0ur

Well for what its worth my cousin's PC was bought at Micro Center, basically the most powerful PowerSpec they offer. Its pretty much all Corsair and MSI branded parts. His system came with a defective MSI Gaming R9 290X that never made it past its first boot without BSODing upon driver installation. MicroCenter swapped out the video card for him with a Diamond R9 290X and everything was fine ever since. Crap happens.


----------



## Roxycon

Yeah, hopefully it's all perfect now with the new unit. But I'm not going to dive into the many problems of Eyefinity just yet, so a main monitor upgrade is in order, I think.


----------



## cmoney408

I wondered the same thing. Obviously he hasn't played it, because he would know what you were talking about.

I am curious too: why does the GTA V in-game settings menu show all 8GB of VRAM as usable? As you increase settings, it actively shows you are using XXXXmb / 8000mb

(XXXX will change depending on the settings; the max doesn't show 8000, but like 7980 or something similar.)

If you figure this out, let me know!


----------



## xer0h0ur

For christ's sake I just explained it... its showing you unified vRAM usage.


----------



## mojobear

Quote:


> Originally Posted by *xer0h0ur*
> 
> For christ's sake I just explained it... its showing you unified vRAM usage.


You know what's weird? If you choose Very High textures etc., you get about 3+ GB of vRAM usage @ 1080p. I run the game at 7680x1440 @ Very High, and given how much vRAM is used at 1080p, there should be no way I can run the game at my 1440p Eyefinity settings, since I have 4GB vRAM cards. The settings do say I am using 13GB of vRAM. At those settings I get none of the stutter associated with vRAM limits... weird.
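One plausible reading of those numbers: only render targets scale with pixel count, while textures and geometry stay fixed, so an in-game meter that scales everything with resolution will wildly overstate real usage (a sketch; the 0.5 GB resolution-dependent split below is a guessed assumption, not a measurement):

```python
# Why a naive "vRAM scales with resolution" estimate overshoots.
base_res = 1920 * 1080
big_res = 7680 * 1440
ratio = big_res / base_res
print(f"pixel count ratio: {ratio:.2f}x")   # ~5.33x

vram_at_1080p_gb = 3.0
# Assumed split: ~0.5 GB of the 1080p usage is resolution-dependent
# render targets; the rest is textures/geometry that do NOT grow.
render_targets_gb = 0.5
fixed_gb = vram_at_1080p_gb - render_targets_gb

naive_estimate = vram_at_1080p_gb * ratio               # ~16 GB (crude meter logic)
better_estimate = fixed_gb + render_targets_gb * ratio  # ~5.2 GB
print(f"naive: {naive_estimate:.1f} GB, targets-only scaling: {better_estimate:.1f} GB")
```

A meter using the naive logic lands in "13 GB" territory while the actual working set stays far smaller, which would explain the absence of vRAM-limit stutter.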


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> You know I encountered this exact same thing when I put together my cousin's 6-1080p monitor trading station setup. The only screens that were using the full screen were the screens connected: Monitor ---> HDMI cable ---> Active DisplayPort adapter ---> DisplayPort port on his 290X and 270. The other four screens connected through HDMI to HDMI or HDMI to DVI had a black border making the effective screen size smaller. We had to change the resolutions of the monitors through the CCC to resolve this.


After toying with my 44K Crossover and hooking up 4 various inputs, it took some work to get everything fullscreen and scaled properly. I'm wondering if it's a different issue when using all DP, but probably not.

I think the issue comes down to EDID and monitor identification over the various connections. For instance, my 44K detected over HDMI showed different resolutions than over DVI or DP. DP obviously has a higher pixel clock/more bandwidth, so it saw 4K@60Hz while HDMI saw 4K@30Hz. I know AMD used to say different panels didn't matter much, but I think they do, along with connectivity. To get proper uniformity for my 44K over 4 different inputs I had to use ToastyX's CRU and match the top-end resolutions and refresh rates to my lowest-capability input. Now, I do not know if this will apply to the 295X2 if you're using all-DP connections; chances are all DP inputs see the same EDID. There is probably some Windows interference with EDID information as well. ToastyX is the guy to ask.

An example of an issue I had with 3x1 4K Eyefinity (Seiki/Crossover/Seiki): the Crossover does 4K@60Hz, while the Seiki 39 tops out at 4K@30Hz. When I enabled Eyefinity I got mismatched refresh rates across the three screens instead of a uniform 4K@30Hz, which would have been correct, so I had to remove 60Hz from the Crossover and drop it to 30Hz. In that case it was just a refresh rate issue, and rightfully so, since I was mixing multiple connection types.

With my Crossover, doing 4 inputs @ 1080p was a chore because it was picking up different EDID/resolution sets. So I edited the DP EDID and removed the 4K resolution, since in my case I wasn't using it. I also had to tamper with screen scaling and GPU scaling; sadly, I do not remember the order I did it in.

Also note that even without Eyefinity, I'm fairly certain I had to adjust scaling options to get fullscreen per input.

If you're not going to use CRU, at least look at the third CCC shot below and click the bar to see if each monitor is identified with its max resolution/refresh rate.
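The DP-vs-HDMI refresh gap described above comes down to link bandwidth: DisplayPort 1.2 carries roughly 17.28 Gbps of payload, while HDMI 1.4 tops out around 8.16 Gbps, and 4K@60Hz needs more than the latter. A quick back-of-the-envelope check (my numbers, not from the thread; it counts active pixels only and ignores blanking intervals, so real pixel clocks run somewhat higher):

```python
# Approximate uncompressed video bandwidth: width * height * refresh * bits-per-pixel.
# Ignores blanking intervals, so this is a lower bound on the real link requirement.
def video_bandwidth_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

dp12_gbps = 17.28   # DisplayPort 1.2 effective payload (4 lanes x 5.4 Gbps, 8b/10b)
hdmi14_gbps = 8.16  # HDMI 1.4 effective payload (10.2 Gbps TMDS, 8b/10b)

need_60 = video_bandwidth_gbps(3840, 2160, 60)  # ~11.94 Gbps
need_30 = video_bandwidth_gbps(3840, 2160, 30)  # ~5.97 Gbps

print(need_60 < dp12_gbps)    # True: DP 1.2 can carry 4K@60
print(need_60 < hdmi14_gbps)  # False: HDMI 1.4 cannot, hence the EDID offers 30Hz
print(need_30 < hdmi14_gbps)  # True: 4K@30 fits over HDMI 1.4
```

So the monitor isn't lying: the EDID it presents per input simply reflects what that link can actually carry.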


----------



## Mega Man

Quote:


> Originally Posted by *deep33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kayan*
> 
> At 1080, you probably don't need a 295x2, and a 290x or 970 would probably be enough, but for the power the 295x2 is awesome.
> 
> What games are you playing/planning to get in the next year or so? As this will likely greatly influence your options.
> 
> I'm in the same boat as you with no upgrades for 2ish years. My wife and I are having our first in October, and I chose the 295x2 as my card.
> 
> 
> 
> Have you ever played at 4K resolution? It's an incredible letdown going back to 1080p after playing at 4K. Hence, I suggest the 295X2 (perfect for 4K) as the perfect card within affordability for future-proofing. In the near future, everything is going to head toward the 4K realm. Considering what I just said, why would a guy still dump more than what the 295X2 costs on a GTX 980?
> 
> I was able to pick up a Samsung 4K monitor on sale for $450 and have been playing The Witcher 2, Skyrim and Mass Effect 3 with my 295X2. Never ever going back to 1080p. It is truly glorious.
Click to expand...

They said the same thing about 1440p and it still isn't a standard. How long has it been out?

I prefer 144Hz, though; soon I will get 3 or 6 1440p 144Hz screens
Quote:


> Originally Posted by *dndfm*
> 
> Hi, I am having issues understanding the vRAM situation on my 295X2, let me explain:
> all the people I know say that you won't double the vRAM with SLI/CrossFire *even* if both GPUs are on the same PCB,
> yet GTA V and other games and benchmarks tell me that I have 8GB of memory, so does the card trick the benchmarks,
> or am I surrounded by idiots?


See last response ( on mobile and can't edit easily )
Quote:


> Originally Posted by *Roxycon*
> 
> So you would say it is okay?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The monitor setup will be like this; I think it's the most optimal setup I can get considering I'm getting a job in web development. I fell in love with horizontal screens at the last LAN I attended.


Make sure the horizontal monitors are IPS.
Quote:


> Originally Posted by *xer0h0ur*
> 
> For Christ's sake, I just explained it... it's showing you unified vRAM usage.


He was correct. Mine shows 16GB since I have quadfire (4x 4GB).

You have to divide the reported RAM by the number of GPUs you have
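The divide-by-GPU-count rule above can be written in one line (just a sketch of the rule of thumb, not anything the driver actually exposes): CrossFire/SLI mirror the same assets into each GPU's memory, so the summed figure a game reports overstates the usable pool.

```python
# Games sum vRAM across all GPUs, but CrossFire/SLI mirror the frame data,
# so the usable pool is only the per-GPU amount.
def effective_vram_gb(reported_gb: float, num_gpus: int) -> float:
    """Divide the reported (summed) vRAM by the GPU count."""
    return reported_gb / num_gpus

print(effective_vram_gb(8, 2))   # R9 295X2: reports 8 GB, usable 4 GB
print(effective_vram_gb(16, 4))  # quadfire 4x 4 GB: reports 16 GB, usable 4 GB
```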


----------



## wermad

@roxy

That's some bad luck; I luv DP. It worked wonders for my 5x1 1200p setup. Most issues stem from the monitor, tbh. I've only run the single 4K screen with my 295X2 (I ran quad 7970 Lightnings with the 5x1 setup), so I can't comment on the MMG capabilities. The more adapters you add, the more trouble you're asking for, imho. I always recommend getting monitors with native DP to make setup easy.


----------



## Roxycon

@Mega Man do they have to? I only got VA panels


----------



## DividebyZERO

Quote:


> Originally Posted by *wermad*
> 
> @roxy
> 
> That's some bad luck, I luv dp. It worked wonders for my 5x1 1200 setup. Most issues steam from the monitor tbh. I've only ran the single 4k screen w/ my 295x (ran quad 7970 lightnings w/ the 5x1 setup) so I can't comment on the mmg capabilities. The more adapters you're adding, the trouble youre ask for imho. I always recommend getting monitors with native dp to make it easy for setup.


Since DP is the superior connectivity for monitors, given 4K and the extra bandwidth and all, it makes sense it would be the only real option. That said, I'm willing to bet that if there were a video card with, say, all HDMI 2.0 connections, it would probably work just as seamlessly. I can vouch for this, though: using multiple inputs I get ZERO screen tearing unless vsync is off. I'm sure if I had a true 120+Hz 4K screen (assuming it existed), tearing would be even more nonexistent. I guess I'm saying that, without DP, screen tear only seems to happen if you're above the refresh rate without vsync. I seem to recall that not being the case back in the 7970 days?


----------



## steezebe

I won't go to 4K until they get the color accuracy improved while operating at a native 60Hz. The few 60Hz 4Ks I've seen, or even the 120Hz 1440s, have such washed-out and flat color it almost offends. 720p with proper AA looks better than some of the 4Ks out there. I want some 4:4:4 chroma with 95% CIE 1931 coverage, baby!!!

Until panel quality improves, I'd rather spend the GPU power on AA processing than on jacking up the raw pixel count for my gaming and 3D rendering applications. It just looks better.

Standing by for that 30in+ WQHD OLED....


----------



## xer0h0ur

Well 60Hz @ 4K is merely a stepping stone much like 30Hz was as well. 120Hz is up to bat now with DisplayPort 1.3 and HDMI 2.0.


----------



## steezebe

Oh, you're absolutely correct. DP can handle plenty, that's for sure, but the bottleneck isn't at the data link; a good display is still the limiting factor. The 295X2 is so nice because it has the power to do quality AA at lower res, instead of worrying about raw pixel count.


----------



## rdr09

Quote:


> Originally Posted by *steezebe*
> 
> I won't go to the 4k until they get the color accuracy improved while operating at a native 60Hz. The few 60Hz 4k's I've seen, or even the 120Hz 1440's, have such washed-out and flat color it almost offends. 720p with proper AA looks better than some of the 4k's out there. I want some 4.4.4 Chroma with 95% CIE1931 baby!!!
> 
> Until the panel quality improves, I'd prefer to use the GPU power for AA processing than jacking up the raw pixel count for my gaming and 3D rendering applications. It just looks better.
> 
> Standing by for that 30in+ WQHD OLED....


nah, go 4K. does this look washed-out . . .


----------



## xer0h0ur

Dude, I went from a 20" Samsung 204B to a 28" UD590, lmao. I love my 4K panel, even if it's TN instead of IPS.


----------



## wermad

I was a bit skeptical, mostly due to the TN panel, but I have to say it's a gem. The scaling is a bit weird (Win7) sometimes, but most of my stuff works great. I really can't tell much difference in some settings for games. Definitely awesome high res without going MMG.

Keep an eye out for used Sammys or the Acer on sale. I'm sure by next year things will be a bit more affordable. 28" is pretty big for me, albeit I do have limited desk space. A 32" 4K 120Hz IPS monitor would be e-pr0n


----------



## Mega Man

Quote:


> Originally Posted by *Roxycon*
> 
> @Mega Man do they have to? i only got VA panels


Not at all (TN?). But IPS has a far superior viewing angle in comparison


----------



## Alex132

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *steezebe*
> 
> I won't go to the 4k until they get the color accuracy improved while operating at a native 60Hz. The few 60Hz 4k's I've seen, or even the 120Hz 1440's, have such washed-out and flat color it almost offends. 720p with proper AA looks better than some of the 4k's out there. I want some 4.4.4 Chroma with 95% CIE1931 baby!!!
> 
> Until the panel quality improves, I'd prefer to use the GPU power for AA processing than jacking up the raw pixel count for my gaming and 3D rendering applications. It just looks better.
> 
> Standing by for that 30in+ WQHD OLED....
> 
> 
> 
> nah, go 4K. does this look washed-out . . .
Click to expand...

No. That's because I am viewing it on my 1440p IPS panel









I'm 100% certain he was talking about the panels and not the content.

To be honest, I'd rather have a 1440p IPS panel than a 4k TN panel. Both 60hz.

edit- that image is over-saturated, probably done on purpose because it's promo material :/


----------



## wermad

Switching from 1440p to 2160p is like switching from 720p to 1080p... there's a noticeable difference in my experience. Almost like 1080p to 1440p. 4K is really the best alternative to MMG setups. The Samsung TN panel is very nice and doesn't have the matte finish of my old U2412s. Once these 28" TN 4K 60Hz monitors hit the ~$300 range, more and more folks will migrate to 2160p, imho.


----------



## dndfm

Yeah, but dude, I have seen benchmarks showing 1080p gameplay in GTA V using almost the entire 4GB, and the Titan X using almost 6GB at 4K (alone, no SLI), and at the same settings my 295X2 has better 4K performance than the Titan X, which means the card is not limited by the vRAM, so I still don't understand....


----------



## steezebe

Quote:


> Originally Posted by *rdr09*
> 
> nah, go 4K. does this look washed-out . . .


Um... I'm talking about the physical limitations of a panel. A screenshot is just the data, which says nothing about how it looks in person.

It's like speakers: you can get floor-to-ceiling cabinets with poop-quality drivers in them that sound 'meh,' or two amazing drivers in a small cabinet for the same price that sound massively better.

Quote:


> Originally Posted by *Alex132*
> 
> No. That' because I am viewing in on my 1440p IPS panel
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm 100% certain he was talking about the panels and not the content.
> 
> To be honest, I'd rather have a 1440p IPS panel than a 4k TN panel. Both 60hz.
> 
> edit- that image is over-saturated, probably done on purpose because it's promo material :/


Thank you @Alex132 I was starting to think I was being unclear.


----------



## rdr09

Quote:


> Originally Posted by *steezebe*
> 
> Um... I'm talking about the physical limitations of a panel. A screen shoot is just the data, which says nothing about how it looks in person.
> 
> It's like speakers: you can get floor to ceiling cabinets with poop quality drivers in them that sound 'meh,' or two amazing drivers in a small cabinet for the same price that sound massively better.
> 
> Thank you @Alex132
> I was starting to think I was being unclear.


Actually, screenshots do not do justice to 4K res; it is way better than that image.

If I use Print Screen in Windows, the file size is around 24MB. I use Raptr, which takes it down to 1.4MB, and that could be sacrificing how the real thing looks.


----------



## DividebyZERO

Quote:


> Originally Posted by *rdr09*
> 
> actually, screen shots do not give justice to 4K rez. it is way better than that image.
> 
> if i am to use print screen in windows the file size would be around 24MB. i use raptr which takes it down to 1.4MB and that could be sacrificing how the real thing looks.


I don't think you can persuade him; he basically said 720p with AA looks better than 4K......


----------



## electro2u

Quote:


> Originally Posted by *Mega Man*
> 
> not at all ( tn?) But ips has a far superior viewing angle in comparison


VA panels have the same specified viewing angles as IPS: 178/178. They aren't exactly the same, but they both offer better off-center viewing than TN.


----------



## steezebe

Quote:


> Originally Posted by *DividebyZERO*
> 
> I don't think you can persuade him, he basically said 720p with AA looks better than 4k......


For high-frame-rate gaming, yeah. A static image will look pretty on a 4K, but we don't buy 295s to game at 1 fps, and for me, immersion in games is greater when the image is accurately represented.

I build collimated-projector, 240° FOV wraparound visual systems for full-motion flight simulators; every customer prefers the lower-resolution, higher-contrast displays because they look better than their higher-resolution counterparts.


----------



## Mega Man

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> not at all ( tn?) But ips has a far superior viewing angle in comparison
> 
> 
> 
> VA panels have the same specified viewing angles as IPS. 178/178. They arent exactly the same but they both offer better off center viewing than TN.
Click to expand...

Ah, tyvm. I stand corrected. I looked into monitors but missed those panels...


----------



## rdr09

Quote:


> Originally Posted by *steezebe*
> 
> For high-frame-rate gaming, yeah. A static image will look pretty on a 4K, but we don't buy 295s to game at 1 fps, and for me, immersion in games is greater when the image is accurately represented.
> 
> I build collimated-projector, 240° FOV wraparound visual systems for full-motion flight simulators; every customer prefers the lower-resolution, higher-contrast displays because they look better than their higher-resolution counterparts.


Why would you get 1 fps with a 295? I might be missing something. I only have 2 290s to drive a 4K; I don't even have to OC them.









Anyways, enjoy whatever res you like.


----------



## DividebyZERO

I think the issue at 720p would be the CPU with a 295X2; if you're looking for max fps at 720p, wouldn't Nvidia be a better option?

You said originally that color at 4K was the issue, not 500 million fps. I am not trying to argue, I'm just trying to understand.


----------



## Alex132

Quote:


> Originally Posted by *rdr09*
> 
> actually, screen shots do not give justice to 4K rez. it is way better than that image.
> 
> if i am to use print screen in windows the file size would be around 24MB. i use raptr which takes it down to 1.4MB and that could be sacrificing how the real thing looks.












I was stating that you cannot take a screenshot of an image to show how well your panel displays it









I will only jump to 4k when there is an affordable IPS 4k monitor with no input latency issues and hopefully free-sync / g-sync support. And when I have money to do so, of course.


----------



## rdr09

Quote:


> Originally Posted by *Alex132*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was stating that you cannot take a screenshot of an image to show how well your panel displays it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will only jump to 4k when there is an affordable IPS 4k monitor with no input latency issues and hopefully free-sync / g-sync support. And when I have money to do so, of course.


My bad. My 4K is TN and I also have a 1080p that is TN. Somehow the 4K does not suffer the viewing-angle issue the 1080p does; maybe newer TNs suffer less, or the 4K itself makes the difference.


----------



## Alex132

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was stating that you cannot take a screenshot of an image to show how well your panel displays it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will only jump to 4k when there is an affordable IPS 4k monitor with no input latency issues and hopefully free-sync / g-sync support. And when I have money to do so, of course.
> 
> 
> 
> my bad. my 4K is TN and I also have a 1080 that is a TN. somehow the 4K does not suffer the issue with viewing angle as the 1080. maybe the newer TNs suffer less or the 4K itself makes the difference.
Click to expand...

TN technology is so vast, it's hard to compare one to another. I mean, my old TN panel from 2004 was so crap compared to my 2008 Samsung TN panel.

But comparing either to my IPS panel - I honestly wondered how I managed to use them.

While I'm not saying the TN of your 4K panel is bad (especially if it's actually glossy and not matte; god, I hate matte displays), I'm just saying it won't stand up to a proper IPS 1440p panel. And to me, color is probably one of the most important aspects of a monitor, not just for videos but for gaming too; more important than 120Hz, IMO. Heh, I remember it took me about 2-3 weeks to stop being absorbed in staring at the beautiful colors and getting completely distracted from the actual game itself









Like I said before, I'd jump on a good IPS 4k panel that wasn't overpriced. I'm not saying 4k is 'bad'. Heck, I've experienced it before and I loved the extra resolution. It reminded me of going from 1080p to 1440p.

One thing I do wonder about with 4K is the vertical viewing-angle color shift. I noticed it very easily on my 23" TN panel, and I love having none of it at all on my 27" panel. Surely with a 28" or 32" 4K panel you'd notice it even more?


----------



## rdr09

Is it not that TN is known to have better colors than IPS?

Even up and down it does not suffer like my 1080p does, although I don't sit below table level.

Here are a couple more screenies . . .


----------



## xer0h0ur

No, IPS panels are generally regarded to have better color fidelity and viewing angles while TN panels are regarded as having better response times and less ghosting than IPS panels.


----------



## Alex132

Quote:


> Originally Posted by *rdr09*
> 
> is it not TN is known to have better colors than IPS?
> 
> even up and down does not suffer like my 1080 does. although i dont sit below table level.
> 
> here are acouple more screenies . . .
> 
> 
> Spoiler: Warning: Spoiler!


No, heh. Very simply put: IPS is the go-to for color accuracy. TN is cheaper and easier to make but suffers from worse viewing angles, lower contrast, worse color accuracy, etc. IPS can suffer from backlight bleed and slower response times, however.

A big thing for movie watching is that the dithering/banding effect can be reduced or completely eliminated with a 10-bit or 8-bit IPS display. You ever see the color banding in movies in a dark scene, normally with clouds or smoke? That's banding. With a proper IPS panel and 10-bit content you can eliminate it altogether. It's a very nice experience to never see such problems, to be honest.

http://en.wikipedia.org/wiki/IPS_panel
http://www.ehow.com/info_12149701_advantages-ips-panel-tv.html
http://www.lg.com/us/commercial/display-discover/IPS-advantage
http://www.tnpanel.com/tn-vs-ips-va/
http://www.slrlounge.com/what-is-an-ips-monitor-understanding-ips-displays/


----------



## rdr09

Quote:


> Originally Posted by *Alex132*
> 
> No heh. Very simply put; IPS is the go-to for color-accuracy. TN is cheaper and easier to make but suffers from lower viewing angles, lower contrast, color accuracy, etc. IPS can suffer from back-light bleed and lower response time however.
> 
> A big thing for movie watching is that the dithering/banding effect can be reduced or completely elimated with a 10-bit or 8-bit IPS display. You ever see the color-banding in movies where there is some dark scene, normally with clouds or smoke? That's banding. With a proper IPS panel and 10-bit content you can easily eliminate that all together. It's a very nice experience to never see such problems to be honest.
> 
> http://en.wikipedia.org/wiki/IPS_panel
> http://www.ehow.com/info_12149701_advantages-ips-panel-tv.html
> http://www.lg.com/us/commercial/display-discover/IPS-advantage
> http://www.tnpanel.com/tn-vs-ips-va/
> http://www.slrlounge.com/what-is-an-ips-monitor-understanding-ips-displays/


I see. +rep to you and x. I did compare my 4K to my Retina MacBook and the latter is not any better, probably because of the res.


----------



## xer0h0ur

Lost my Kill A Watt meter and finally got another one. I just measured my peak draw @ 986 watts. I was certain I would be drawing over 1000 watts, but nope; it never broke 986. That was using Fire Strike Ultra. Or should I use a different bench to measure peak?


----------



## wermad

3DMark 11 Extreme drew more power than FS 4K in my setup.

edit: up to you if you wanna risk it w/ FurMark....


----------



## xer0h0ur

Peaked at 1024 Watts in Firestrike Extreme.


----------



## wermad

I peaked at ~1260W (stock) but averaged ~1200W in 3DMark 11 in the more intensive tests. I'll run some #s tomorrow since I'm off


----------



## Agent Smith1984

Anyone in here have the PowerColor Devil 13 instead of the reference 295X2?

I am drooling over that thing and seriously considering buying it, selling my 290 Tri-X for around $220 to recoup some of the loss...

The Devil is on sale for $599 after rebate.... THAT IS INSANE!!! The performance of a Titan X for half the cost.... and with DX12, the buffer won't be limited to 4GB, so that's a plus also...

Not to mention it comes with a $150 mouse....

So you get $700 worth of graphics cards, and a $150 mouse, in an awesome package, for $599


----------



## starichok

Hello guys.
Quick question: since my 295X2 is hitting 74°C after 10 minutes in Valley, I have decided to get a water block for my card.
But I have no clue what else I should purchase to have it functioning properly.
If anyone can give me a heads-up on what I need to install for the water block to work, I would appreciate it.


----------



## wermad

You might as well get the CPU on water too.

You need:

-cpu block
-gpu block (the Koolance one is ~$110 at performance-pcs.com)
-tube (I recommend PrimoChill Advanced LRT; 10' is more than enough); soft or hard tube <--- hard tube is a much more advanced project, tbh
-pump
-reservoir
-fittings; if you go with soft tube, get compression fittings. If you go hard tube, make sure you stick to either imperial or metric.
-radiator; internal (this really comes down to your case, which it would be nice to know) or external (no limit, other than making sure it's secured)
-fans; I recommend good static-pressure fans, as they work best on radiators. Slim radiators like higher-rpm fans (>1800rpm), while medium-to-thick ones do fine with medium to slow fans (1500-1000 medium, <1000 low). PWM fans can be controlled via your motherboard or software; standard fans can be controlled via a fan controller.
-distilled water (or whatever the block maker prescribes) + a biocide or anti-corrosive if you wish.

Be prepared to spend about $500 or more on this custom loop. Look for used items if you can't or don't want to shell out a lot of cash. There are also pre-assembled kits, typically CPU-only, but they should be able to accommodate a GPU block. The basic rule of thumb (which many like to contradict) is 120mm of radiator for each core, so a 360, or a 240 + 120, will be your starting point. Obviously, more radiator will handle your loop better, especially with the hot 295X2, though radiators do follow the law of diminishing returns (see my rig for example).

Other than going through this massive (and expensive, but fun) upgrade, try to tweak your current setup. A lot of guys are able to keep this beast under 75°C. Are you using dual fans? Alternative fans? Where is your GPU radiator getting its air from?
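The 120mm-per-core starting point above can be written out as a trivial sizing helper (purely illustrative; the function name is mine, and radiator thickness, fin density, and fan speed all shift the real answer):

```python
# Sketch of the community "120mm of radiator per core (block)" rule of thumb.
# Each CPU block and each GPU on the loop counts as one "core" here.
def min_radiator_mm(cpu_blocks: int, gpus_on_loop: int) -> int:
    """Return the suggested minimum total radiator length in mm."""
    return 120 * (cpu_blocks + gpus_on_loop)

# One CPU plus an R9 295X2 (two Hawaii GPUs) on a single loop:
print(min_radiator_mm(1, 2))  # 360 -> e.g. one 360mm rad, or a 240 + 120
```

Treat the result as a floor, not a target; as noted above, the 295X2 runs hot, so extra radiator only helps (within diminishing returns).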

edit: btw, welcome to overkill.n....I mean overclock.net
















Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone in here have the PowerColor Devil 13 instead of the 295x2 reference?
> 
> I am drooling over that thing and seriously considering buying it, selling my 290 tri-x for around $220 to recoop some loss...
> 
> The Devil is on sale for $599 after rebate.... THAT IS INSANE!!! The performance of a titan x for half the cost.... and with DX12, the buffer won't be limited to 4GB, so that's a plus also...
> 
> Not to mention it comes with a $150 mouse....
> 
> So you get $700 worth of graphics cards, and a $150 mouse, in an awesome package, for $599


Honestly, there's more value in the reference design. If you want a rare special-edition card, go for it, though you may want a case with a horizontal layout to cope with this massive custom beast. Otherwise, the reference design gives you the option of custom water cooling to handle the heat this bad boy generates. It's tempting at that price, but the XFX reference has been as low as ~$629 w/ rebates (and possibly less).

edit: used reference ones are going for under $600; just make sure you get the stock cooler.


----------



## Agent Smith1984

Well, the card comes with an adjustable jack for weight support and uses four 8-pin connectors for additional power. Also, the mouse alone is $150, and I could actually use a mouse upgrade right now. $599 just seems like a really good value for such a nice package, but the card would no doubt run hotter than the reference....
Most reviews show it running at 85-90°C in 4K gaming with a 75-100MHz overclock.

The XFX 295X2 is going for $620 after rebate right now, which is amazing considering that just a year and a half ago they were $1500, and still $1000 six months ago.
As far as rareness, exclusivity, and total package, the Devil 13 is great, so I guess I must choose between cool looks and cool operation


----------



## wermad

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, the card comes with an adjustable jack for weight support, and use 4x 8 pin connectors for additional power. Also, the mouse alone is $150, and I could actually use a mouse upgrade right now. $599 just seems like a really good value for such a nice package, but the card would no doubt run hotter than the reference....
> Most reviews show it running at 85-90c in 4K gaming with a 75-100MHz overclock.
> 
> The XFX 295x2 is going for $620 after rebate right now, which is amazing considering just a year and a half ago they were $1500, annd still $1000 6 months ago.
> As far as rareness, exclusivity, and total package, the Devil 13 is great, so I guess I must choose between cool looks, or cool operation


Honestly, Hawaii will always be a Kilauea inferno. If you really are going to make use of it, go for the reference, as you have the backup option of going custom cooling. If you want the exclusivity and possibly a higher resale value in the future (albeit it may take a while to sell), go for the Devil.

Remember, you can always quad-fire another 295X2 and manage some respectable temps with the AIOs, whereas with the Devil you'll have two cinder blocks to manage on air.... As for the mouse, I doubt you will get $150, and tbh that's absurd for a mouse unless it makes cookies







. I bought a $50 Deathadder for my bear paw







, and that's really expensive for me.


----------



## starichok

Hello man, thank you for your feedback.
My CPU is water cooled; as you can see in the pictures, it is an Asetek cooler with the radiator on top with two fans.
As for my fans, all I have is the two on top of the radiator, one on the bottom of the case, and a second one behind; you can see that in the picture.
If I keep my front door open, I can maintain 73°C in heavy gaming, but I don't want to do that... maybe there is another way to keep my case cooler, like a few more fans?
But I have no more space to hook them up... so I think water cooling is my only option. Also, I already have a water block as a present.
Let me know what you think and what you would do in my place. Thank you


----------



## wermad

Quote:


> Originally Posted by *starichok*
> 
> Hello man,thank you for your feedback.
> My CPU is water cooled,as you see on the pictures it is asetec water cool and radiator on the top with two fans.
> In regards to my fans,all i have is two on the top of the radiator,one on the bottom of the case and second one is behind-you can see that on the picture.
> If i keep my front door open,i can maintain 73C in heavy gaming,but i dont want to do it.....maybe there is another way to get my case more cool,get few more fans??
> But i have no more space where i can hook them up....so i think i have only water cool option.Also i have water blocked already as a present.
> Let me know what you think and what would you do on my place.Thank you


What are your specs?


----------



## Agent Smith1984

Quote:


> Originally Posted by *wermad*
> 
> Honestly, Hawaii will always be a Kilauea inferno. If you really are going to take use of it, go for the reference as you have the back up of going custom cooling. If you want the exclusivity and possibly a higher resale value in the future (albeit, may take a while to sell), go for the Devil.
> 
> Remember, you can always quad-fire another 295x2 and manage some respectable temps w/ the aio's. Where as the Devil, you'll have to cinder blocks to manage on air.... As for the mouse, I doubt you will get $150 and tbh, that's absurd for a mouse unless it makes cookies
> 
> 
> 
> 
> 
> 
> 
> . I bought a $50 Deathadder for my bear paw
> 
> 
> 
> 
> 
> 
> 
> , and that's really expensive for me.


You are definitely right about the card itself...

As far as the mouse, I am with you, except that the included one is wireless.... I NEED WIRELESS, and nothing cheap works without latency, whereas this mouse is suppose to be great. I have a Razor imperator, and love it... I got it for $25 on CL when they were selling for $75 new. I'm definitely a bargain minded guy, so spending $600 on a GPU is a huge leap for me. I do like the water cooling on the reference card, but something about the packagehere is just cool I guess.

I just looked at the Tom's review, and surpisingly, in performance mode, the air cooler keeps that thing at 70c undr load, and it also uses 60W less load power than the 295x2 which will help me greatly since I am only running an 850W PSU with an FX at 4.9GHz.

If I do get this thing, I still get to join the 295x2 club right?


----------



## starichok

Cpu 5820
16 ram
The rest is standard


----------



## wermad

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are definitely right about the card itself...
> 
> As far as the mouse, I am with you, except that the included one is wireless.... I NEED WIRELESS, and nothing cheap works without latency, whereas this mouse is suppose to be great. I have a Razor imperator, and love it... I got it for $25 on CL when they were selling for $75 new. I'm definitely a bargain minded guy, so spending $600 on a GPU is a huge leap for me. I do like the water cooling on the reference card, but something about the packagehere is just cool I guess.
> 
> I just looked at the Tom's review, and surprisingly, in performance mode, the air cooler keeps that thing at 70C under load, and it also uses 60W less load power than the 295x2, which will help me greatly since I am only running an 850W PSU with an FX at 4.9GHz.
> 
> If I do get this thing, I still get to join the 295x2 club right?


The Devil and 295x2 are both essentially 2x 290X, so we're all in the same boat.

edit:

hit up roaches for more info:
Quote:


> Originally Posted by *Roaches*
> 
> *snip*
> Anyhow, just finished installing the cards. I'm afraid to run any benches until the G2 1600 arrives.


Quote:


> Originally Posted by *starichok*
> 
> CPU: 5820
> RAM: 16GB
> The rest is standard


Fill out the rig builder: go to your profile, scroll to the bottom of the page, click on edit rig/build, fill out the specs, and hit save (you don't have to do the bench and prices, tbh). This will then appear automatically on every post, and will help others help you by knowing your complete specs.

Knowing what case you have will give us a better idea of the cooling potential you have currently.


----------



## Mega Man

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Honestly, Hawaii will always be a Kilauea inferno. If you really are going to make use of it, go for the reference, as you have the backup of going custom cooling. If you want the exclusivity and possibly a higher resale value in the future (albeit, it may take a while to sell), go for the Devil.
> 
> Remember, you can always quad-fire another 295x2 and manage some respectable temps w/ the AIOs. Whereas with the Devil, you'll have two cinder blocks to manage on air.... As for the mouse, I doubt you will get $150 and tbh, that's absurd for a mouse unless it makes cookies. I bought a $50 Deathadder for my bear paw, and that's really expensive for me.
> 
> 
> 
> You are definitely right about the card itself...
> 
> As far as the mouse, I am with you, except that the included one is wireless.... I NEED WIRELESS, and nothing cheap works without latency, whereas this mouse is supposed to be great. I have a Razer Imperator, and love it... I got it for $25 on CL when they were selling for $75 new. I'm definitely a bargain-minded guy, so spending $600 on a GPU is a huge leap for me. I do like the water cooling on the reference card, but something about the package here is just cool, I guess.
> 
> I just looked at the Tom's review, and surprisingly, in performance mode, the air cooler keeps that thing at 70C under load, and it also uses 60W less load power than the 295x2, which will help me greatly since I am only running an 850W PSU with an FX at 4.9GHz.
> 
> If I do get this thing, I still get to join the 295x2 club right?

You may also have to wait for custom drivers, but idk, everyone who gets the Devil disappears... i.e., no feedback. I know they used to have to wait for custom drivers.


----------



## starichok

Done


----------



## rakesh27

Guys,

I'm chiming in on this. I thought the Devil 13 was a little more power-hungry than its twin, the 295x2, and I know the Devil 13 definitely runs hotter than the 295x2 as it's not water cooled.

I think your temps were OK; you should do push/pull on your radiators, it really helps.

To be honest, don't get the Devil 13. Just get a 295x2 and you will be very happy with it, or if you can wait, the 390x is just around the corner.

I read somewhere the 390x has a huge jump from the 295x2 and a massive jump from the 290x, though I know there is always something coming out around the corner.

If you can, hold out for the 390x or wait for the 395x2; that will be an insane card, full-on DX12/4k with impressive frame rates. That said, the 390x will be just as good.

I think this will be the last time AMD and Nvidia will be making these monster cards. I think the technology is changing; soon they will be combining CPU & GPU chips together, like an all-in-one processor...

That's my 2 cents.


----------



## wermad

AMD typically doesn't supersede its current dual-GPU card with the next-gen single-GPU flagship. The reason is more than likely that the next-gen dual-GPU card is still under proposal or development. I'm sure in some instances it may edge out the old card, but it won't beat it to a pulp. Saw it with 5970 vs 6970, 6990 vs 7970, 7990 vs 290x, and we'll very likely see this again with 295x2 vs 390x.

I think an Uber 390x may come out later to challenge the Ti after the 395x2 launches.


----------



## deesee76

Can the DVI port on the 295x2 output 4k @ 60hz? I'm considering ditching my Samsung UD590 4k monitor, as the card has a hard time powering up the display through the DisplayPort.


----------



## wermad

Nope. I have the same monitor. Are you saying there are display issues via DP, or that your card can't cope with 4k gaming?


----------



## deesee76

I use the DisplayPort and it's fine once the monitor powers up. The monitor seems to get stuck in sleep mode and does not power up; it only works when I reset the CMOS. I have a feeling it's my cheap DisplayPort cable, though it's 1.2 compatible. What cable are you using?


----------



## wermad

I was using an inexpensive OEM (HP, Dell, etc.) DP-to-DP cable I bought off ebay, but with an AMD adapter (DP to mDP). I bought a StarTech mDP-to-DP cable off newegg for ~$7 recently and that has worked flawlessly.

It seems like something is up w/ the mb. It should send the proper signals to the gpu to wake up. The fact that you can only restore it via a CMOS clear is a clue something is wrong w/ the mb. Do you have the onboard iGPU disabled? Load your specs in the rig builder (click your name on the top right, go to the bottom, click "edit", follow the rig builder).


----------



## deesee76

Anyone running 4k with 2x 295x2? Is it worth it? I currently have an XFX 295x2 with a Radeon 290x in 3-way. I play the 4k Battlefield series and GTA 5, and looking ahead I would like to max out all settings for all games running at 3840x2160. Actual owner feedback only, as I've read enough reviews and benchmarks from various sources. The ultimate goal is not coin mining or benchmark boasting; I just want to play games as intended by the developers, in 4k with max settings. Can't stand 1080p gaming as it looks so blocky now that I have a taste of 4k.


----------



## deesee76

Hey thanks for the suggestion. Will look at the settings.


----------



## wermad

If the game takes advantage of quad-fire, it's a nice boost, though some ppl have pretty solid frames w/ tri-fire. Honestly, if you're going w/ quads, make sure you have a psu that can handle it (see my list in sig rig or op) and go w/ custom water cooling.


----------



## xer0h0ur

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> I'm chiming in on this. I thought the Devil 13 was a little more power-hungry than its twin, the 295x2, and I know the Devil 13 definitely runs hotter than the 295x2 as it's not water cooled.
> 
> I think your temps were OK; you should do push/pull on your radiators, it really helps.
> 
> To be honest, don't get the Devil 13. Just get a 295x2 and you will be very happy with it, or if you can wait, the 390x is just around the corner.
> 
> I read somewhere the 390x has a huge jump from the 295x2 and a massive jump from the 290x, though I know there is always something coming out around the corner.
> 
> If you can, hold out for the 390x or wait for the 395x2; that will be an insane card, full-on DX12/4k with impressive frame rates. That said, the 390x will be just as good.
> 
> I think this will be the last time AMD and Nvidia will be making these monster cards. I think the technology is changing; soon they will be combining CPU & GPU chips together, like an all-in-one processor...
> 
> That's my 2 cents.


Nope, not a single leaked benchmark puts the 390X past a 295X2. That would be a 100% improvement over the 290X, which is not only nuts, it's not going to happen. The leaked benches put the 390X in Titan X territory.


----------



## soulwrath

Agreed with the statement above; it puts the 390x in the same territory as a Titan X. Might as well wait for a 395x2, supposed to be priced @ $1400, but the 390x is said to handle 4k well with just one card.


----------



## veaseomat

I'm thinking about putting ek water blocks on my 295x2's. Mainly because I don't think I'm going to upgrade to the 390 generation. Is there any performance to gain from custom water or are they already maxed out at 1100core 1600mem?


----------



## Alex132

Quote:


> Originally Posted by *veaseomat*
> 
> I'm thinking about putting ek water blocks on my 295x2's. Mainly because I don't think I'm going to upgrade to the 390 generation. Is there any performance to gain from custom water or are they already maxed out at 1100core 1600mem?


That's pretty much maxed out; you can squeeze maybe a few more MHz out of it with water, but mainly you will see a massive decrease in noise and potentially a longer lifespan.


----------



## veaseomat

Quote:


> Originally Posted by *Alex132*
> 
> That's pretty much maxed out; you can squeeze maybe a few more MHz out of it with water, but mainly you will see a massive decrease in noise and potentially a longer lifespan.


Thank you for the reply. Yeah so I can't complain about noise because I'm running my 9590 under phase. Also lifespan isn't so much a thing with me because I do want to upgrade when the performance is there. We'll have to see about the 390x's, I'm getting excited. I will probably hold out for the 395x2's rumored to drop in Q4 2015.

here's a pic of my rig after I put a custom loop on my C5FZ motherboard the other day:



Random question: what are everyone's asic scores on their 295x2 chips?
mine are:
75.8, 69.7 ; 71.2, 76.8


----------



## DividebyZERO

Quote:


> Originally Posted by *veaseomat*
> 
> Thank you for the reply. Yeah so I can't complain about noise because I'm running my 9590 under phase. Also lifespan isn't so much a thing with me because I do want to upgrade when the performance is there. We'll have to see about the 390x's, I'm getting excited. I will probably hold out for the 395x2's rumored to drop in Q4 2015.
> 
> here's a pic of my rig after I put a custom loop on my C5FZ motherboard the other day:
> 
> 
> 
> Random question: what are everyone's asic scores on their 295x2 chips?
> mine are:
> 75.8, 69.7 ; 71.2, 76.8


lol, I am sooo waiting for the 3xx series, as I am curious how it will perform at high resolutions with HBM.


----------



## F4ze0ne

Quote:


> Originally Posted by *veaseomat*
> 
> Random question: what are everyone's asic scores on their 295x2 chips?
> mine are:
> 75.8, 69.7 ; 71.2, 76.8


Both chips on mine are 75.6.

My single 290x is even better at 78.2.


----------



## Alex132

Lower ASIC means higher overclocking, generally. So you will find the 295X2's 290X cores will have lower ASICs in general compared to 290Xs.


----------



## wermad

Tsm160 mentioned something about the number relating to being better suited for air or water (custom) cooling.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Tsm160 mentioned something about the number relating to being better suited for air or water (custom) cooling.


Higher ASIC = better for overclocking on air*
Lower ASIC = better overclocking on water.

That higher ASIC doesn't really mean much other than it's more stable at higher temps, lower ASIC is generally better for a good air cooler (ie; not the stock 290X PoS), AIO coolers and water.

There is also a reason I'm being so _general_, because ASIC values are far from an absolute truth and should be taken as a rough guide only.


----------



## F4ze0ne

Quote:


> Quote:
> 
> 
> 
> Hi Dave, do u think AMD can share the ASIC quality range of voltage?
> 
> 
> 
> ASIC "quality" is a misnomer propagated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.

https://forum.beyond3d.com/threads/amd-volcanic-islands-r1100-1200-8-9-series-speculation-rumour-thread.54117/page-107#post-1719982


----------



## Mega Man

Plus 1.

Hahaha I didn't even know this had been posted.
Quote:


> Originally Posted by *F4ze0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *veaseomat*
> 
> Random question: what are everyone's asic scores on their 295x2 chips?
> mine are:
> 75.8, 69.7 ; 71.2, 76.8
> 
> 
> 
> Both chips on mine are 75.6.
> 
> My single 290x is even better at 78.2.

There is no "better"
Quote:


> Originally Posted by *Alex132*
> 
> Lower ASIC means higher overclocking, generally. So you will find the 295X2's 290X cores will have lower ASICs in general compared to 290Xs.


Totally false. Unless something has changed, the ASIC number on the 2xx has no meaning, as it was never disclosed how to properly read it. Going beyond that, on the 79xx all it meant was that a higher number was a low-voltage-leakage chip and a lower number was higher leakage. As wermad mentions, it relates to cooling. Last I knew the numbers were rubbish on the 2xx series, but that could have changed. Either way, ASIC means absolutely nothing in terms of oc'ing.

Don't believe me? Ask t small about his, which overclock far greater than most.
Quote:


> Originally Posted by *wermad*
> 
> Tsm160 mentioned something about the number relating to being better suited for air or water (custom) cooling.


Qft


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Totally false. Unless something has changed the 2xx has no meaning. As it was not released on how to properly read it. Going beyond that in the 79xx all it was was that the higher number was a low voltage leak chip. Lower was higher leakage. As wermad mentions it is for cooling. . Last I knew the numbers were rubbish on the 2xx series but that could of changed. Either way asic means absolutely nothing in terms of ocing.


I am just relaying information that TheBlademaster01 gave me, and I trust his knowledge so yeah.


----------



## Mega Man

Well. It is false so please don't spread it. If there is doubt ask tsm or sugar.

Asic has nothing to do with overclocking potential


----------



## wermad

We should just stick to: "dude! You have asic card!".

I'll find out what mine show up as once I get home.


----------



## TooManyAlpacas

Speaking of overclocking the card, what are some of the OCs you guys have achieved? Just trying to find out what I should use to OC the card, and where to start with increasing the speed and power.


----------



## wermad

I was looking forward to the 3Dfanboy competition but it looks like there won't be one this year. I'm holding off on oc'ing my cards, but I hear 1100 is obtainable. I would use Afterburner.


----------



## Mega Man

I do 1150/1400. I probably can do more but I have not dialed it in; I just threw it in and stressed it (including hours of gaming).


----------



## Coppy

Hello everybody !

What temperatures do you guys with waterblocks have under load on your 295x2?

My temperatures are in the mid 60's (65-68°C) after an hour of playing Far Cry 4.

My system is in TriFireX, and the 290x has 10°C lower temps!

The system has 2x 480 rads (45mm/30mm), 1x 240 rad (30mm), an i7 4790K (4,[email protected],25V), the 295x2 (stock), and the 290x (1060/1350), all in one loop.

Both cards are on EK waterblocks. The waterblock on the 290x came preinstalled (PowerColor 290x LCS), so I'm wondering if I did something wrong with my 295x2.
(I used the thermal paste included with the EK waterblock.)

Are those temperatures still OK, or will I have to disassemble the 295's waterblock?
Or do I need more radiator space?

Coppy


----------



## Mega Man

Mine is in the 40s and 50s. But I have 5x480s and 4 pumps


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> I was looking forward to the 3Dfanboy competition but it looks like there won't be one this year. I'm holding on oc'ing my cards but I hear 1100 is obtainable. I would use afterburner.


Quote:


> Originally Posted by *Mega Man*
> 
> i do 1150/1400 i probably can do more but i have not dialed it in, just threw it in and stressed it, ( including hours of gaming )


Mine will do 1150/1500 with +100mV in Afterburner (for benching; it can't do long periods of gaming). Since then I've flashed the Sapphire OC bios on my card, so I could potentially get higher clocks thanks to the extra voltage I get in Trixx (equivalent to +300mV in Afterburner, iirc).

Oh, and my ASICs are 73.4% and 75.2%, for the very little it's worth.


----------



## veaseomat

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine will do 1150/1500 with +100mV in Afterburner (for benching; it can't do long periods of gaming). Since then I've flashed the Sapphire OC bios on my card, so I could potentially get higher clocks thanks to the extra voltage I get in Trixx (equivalent to +300mV in Afterburner, iirc).
> 
> Oh, and my ASICs are 73.4% and 75.2%, for the very little it's worth.


roger that, I'll be doing the same haha.


----------



## wermad

Quote:


> Originally Posted by *Coppy*
> 
> Hello everybody !
> 
> What temperatures do you guys with waterblocks have under load on your 295x2?
> 
> My temperatures are in the mid 60's (65-68°C) after an hour of playing Far Cry 4.
> 
> My system is in TriFireX, and the 290x has 10°C lower temps!
> 
> The system has 2x 480 rads (45mm/30mm), 1x 240 rad (30mm), an i7 4790K (4,[email protected],25V), the 295x2 (stock), and the 290x (1060/1350), all in one loop.
> 
> Both cards are on EK waterblocks. The waterblock on the 290x came preinstalled (PowerColor 290x LCS), so I'm wondering if I did something wrong with my 295x2.
> (I used the thermal paste included with the EK waterblock.)
> 
> Are those temperatures still OK, or will I have to disassemble the 295's waterblock?
> Or do I need more radiator space?
> 
> Coppy


Mine hover in the 50s with the fans on low speed, and in the high 40s (now that summer is approaching) with the fans @ medium speed. Benching on low speed they may hit the low 60s.

What radiators and fans do you have?


----------



## Coppy

Quote:


> Originally Posted by *wermad*
> 
> Mine hover in the 50s with fans in low speed and in the high 40s (now that summer is approaching) with the fans @ medium speed. Benching on low speed they may hit the low 60s.
> 
> What radiators and fans do you have?


Hello !

Those temps are a lot lower than mine.
If the temps of my 295x2 were like the 290x, I wouldn't even think about it...

There is 1x Alphacool NexXxos XT45 480, 1x Alphacool NexXxos ST30 480, and 1x Aquacomputer Airplex Pro 240 (30mm) in this system.
The fans on the rads are Corsair SP120s (Quiet/LED Edition). The top rad is in push/pull; the others are push-only.


----------



## xer0h0ur

I get high 40's to mid 50's with my 295X2, 290X, 4930K on 120mm and 360mm Alphacool NexXxos Monsta radiators using Noctua iPPC NF-F12 2000RPM fans. CPU temp is dependent on load.

For what it's worth, I used Prolimatech PK-3 on the GPUs and the PLX chip, and Fujipoly Ultra Extreme pads on the VRMs, mosfets, and vRAM.


----------



## Roxycon

My temps after a few runs of 3DMark:

EK Supremacy Evo, EK 295x2 block, two CoolStream PE 120mm rads and a CoolStream PE 240mm with 4 EK Vardar F1 fans, plus a 450 rpm Noiseblocker 140mm fan (just for cooling my hard disks). The pump is an EK DDC 3.2 PWM.


----------



## Coppy

Quote:


> Originally Posted by *xer0h0ur*
> 
> I get high 40's to mid 50's with my 295X2, 290X, 4930K on 120mm and 360mm Alphacool NexXxos Monsta radiators using Noctua iPPC NF-F12 2000RPM fans. CPU temp is dependent on load.
> 
> For what its worth, I used Prolimatech PK-3 on the GPUs and the PLX chip and Fujipoly Ultra Extreme pads on the VRMs, mosfets and vRAM.


I've already bought Arctic MX-2, but I don't want to take the 295 apart. I think I will have to, though.
Perhaps it's also a good time to build a second loop!


----------



## wermad

Quote:


> Originally Posted by *Coppy*
> 
> Hello !
> 
> Those temps are a lot lower than mine.
> If the temps of my 295x2 were like the 290x, I wouldn't even think about it...
> 
> There is 1x Alphacool NexXxos XT45 480, 1x Alphacool NexXxos ST30 480, and 1x Aquacomputer Airplex Pro 240 (30mm) in this system.
> The fans on the rads are Corsair SP120s (Quiet/LED Edition). The top rad is in push/pull; the others are push-only.


Are your cards oc'd? I'll have to run mine when I get home to get #s, though my ambient is approaching 30C as summer gets closer. 30-35C will be a pita to deal with.


----------



## Coppy

Quote:


> Originally Posted by *wermad*
> 
> Are your cards oc'd? I'll have to run mine when I get home to get #s though my ambient is approaching 30c as the summer gets closer. 30-35c will be a pita to deal with it.


The 295x2 runs @ stock speeds and the 290x has a factory oc (1060/1350).
Ambient temperatures are 20-22°C.


----------



## wermad

Quote:


> Originally Posted by *Coppy*
> 
> The 295x2 runs @ stock speeds and the 290x has a factory oc (1060/1350).
> Ambient temperatures are 20-22°C.


Pic of your build? The airflow is very important and a few tweaks if needed can help.


----------



## Coppy

Quote:


> Originally Posted by *wermad*
> 
> Pic of your build? The airflow is very important and a few tweaks if needed can help.


There are some pictures in my profile (rig builder).
The rads at the bottom push air INTO the case, and the front fans are intake too. The top rad pushes air out of the case, and the 140mm fan in the back is also exhaust.


----------



## xer0h0ur

Well, it certainly helps me that my 360 is externally mounted, because airflow inside my case is downright atrocious.


----------



## deep33

Hi guys,

I need some help here. Here are my specs.

Cpu: i7-5930K (6 core)
MB: Asrock x99 extreme6
Ram: 16GB Corsair Vengeance 2400mhz
ssd: Samsung Evo 500 GB
*GPU: Radeon R9 295x2*

Monitor: 28 inch Samsung U28D590D - 4k 60Hz monitor.

At a resolution of 3840 by 2160 with "ULTRA DETAIL", I have been able to run the following games at 60fps:

Skyrim
Dragon Age: Inquisition
Mass Effect 3
and so on... the list is long.

The only issue is Witcher 2. I get an abysmal 14 to 17 fps at ultra detail, and it's hard on the eyes.

This sucks, because the only reason I spec'd and built the rig above is to play The Witcher 3 (after 6 years of sticking to a vintage i7 920 machine). The game is getting released in 2 weeks.

Are any of you guys having serious fps drops in Witcher 2? Is there a way around it? Would I have the same issue in The Witcher 3? Any feedback is greatly appreciated.

Thanks, god bless.


----------



## Agent Smith1984

Well, not sure on Witcher 2, but I'd guess you'll be looking at a harder-to-run game with Witcher 3... HOWEVER, with it being a newer game, odds are it will be much better optimized for newer systems.

Keep in mind that even the beloved 295 won't run all titles at 4k with the highest settings....

Wish I could be of more help...

What driver are you using? That is a mighty strong PC to have problems running ANYTHING....


----------



## DividebyZERO

Quote:


> Originally Posted by *deep33*
> 
> Hi guys,
> 
> I need some help here. Here are my specs.
> 
> Cpu: i7-5930K (6 core)
> MB: Asrock x99 extreme6
> Ram: 16GB Corsair Vengeance 2400mhz
> ssd: Samsung Evo 500 GB
> *GPU: Radeon R9 295x2*
> 
> Monitor: 28 inch Samsung U28D590D - 4k 60Hz monitor.
> 
> At a resolution of 3840 by 2160 with "ULTRA DETAIL", I have been able to run the following games at 60fps:
> 
> Skyrim
> Dragon Age: Inquisition
> Mass Effect 3
> and so on... the list is long.
> 
> The only issue is Witcher 2. I get an abysmal 14 to 17 fps at ultra detail, and it's hard on the eyes.
> 
> This sucks, because the only reason I spec'd and built the rig above is to play The Witcher 3 (after 6 years of sticking to a vintage i7 920 machine). The game is getting released in 2 weeks.
> 
> Are any of you guys having serious fps drops in Witcher 2? Is there a way around it? Would I have the same issue in The Witcher 3? Any feedback is greatly appreciated.
> 
> Thanks, god bless.


Turn off Uber Sampling for starters, and probably AA as well; aliasing really shouldn't be an issue at 4k.


----------



## deep33

Quote:


> Originally Posted by *DividebyZERO*
> 
> Turn off Uber Sampling for starters, and probably AA as well it really shouldnt be an issue at 4k


That did it!! Turning off "Uber Sampling" shot my fps back up to 50+. Everything has been smooth, buttery, and extremely glorious at 4K ultra settings in all games with this card, brothers and sisters. I don't think I could ever go back to 1080p again.

4K Witcher 3, here i come!!

thanks,


----------



## Agent Smith1984

Aww man... you got me ready to go 4k myself!!!

I have my second Tri-X 290 going in tonight, but am still going to be on this aging 1080p.....

Who'd have thought 4k would have pushed 1080p out the door so fast?

Guess we have high-end (and really affordable at the moment) PC hardware to thank for that.


----------



## wermad

4k is the schizzle.


----------



## Qazax

Hi Guys,

Specs are as follows:

Rampage IV extreme
3930k @ stock
EVGA 1kw

MSI R9 295 x2

So I am playing at 1080p, which should be pretty easy for this card, but I am not sure about the FPS I am getting. I just tried AC Unity and, with everything maxed including 8xAA, I was getting around 40fps. I looked at GPU-Z, which was running in the background and logging both GPUs, and I can see the GPU usage on both cards spiking between 0% and 100% really quickly. I was expecting it to be more like a solid high 90s. I turned AA off and it still did not break 60fps, and dropped as low as the high 20s when running through a crowded place.

I tried Far Cry 4 with everything maxed and AA off, and I was getting around 80FPS outside surrounded by grass and trees, but most importantly a solid 90%+ utilisation on both GPUs.

I am running Catalyst 15.4 beta and had the same issues on the previous non beta drivers.

If I am honest, I am a lot more familiar with NVIDIA cards and I am a bit stuck as to how to troubleshoot this. Is AC Unity just really that bad?


----------



## xer0h0ur

AC Unity is literally that freakin bad. I get a better framerate than that at 4K with no AA and no Gameworks options. You definitely don't want to be running any of the Gameworks options on AMD cards. There was a short period of time in which I had perfect performance with AC Unity, and that was after the 2nd game patch, before the Omega driver came out. After that it's all gone to hell in a handbasket. I get crashes just trying to load the game, never mind actually being IN the game with its other set of problems. Whenever I do manage to get in the game, the GPU clocks fluctuate all over the damn place and the GPU usage is equally as wonky. What driver patching fixes, game patches break, and vice versa. It's a giant cluster-you-know-what.


----------



## deep33

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Aww man... you got me ready to go 4k myself!!!
> 
> I have my second Tri-X 290 going in tonight, but am still going to be on this aging 1080p.....
> 
> Who'd have thought 4k would have pushed 1080p out the door so fast?
> 
> Guess we have high-end (and really affordable at the moment) PC hardware to thank for that.


When this card was initially released for about 1500 bucks apiece, 4k was outta my reach. But for 650 bucks?!?! It's a steal!!! (in comparison to its nvidia competitors... gtx 980/titan x and how they fare in 4k for the price). I picked up my monitor on sale for about 440 bucks.


----------



## wermad

I picked up a month-old Sammy from a forum member. The guy picked up a ROG Swift and didn't want the Samsung anymore. Got it for $400 shipped, and it was in great condition. Once you get used to it, it's awesome! I was gonna go with a Korean 32" WQHD, but I passed as I preferred the challenge of going 4k. It's been a smooth experience so far. The only real bummer w/ the Sammy is the stand; it's crap and wobbles like a drunk. No VESA mounts, so this is it unless I mod it for a better stand.


----------



## xer0h0ur

I hate the stand as well. Really no idea why they decided to forgo VESA mount compatibility. It's not even the wobble of the stand that bothers me; it's that I am constantly resetting the monitor angle, since it doesn't hold it well.


----------



## wermad

So GPU-Z (0.8.2) says it can't read the ASIC from the other three cores, 'cept the one the display is driven from. Gonna disable ULPS and see if I can read them.


----------



## xer0h0ur

Meh, pretty meaningless number if you ask me. The 290X is 69.1; the 295X2 is 73.5 on GPU1 and 78.1 on GPU2.


----------



## wermad

I know, but might as well keep the conversation going in the club.

edit: too hot right now to go ssj3 on my rig. Wife won't like the extra heat. (And yes, I said ssj3, as I don't consider 4 canon.)


----------



## Coppy

I changed the thermal paste on my 295x2. Now I've got temperatures in the mid 50's!
There's now only a ~5°C difference between my two cards!


----------



## veaseomat

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Aww man... you got me ready to go 4k myself!!!
> 
> I have my second Tri-X 290 going in tonight, but am still going to be on this aging 1080p.....
> 
> Who'd have thought 4k would have pushed 1080p out the door so fast?
> 
> Guess we have high-end (and really affordable at the moment) PC hardware to thank for that.


I personally didn't like 4k. It is nice, but not for everyday use as ONE monitor. I returned my Samsung UD590 not long after using it, because I'm looking for the best single-monitor solution. However, I think it's a great tool to have in the arsenal.

I'm currently rocking the new BenQ 1440p 144hz 1ms TN FreeSync monitor, just waiting for multi-GPU support on FreeSync. I thought I liked resolution > frame rates, but now that I have this I feel like it's the best middle ground. I'm also currently looking at the uqhd(?) 3440x1440 widescreen curved monitors (they aren't all curved; I just want to test that style). That scale is 21:9, so basically it would keep the pixel size I'm at with my 1440p but widen it from 2560 to 3440. I think it would be the PERFECT alternative to an Eyefinity setup (which I have also tried with 1080p monitors). I will say this BenQ monitor has so many settings on it that allow it to shine for almost any game or application.

If you're going to go 4k, I highly recommend the 31.5" monitors and not the 28" size. The pixels are just... too small, lol. You'll find yourself hunting for mods to enlarge things in all kinds of daily applications so you don't have to focus so much. I should also add that I don't have 20/20 vision; I'm nearsighted (can't see far), about -2.5 in both eyes (not that bad, I can still bullseye at 100 meters with my M4 w/o glasses). I feel like this also gives me an advantage on monitors... free anti-aliasing! lol, j/k. But 4k does allow you to turn off AA or keep it very low, which is nice; at 1440p I only really need 4x AA, anything above is too blurry. So again, I think 1440p is the sweet spot for gaming right now and I'm sure many enthusiasts will agree with me... my 295x2's in crossfire will push almost any game at max settings over 100 frames and it looks beautiful. Also remember the 295x2's are not 8GB cards... they are 2x4GB cards.

That's all me, remember to do your own research and think about how it all applies in your personal situation. ; )
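On the pixel-size point, the 28" vs 31.5" difference is easy to quantify: pixel density (PPI) is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch, using only the panel sizes mentioned in this thread:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Sizes discussed above: 28" and 31.5" 4k panels, and a 27" 1440p panel.
print(f'{ppi(3840, 2160, 28.0):.1f}')   # ~157 ppi: tiny pixels, lots of UI scaling needed
print(f'{ppi(3840, 2160, 31.5):.1f}')   # ~140 ppi: noticeably easier on the eyes
print(f'{ppi(2560, 1440, 27.0):.1f}')   # ~109 ppi: the 1440p "sweet spot"
```

So a 31.5" 4k panel sits much closer to the familiar 27" 1440p density than a 28" 4k panel does, which is the practical reason behind the recommendation above.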


----------



## rdr09

I use 1440p for regular use and go 4K for games. Takes 2 secs to switch.


----------



## wermad

Quote:


> Originally Posted by *Coppy*
> 
> Hello !
> 
> Those temps are a lot lower than mine.
> If the temps of my 295x2 were like the 290x, I wouldn't even think about it...
> 
> There is 1x Alphacool NexXxos XT45 480, 1x Alphacool NexXxos ST30 480, and 1x Aquacomputer Airplex Pro 240 (30mm) in this system.
> The fans on the rads are Corsair SP120s (Quiet/LED Edition). The top rad is in push/pull; the others are push-only.


The thinner rads do better with high-rpm (>1500) fans. The SP120 Quiet Edition LEDs run ~1600rpm max; do you have them undervolted?


----------



## Coppy

Quote:


> Originally Posted by *wermad*
> 
> The thinner rads do better with high-rpm (>1500) fans. The SP120 Quiet Edition LEDs run ~1600rpm max; do you have them undervolted?


For sure! That would be too noisy.

They are regulated by my Aquaero: with higher temps they spin faster! Depending on the rad, from 35% to 70%.

With the Arctic MX-2 the temps are now ~55°C. But I think I will swap the NexXxoS ST30 for a NexXxoS UT60!


----------



## wermad

The UT60 does well with medium to low rpms. Also, try to have positive air pressure.


----------



## Coppy

Quote:


> Originally Posted by *wermad*
> 
> The UT60 does well with medium to low rpms. Also, try to have positive air pressure.


That's good to know!
Are there any benefits to a push/pull configuration? I only have the top rad in push/pull. Is it worth going push/pull on the other two rads, too?


----------



## Alex132

Quote:


> Originally Posted by *Coppy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> The UT60 does well with medium to low rpms. Also, try to have positive air pressure.
> 
> 
> 
> That's good to know!
> Are there any benefits to a push/pull configuration? I only have the top rad in push/pull. Is it worth going push/pull on the other two rads, too?

Generally you can run lower-RPM fans for the same temps in push/pull, with less noise.
Or you can run them at the same RPM as you would with just push or pull and get lower temps, but more noise.

Push/pull is also better for denser radiators/heatsinks where good static pressure is important.


----------



## xer0h0ur

I would definitely suggest push/pull the thicker you go with radiators. Just make sure you're getting fans with good static pressure ratings. CFM isn't the rating to look at for radiator fans.


----------



## wermad

Yeah, p/p with the UT60 if you can. At most you may see a 30% improvement vs a single bank. If you can't, pull has a slight advantage.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Yeah, p/p with the UT60 if you can. At most you may see a 30% improvement vs a single bank. If you can't, pull has a slight advantage.


Depends on the fan.

I use push because it's less noise. If I used pull, I'd have the fan right against the case top grill, which causes a lot of turbulence, restriction and noise. There is a nice ~1cm gap between the rad fins and the rad housing, so that means less noise than before.


----------



## wermad

That's why I said "at most". I think the test where it hit 30% was with AP15 GTs (don't recall if with or without a shroud). It may be possible, but it's not concrete tbh.


----------



## Coppy

Quote:


> Originally Posted by *wermad*
> 
> Yeah, p/p with the UT60 if you can. At most you may see a 30% improvement vs a single bank. If you can't, pull has a slight advantage.


Push/pull with a UT60 should not be a problem. With the 900D there is enough space in the bottom.
But I'm also thinking about splitting the loop. I don't know if that's really necessary, but it's something I've always wanted to do. And now I have the space..... and the money....!

And that way, the system doesn't look the same all the time....


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> That's why I said "at most". I think the test where it hit 30% was with AP15 GTs (don't recall if with or without a shroud). It may be possible, but it's not concrete tbh.


Interesting, I would have thought it was with a weaker SP fan. P/P greatly improves SP.... but yeah, like you said, too many variables to be concrete about it.

Speaking of which, do you think P/P will work well on the stock 295X2 radiator with Cougar Vortex 800-1500rpm fans?


----------



## xer0h0ur

Quote:


> Originally Posted by *Coppy*
> 
> Push/pull with a UT60 should not be a problem. With the 900D there is enough space in the bottom.
> But I'm also thinking about splitting the loop. I don't know if that's really necessary, but it's something I've always wanted to do. And now I have the space..... and the money....!
>
> And that way, the system doesn't look the same all the time....


Honestly that is probably the only doubt left in my mind. I wondered if I would have been better off isolating the CPU loop from the GPU loop. I could have just as easily bought a dual-pump reservoir instead of a single-pump one. Screw it. I will give that a whirl on my Zen build.


----------



## wermad

Dual loops won't help tbh. I've stuck with single loops since my first major build in '09.


----------



## PCModderMike

Leaving green, and going red.










Gotta put it through its paces over the next few days, then block it up once my package arrives from EK.


----------



## wermad

Sup dude. Welcome... check the PSU list btw.


----------



## DividebyZERO

Quote:


> Originally Posted by *PCModderMike*
> 
> Leaving green, and going red.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gotta put it through its paces over the next few days, then block it up once my package arrives from EK.


Hope you enjoy the red side for a bit. Good luck!


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> Sup dude
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Welcomes...check psu list BTW


Hey werm!

Checked that list, I have a Strider ST85F-GS, all good.









Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> Leaving green, and going red.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gotta put it through its paces over the next few days, then block it up once my package arrives from EK.
> 
> 
> 
> Hope you enjoy the red side for a bit. Good luck!
Click to expand...

Thanks!


----------



## xer0h0ur

Just as a suggestion, don't use that EK TIM they give you with the blocks. Not only do they give you very little of it, it's not exactly a high performer. So basically grab yourself your TIM of choice for the GPUs and PLX chip.

You may also want to get yourself some better thermal pads for the VRMs at least.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just as a suggestion, don't use that EK TIM they give you with the blocks. Not only do they give you very little of it, its not exactly a high performer. So basically grab yourself your TIM of choice for the GPUs and PLX chip.


EK's TIM is actually Gelid GC-Extreme. It's quite good.


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> EK's TIM is actually Gelid GC-Extreme. It's quite good.


Well then, never mind the aforementioned. GC Extreme is great stuff. But now I am confused. I could have sworn someone had complained before about their temps with that TIM. Perhaps it was a misapplication.


----------



## PCModderMike

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Just as a suggestion, don't use that EK TIM they give you with the blocks. Not only do they give you very little of it, its not exactly a high performer. So basically grab yourself your TIM of choice for the GPUs and PLX chip.
> 
> 
> 
> EK's TIM is actually Gelid GC-Extreme. It's quite good.
Click to expand...

^QFT

But even if they had some no name TIM included with their blocks, this isn't my first rodeo with water cooling and I've got several tubes of other stuff laying around. Including some Gelid GC extreme.


----------



## Coppy

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well then, never mind the aforementioned. GC Extreme is great stuff. But now I am confused. I could have sworn someone had complained before about their temps with that TIM. Perhaps it was a misapplication.


I just changed the original thermal paste on my 295X2's waterblock yesterday. Temps are now ~10°C down! Now I use Arctic MX-2.


----------



## wermad

I'm using Ceramique 2 and it's pretty decent.

As for the EK stuff, there was a TIM with no name they included. I got Gelid with my "Clean" CPU block but, with the earlier CSQ "crop circle" version I bought to match my EK Lightning crop-circle blocks, it didn't say Gelid. It was a bit runny from what I could observe. I think the EK manufacturer rep (I think it was tiborr) said it was Gelid but in generic packaging. I didn't pay too much attention to the temps since I had switched from a 4670K to a 4820K. My EK 780/Titan blocks did come with Gelid.

Btw, I was on my phone all day so it was hard to keep up with posts and updates. My office's wifi can get spotty but I managed through the day. Remember to fill out the form to be added to the owners list!


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Dual loops won't help tbh. I've stuck with single loops after my first major build in 09'.


This. They don't actually do anything, other than convenience or isolating increased water temps when stressing only the CPU or GPU. Not worth the money.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Just as a suggestion, don't use that EK TIM they give you with the blocks. Not only do they give you very little of it, its not exactly a high performer. So basically grab yourself your TIM of choice for the GPUs and PLX chip.
> 
> You may also want to get yourself some better thermal pads for the VRMs at least.


I love MX4. By far my favorite paste.


----------



## AeroXbird

Count me in to the club











Big thanks to everyone in this topic, it really helped to push me over the edge to buy this thing.


----------



## TooManyAlpacas

Welcome! I think you will really enjoy this card.


----------



## wermad

Quote:


> Originally Posted by *AeroXbird*
> 
> Count me in to the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Big thanks to everyone in this topic, it really helped to push me over the edge to buy this thing.


Tasty....yes....very tasty indeed.

Welcome


----------



## professionnal

Hi,

I just disabled one GPU of my 295X2 and GTA 5 runs better :/ no more stuttering, and my FreeSync finally seems to work.


----------



## rdr09

Quote:


> Originally Posted by *professionnal*
> 
> Hi,
>
> I just disabled one GPU of my 295X2 and GTA 5 runs better :/ no more stuttering, and my FreeSync finally seems to work.


Hello, welcome to OCN. GTA V works in CrossFire. It even works with Eyefinity. It's FreeSync that doesn't have multi-GPU support.


----------



## professionnal

Yes, but I get a lot of stuttering in CrossFire.


----------



## rdr09

Quote:


> Originally Posted by *professionnal*
> 
> Yes, but I get a lot of stuttering in CrossFire.


The upper right-hand corner of the page has the Rigbuilder. Fill it out so others can help, if you'd like help.


----------



## joeh4384

GTA crossfire for me has been pretty smooth.


----------



## PCModderMike

Well I got my 295X2 installed and running, and it's a *beast*.......when my system isn't crashing :|

It started while playing GTA V, and my first thought was maybe it's an issue with just the game. But it's crashing for every game, or benchmark, basically whenever the system is under full load. It will just be running full steam ahead, and then completely cut off.
My PSU is a Silverstone Strider ST85F-GS and it _should_ be able to handle the card no problems, but I think there is an issue with my sleeved cables, specifically my 24 pin. It was my first attempt at fully sleeving a power supply, and my first time splicing those pesky double wires together that are common on most power supplies. I don't think I did the best job on my splices and my guess is it's contributing to the issue. I pulled out my sleeved PCIe cables and swapped in some stock cables I still had, but the crashing was still happening.
I don't have another unsleeved stock 24 pin cable to test though. I did order some custom sleeved cables from http://www.ensourced.net/, but it's going to be at least a week before they arrive.

So I did the next logical thing and ordered a new PSU.







I have a Silverstone Strider ST1500-GS coming tomorrow. Now the first thing I want to do, is use the stock 24 pin cable that comes with the 1500W unit and test it on my 850W. Hopefully I'm right, and the system no longer crashes. Then I'll install the 1500W unit and just use that from now on, since I have another use for the 850W PSU elsewhere.

But if the system still crashes, guess I'll have to start pursuing other potential causes. I really don't feel like going down that road.


----------



## professionnal

Quote:


> Originally Posted by *rdr09*
> 
> The upper right-hand corner of the page has the Rigbuilder. Fill it out so others can help, if you'd like help.


Done thx


----------



## deep33

Quote:


> Originally Posted by *professionnal*
> 
> Yes, but I get a lot of stuttering in CrossFire.


All your stuttering issues will be over when you install the MSI Afterburner/RivaTuner package (free) and limit the framerate (60) there. MSI Afterburner works great with this card. Solved all issues with Skyrim, Mass Effect 3, etc. (to name a few).


----------



## rdr09

Quote:


> Originally Posted by *deep33*
> 
> All your stuttering issues will be over when you install the MSI Afterburner/RivaTuner package (free) and limit the framerate (60) there. MSI Afterburner works great with this card. Solved all issues with Skyrim, Mass Effect 3, etc. (to name a few).


Stutter in Skyrim? Modded?


----------



## wermad

Quote:


> Originally Posted by *PCModderMike*
> 
> 
> 
> Well I got my 295X2 installed and running, and it's a *beast*.......when my system isn't crashing :|
> 
> It started while playing GTA V, and my first thought was maybe it's an issue with just the game. But it's crashing for every game, or benchmark, basically whenever the system is under full load. It will just be running full steam ahead, and then completely cut off.
> My PSU is a Silverstone Strider ST85F-GS and it _should_ be able to handle the card no problems, but I think there is an issue with my sleeved cables, specifically my 24 pin. It was my first attempt at fully sleeving a power supply, and my first time splicing those pesky double wires together that are common on most power supplies. I don't think I did the best job on my splices and my guess is it's contributing to the issue. I pulled out my sleeved PCIe cables and swapped in some stock cables I still had, but the crashing was still happening.
> I don't have another unsleeved stock 24 pin cable to test though. I did order some custom sleeved cables from http://www.ensourced.net/, but it's going to be at least a week before they arrive.
> 
> So I did the next logical thing and ordered a new PSU.
> 
> 
> 
> 
> 
> 
> 
> I have a Silverstone Strider ST1500-GS coming tomorrow. Now the first thing I want to do, is use the stock 24 pin cable that comes with the 1500W unit and test it on my 850W. Hopefully I'm right, and the system no longer crashes. Then I'll install the 1500W unit and just use that from now on, since I have another use for the 850W PSU elsewhere.
> 
> But if the system still crashes, guess I'll have to start pursuing other potential causes. I really don't feel like going down that road.


It can happen. Someone was using adapters, and after troubleshooting it like crazy (with us as well), it was that simple. This beast is very picky about PSUs, hence why I felt compelled to put the list together. Speaking of custom 24-pins, I bought a Cooler Master V series 24-pin and swapped it in for my stock Lepa one (as well as the CPU cable). What a pita, and I know about the double spliced wires. Luckily, both had the double wires and the splice; I just needed to rearrange them. I made a diagram of both setups and it helped. Recently, one of the doubles pulled off the pin. I had to find my notes and diagrams to figure out what it was and how to fix it. Without them, I would have scrapped the cable and gone back to stock. One thing I always recommend: test your setup before you start modding things, if possible. That way, you know if things worked or not. I spent a few days testing my stock 295X2 with the V1000 and just fired it up with two for boot-up purposes. Once the G1600 arrived, I tested both for stability.

Btw, to all the GTA V users: wish I could help, but I found the previous version very boring and gave up after a few minutes. I do like third-person games but I just don't like the whole non-linear approach. I prefer linear tbh.

edit; freesync/crossfire news:

http://www.overclock.net/t/1553512/techreport-amd-delays-freesync-support-for-multi-gpu-systems


----------



## rakesh27

When I first played GTA V it was very stuttery; then, browsing the web, I saw someone say AMD had released driver 15.4 beta for GTA V.

So I installed it and tested again, and voila, no more stutter.

Weird thing though: I'm running trifire, and I noticed in MSI OC only 2 GPUs were running instead of 3. I think they may have fixed it for dual GPUs, but maybe the driver isn't perfected for tri or quad fire yet.

I'm just happy it's not stuttering... at least the game sees all 12GB of my vram, so I can set everything to maximum with playable framerates....

Try 15.4 beta; I've also tested a few other games and they seem fine and fully working.


----------



## deep33

Quote:


> Originally Posted by *rdr09*
> 
> stutter in skyrim? modded?


I only got/played Skyrim (for the first time) a week ago. Haven't in
Quote:


> Originally Posted by *PCModderMike*
> 
> 
> 
> Well I got my 295X2 installed and running, and it's a *beast*.......when my system isn't crashing :|
> 
> It started while playing GTA V, and my first thought was maybe it's an issue with just the game. But it's crashing for every game, or benchmark, basically whenever the system is under full load. It will just be running full steam ahead, and then completely cut off.
> My PSU is a Silverstone Strider ST85F-GS and it _should_ be able to handle the card no problems, but I think there is an issue with my sleeved cables, specifically my 24 pin. It was my first attempt at fully sleeving a power supply, and my first time splicing those pesky double wires together that are common on most power supplies. I don't think I did the best job on my splices and my guess is it's contributing to the issue. I pulled out my sleeved PCIe cables and swapped in some stock cables I still had, but the crashing was still happening.
> I don't have another unsleeved stock 24 pin cable to test though. I did order some custom sleeved cables from http://www.ensourced.net/, but it's going to be at least a week before they arrive.
> 
> So I did the next logical thing and ordered a new PSU.
> 
> 
> 
> 
> 
> 
> 
> I have a Silverstone Strider ST1500-GS coming tomorrow. Now the first thing I want to do, is use the stock 24 pin cable that comes with the 1500W unit and test it on my 850W. Hopefully I'm right, and the system no longer crashes. Then I'll install the 1500W unit and just use that from now on, since I have another use for the 850W PSU elsewhere.
> 
> But if the system still crashes, guess I'll have to start pursuing other potential causes. I really don't feel like going down that road.


AMD specifies at least 28A per 8-pin auxiliary (I would leave a safety factor of at least 1.35, or ~40 amps per connector, to be on the safe side). Keeping this in mind:

i) How old is your ST85F-GS (account for capacitor aging, etc.)? Account for how many power-suckers you have in your case (you could do the math yourself or use pcpartpicker or something similar for a coarse estimate). But here are the outputs I'm seeing for this PSU: [email protected], [email protected], [email protected], [email protected], [email protected]. And it'd be interesting to know how you had this connected while you were crashing away (rail to connector).

ii) But, if you already have your "brand new" ST1500-GS ([email protected], [email protected], [email protected], [email protected], [email protected], [email protected]) and have them individually powered through 2 separate rails, that's plenty, and you can simply forget about the math there.
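
A quick sketch of the arithmetic above (the 28A figure is AMD's stated minimum per 8-pin; the 1.35 safety factor is just the rule of thumb I'm suggesting, not an AMD spec):

```python
# Sketch of the safety-factor arithmetic for the 295X2's 8-pin connectors.
AMD_MIN_AMPS_PER_8PIN = 28   # AMD's stated minimum per 8-pin auxiliary connector
SAFETY_FACTOR = 1.35         # suggested headroom for capacitor aging etc.

recommended_amps = AMD_MIN_AMPS_PER_8PIN * SAFETY_FACTOR
print(f"Suggested headroom: ~{recommended_amps:.1f} A per connector")  # ~37.8 A, i.e. roughly 40 A

# On the 12V rail, AMD's minimum works out to:
watts_per_connector = AMD_MIN_AMPS_PER_8PIN * 12
print(f"Minimum per connector: {watts_per_connector} W")  # 336 W
```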


----------



## wermad

Quote:


> Originally Posted by *deep33*
> 
> I only got/played skyrim (for the first time) a week ago. Havn't in
> AMD specifies atleast 28A per 8pin auxillary (i would leave a safety factor of atleast 1.35 or ~40 amps per connector to be on the safe side). Keeping this in mind,
> 
> i) How old is your ST85F-GS ( account for capacitor aging, etc). Account for how many powersuckers you have in your case (you could do the math yourself or use pcpartpicker or something similar for a coarse estimate) . But here are the outputs i'm seeing for this psu [email protected], [email protected], [email protected], [email protected], [email protected] . And it'd be interesting to know how you had this connected while you were crashing away (rail to connector)
> 
> ii) But, if you already have your "brand new" ST1500-GS ([email protected], [email protected], [email protected], [email protected], [email protected], [email protected]) and have them individually powered through 2 seperate rails, that's plenty, and you can simply forget about the math there


You don't need 40 amps per connector. The list exists for PSUs meeting or exceeding AMD's requirements, so no need to start giving out useless advice. I'm running on 30-amp rails for each of my 8-pins and I've had no power issues. The rest of the system runs off the 25-amp rails. The only thing the list doesn't do is tell you the distribution. If you can't figure that out, ask.

Edit: don't mean to come off as rude, but power needs shouldn't be taken lightly with this beast. I apologize in advance if you felt this was a bit harsh.

-wermad


----------



## PCModderMike

40 amps.....per connector....


----------



## Mega Man


Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> 
> 
> Well I got my 295X2 installed and running, and it's a *beast*.......when my system isn't crashing :|
> 
> It started while playing GTA V, and my first thought was maybe it's an issue with just the game. But it's crashing for every game, or benchmark, basically whenever the system is under full load. It will just be running full steam ahead, and then completely cut off.
> My PSU is a Silverstone Strider ST85F-GS and it _should_ be able to handle the card no problems, but I think there is an issue with my sleeved cables, specifically my 24 pin. It was my first attempt at fully sleeving a power supply, and my first time splicing those pesky double wires together that are common on most power supplies. I don't think I did the best job on my splices and my guess is it's contributing to the issue. I pulled out my sleeved PCIe cables and swapped in some stock cables I still had, but the crashing was still happening.
> I don't have another unsleeved stock 24 pin cable to test though. I did order some custom sleeved cables from http://www.ensourced.net/, but it's going to be at least a week before they arrive.
> 
> So I did the next logical thing and ordered a new PSU.
> 
> 
> 
> 
> 
> 
> 
> I have a Silverstone Strider ST1500-GS coming tomorrow. Now the first thing I want to do, is use the stock 24 pin cable that comes with the 1500W unit and test it on my 850W. Hopefully I'm right, and the system no longer crashes. Then I'll install the 1500W unit and just use that from now on, since I have another use for the 850W PSU elsewhere.
> 
> But if the system still crashes, guess I'll have to start pursuing other potential causes. I really don't feel like going down that road.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It can happen. someone was using adapters and after t/s it like crazy (with us as well), it was that simple. This beast is very picky on psu, and hence why I felt compelled to put the list together. Speaking of custom 24-pin, I bought a Cooler Master V series ps 24pin and swapped it for my stock Lepa (as well as the cpu). What a pita and I know about the double spliced cables. Luckily, both had the double cables and the the splice. Just needed to rearrange them. I made a diagram of both setups and it helped. Recently, one of the doubles pulled off the pin. I had to find my notes and diagrams to find out what it was and how to fix it. Without them, i would have scrapped the cable and gone back to stock. One thing i always recommend, test your setup before you start modding things, if possible. That way, you know if things worked or not. I spent a few days testing my stock 295x2 with the V1000 and just fired it up with two for boot up purposes. Once the G1600 arrived, I tested both to test stability.
> 
> Btw, all those with GTA v users, wish i could help but I found the previous version very boring and gave up after a few minutes. I do like 3p games but I just don't like the whole un-linear method. I prefer linear tbh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit; freesync/crossfire news:
> 
> http://www.overclock.net/t/1553512/techreport-amd-delays-freesync-support-for-multi-gpu-systems





I just had to make a Lepa 24-pin..... it has a wire that piggybacks to FIVE different pins.... dear god


----------



## xer0h0ur

Wat


----------



## wermad

12v or 3.3v?


----------



## PCModderMike

Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> 
> 
> Well I got my 295X2 installed and running, and it's a *beast*.......when my system isn't crashing :|
> 
> It started while playing GTA V, and my first thought was maybe it's an issue with just the game. But it's crashing for every game, or benchmark, basically whenever the system is under full load. It will just be running full steam ahead, and then completely cut off.
> My PSU is a Silverstone Strider ST85F-GS and it _should_ be able to handle the card no problems, but I think there is an issue with my sleeved cables, specifically my 24 pin. It was my first attempt at fully sleeving a power supply, and my first time splicing those pesky double wires together that are common on most power supplies. I don't think I did the best job on my splices and my guess is it's contributing to the issue. I pulled out my sleeved PCIe cables and swapped in some stock cables I still had, but the crashing was still happening.
> I don't have another unsleeved stock 24 pin cable to test though. I did order some custom sleeved cables from http://www.ensourced.net/, but it's going to be at least a week before they arrive.
> 
> So I did the next logical thing and ordered a new PSU.
> 
> 
> 
> 
> 
> 
> 
> I have a Silverstone Strider ST1500-GS coming tomorrow. Now the first thing I want to do, is use the stock 24 pin cable that comes with the 1500W unit and test it on my 850W. Hopefully I'm right, and the system no longer crashes. Then I'll install the 1500W unit and just use that from now on, since I have another use for the 850W PSU elsewhere.
> 
> But if the system still crashes, guess I'll have to start pursuing other potential causes. I really don't feel like going down that road.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It can happen. someone was using adapters and after t/s it like crazy (with us as well), it was that simple. This beast is very picky on psu, and hence why I felt compelled to put the list together. Speaking of custom 24-pin, I bought a Cooler Master V series ps 24pin and swapped it for my stock Lepa (as well as the cpu). What a pita and I know about the double spliced cables. Luckily, both had the double cables and the the splice. Just needed to rearrange them. I made a diagram of both setups and it helped. Recently, one of the doubles pulled off the pin. I had to find my notes and diagrams to find out what it was and how to fix it. Without them, i would have scrapped the cable and gone back to stock. One thing i always recommend, test your setup before you start modding things, if possible. That way, you know if things worked or not. I spent a few days testing my stock 295x2 with the V1000 and just fired it up with two for boot up purposes. Once the G1600 arrived, I tested both to test stability.
> 
> Btw, all those with GTA v users, wish i could help but I found the previous version very boring and gave up after a few minutes. I do like 3p games but I just don't like the whole un-linear method. I prefer linear tbh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit; freesync/crossfire news:
> 
> http://www.overclock.net/t/1553512/techreport-amd-delays-freesync-support-for-multi-gpu-systems
> 
> 
> 
> 
> 
> 
> i just had to make a lepa 24pin..... it has a wire that pigybacks to FIVE different pins .... dear god



My condolences


----------



## professionnal

Same thing with the new patch: a lot of stuttering in GTA 5 with CrossFire enabled. With CrossFire off, my game is very smooth, no stuttering :/

I'm on AMD driver 15.4 beta.


----------



## BradleyW

Quote:


> Originally Posted by *professionnal*
> 
> same thing with new patch , lot of stuttering in GTA 5 with crossfire activated , crossfire off : my game is very smooth, no stuttering :/
> 
> I have AMD driver v15.4beta


Enabling Half Vsync seemed to reduce 85% of stuttering in CFX for me. 144Hz + half = Locked 72fps.


----------



## Alex132

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *professionnal*
> 
> same thing with new patch , lot of stuttering in GTA 5 with crossfire activated , crossfire off : my game is very smooth, no stuttering :/
> 
> I have AMD driver v15.4beta
> 
> 
> 
> Enabling Half Vsync seemed to reduce 85% of stuttering in CFX for me. 144Hz + half = Locked 72fps.

Half V-Sync on 60Hz monitors is also known as "Ubisoft Cinematic Mode".


----------



## rakesh27

Wouldn't you have V-sync off in all games? If you have a good setup you don't need V-sync, as I thought you get better frame rates instead of being locked to 60.

With it off you would still get 60, but anything higher is a bonus; at least this way you see the true fps you are getting.


----------



## PCModderMike

It's good to have V-sync off when benchmarking, to see the maximum FPS your system is capable of. But during normal gaming, most people enable V-sync to prevent screen tearing, which is common when your monitor only supports 60Hz yet you're getting much higher FPS.


----------



## electro2u

Quote:


> Originally Posted by *PCModderMike*
> 
> It's good to have vsync off if benchmarking to see what the maximum FPS your system is capable of getting. But during normal gaming, most people like to enable vsync to prevent screen tearing which is common when your monitor only supports 60Hz, yet you're getting much higher FPS.


Interestingly, I have better results using a frame limiter (same idea as V-sync, but at a different framerate than whatever Windows says the native refresh rate is). A lot of 60Hz HDMI displays report to Windows as the NTSC standard of 59.94Hz. Setting the frame limiter to exactly 60fps can sometimes improve a cyclical frame skip that occurs every 17 seconds or so as a result of 59.94fps vs. 60fps. Besides, that cyclical skipping usually tears at the exact same spot on the screen, and using a frame limit that doesn't match the refresh rate can help with that as well. If I'm getting a solid 120fps on iRacing.com I use a frame limit of 122fps instead of the monitor's 120Hz, because the spot where I get screen tearing is less noticeable (higher up) on screen.
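
For the curious, that "every 17 seconds or so" falls straight out of the timing mismatch. A quick sketch (assuming the panel really does scan out at exactly 59.94Hz):

```python
# Why a 60 fps limit on a 59.94 Hz panel skips roughly every 17 seconds:
# the game drifts ahead of the display by 0.06 frames per second, so a
# full frame of drift (one visible skip/tear) accumulates every 1/0.06 s.
native_hz = 59.94   # what many HDMI displays report to Windows (NTSC timing)
limit_fps = 60.0    # frame limiter setting

drift_per_second = limit_fps - native_hz   # frames gained per second (~0.06)
seconds_per_skip = 1.0 / drift_per_second
print(f"One skipped/torn frame every ~{seconds_per_skip:.1f} s")  # ~16.7 s
```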


----------



## wermad

We should have a list of scores. Should help those looking to compare, or to know what to expect in benches.


----------



## Feyris

Quote:


> Originally Posted by *wermad*
> 
> We should have a list of scores
> 
> 
> 
> 
> 
> 
> 
> . Should help those looking to compare or what to expect in benches.


Then we shall~ still need to update videos too...


----------



## wermad

I'll see if I can run some. My cpu is average and won't budge from 4.6 without some serious tweaking (don't really care tbh







).


----------



## PCModderMike

Well my 24 pin theory was correct. I removed the sleeved 24 pin cable I made, and swapped in the 24 pin from the new 1500W unit, and the 850W unit with the non-sleeved 24 pin did not crash the system while under load.
Sooooo, my splicing sucked.









Once done testing, I installed the 1500W unit to use from now on anyway, and it's a beast.


----------



## Dagamus NM

I got mine up and running today. Didn't really have a need for it but with what the price has dropped down to on these I just couldn't help it.

I want to see how well the card's double precision floating point performance translates into Adobe apps. I shot about 30 minutes of video on my 6D this afternoon. I will put it together and edit everything in Premiere at some point this week and report back.

I was glad to read that the multiple rail Lepa 1600 isn't an issue with this card. I currently have it in a box with an older rosewill 1300w single rail and it seems fine. A couple of heaven runs were just about flawless in the few spots where it usually binds a bit.


----------



## electro2u

So I got my 295x2 back from the person who bought it from me last week. I tested it out the wazoo before sending it off and it was running great. Not sure what happened but the thing is in need of repair now. It displays video (with some noticeable artifacting) in system BIOS and Windows safe mode, but installing drivers causes black screen once the drivers try to load.

Turned in an RMA request to MSI and got immediately approved. Haven't heard of too many issues like this. I'm pretty surprised; there are absolutely no signs of damage on the card as far as I can tell. Sort of curious what MSI will do. I'd be surprised if they have any to replace it with, and I'm not entirely sure it can be fixed, though I have no idea what is wrong with it. I'm kind of hoping they will give me fair market value, but with my luck probably not.


----------



## wermad

MSI is great. Luv their CS. I would have loved to get two of their Vesuvius cards, but I got a week-old XFX and a brand new XFX instead. They were awesome with helping me RMA a couple of Tahiti Lightnings.


----------



## AeroXbird

Hm, anyone here that has issues with this card downclocking itself during gaming?

While playing GTA 5 for example, the card randomly starts clocking down to 950 and sometimes even 700mhz.
All while the measured temperature is around 47C, which is WELL below the 75C throttle point...
Any ideas?

EDIT:

Just did some testing, when my VRAM usage in GTA5 goes above 7GB both cores start to throttle down regardless of temperature, could it be a game related issue, or driver related?
GPU Utilization seems to be going down once I get close to 6.5GB of VRAM, sometimes dropping down to 20% on both cores, causing a big framedrop.


----------



## xer0h0ur

Quote:


> Originally Posted by *electro2u*
> 
> So I got my 295x2 back from the person who bought it from me last week. I tested it out the wazoo before sending it off and it was running great. Not sure what happened but the thing is in need of repair now. It displays video (with some noticeable artifacting) in system BIOS and Windows safe mode, but installing drivers causes black screen once the drivers try to load.
> 
> Turned in an RMA request to MSI and got immediately approved. Haven't heard of too many issues like this. I'm pretty surprised, there is absolutely no signs of damage on the card as far as I can tell. Sort of curious what MSI will do. I'd be surprised if they have any to replace it with, and I'm not entirely sure it can be fixed, though I have no idea what is wrong with it. I'm kind of hoping they will give me fair market value, but with my luck probably not.


I wonder if they static shocked it. Other than that did you verify the same BIOS was present on both sides of the switch? Can't really think of anything else they may have messed with.


----------



## F4ze0ne

Quote:


> Originally Posted by *AeroXbird*
> 
> Hm, anyone here that has issues with this card downclocking itself during gaming?
> 
> While playing GTA 5 for example, the card randomly starts clocking down to 950 and sometimes even 700mhz.
> All while the measured temperature is around 47C, which is WELL below the 75C throttle point...
> Any ideas?
> 
> EDIT:
> 
> Just did some testing, when my VRAM usage in GTA5 goes above 7GB both cores start to throttle down regardless of temperature, could it be a game related issue, or driver related?
> GPU Utilization seems to be going down once I get close to 6.5GB of VRAM, sometimes dropping down to 20% on both cores, causing a big framedrop.


Yes. Mine does this as well after a few hours of playing. Only fix is to restart the game.


----------



## Alex132

6.5GB.... 7GB usage? Wuh?

Is GTA-V Mantle/DX12?


----------



## xer0h0ur

It's showing unified vRAM usage.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> I wonder if they static shocked it. Other than that did you verify the same BIOS was present on both sides of the switch? Can't really think of anything else they may have messed with.


Don't really think the buyer did anything to it necessarily. They have a very long history of good trading practices. I did test the card and it was running fine, but I seem to recall having a hard time getting it out of the PCIE slot when I was ready to pack it up. It's possible I damaged it somehow, although all the traces and connector look ok. Sometimes things get bumped around in shipping and die. Who knows.


----------



## xer0h0ur

Well either way I hope they replace it for you.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its showing unified vRAM usage.


So surely then 7GB = 3.5GB, 6GB = 3GB, etc.?

And is this in MSI AB or GTA V's VRAM measurer? Because even on my 460 I can comfortably go ~200MB above that.


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> So surely then 7Gb = 3.5GB, 6GB = 3GB. etc?
> 
> And is this in MSI AB or GTA V's VRAM measurer... because even on my 460 I can comfortably go ~200MB above that.


I don't own the game so I can't tell you which method people are using; however, if you have unified monitoring enabled in Afterburner then it will combine used vRAM just the same. I am not a game developer or driver engineer, so I can't give you an explanation of how or why it's using 3.5 per card. Either way, I would suspect vRAM usage is highly dependent on your gameplay settings.


----------



## AeroXbird

Quote:


> Originally Posted by *Alex132*
> 
> So surely then 7Gb = 3.5GB, 6GB = 3GB. etc?
> 
> And is this in MSI AB or GTA V's VRAM measurer... because even on my 460 I can comfortably go ~200MB above that.


The RAM on the R9 295X2 seems to be unified, as MSI AB only allows me to show GPU1 memory usage, and GPU-Z reports dedicated memory usage only on the main chip.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well either way I hope they replace it for you.


Thanks! I'm seriously hoping they can't and just give me money.


----------



## Mega Man

When you run out of VRAM, it can spill over into system RAM via the page file.


----------



## PCModderMike

Well that didn't last very long. Just submitted an RMA request with Newegg to have my 295X2 replaced.

https://www.dropbox.com/s/rjddb6xz8zjin4t/VIDEO0163.mp4?dl=0


----------



## Mega Man

I am lost, what's wrong? Looks functional to me, but I am on my phone.


----------



## PCModderMike

Listen closely.....may be difficult if you're on a mobile phone.


----------



## AeroXbird

Quote:


> Originally Posted by *PCModderMike*
> 
> Well that didn't last very long. Just submitted an RMA request with Newegg to have my 295X2 replaced.
> 
> https://www.dropbox.com/s/rjddb6xz8zjin4t/VIDEO0163.mp4?dl=0


Holy cow that is a serious case of coil whine :S


----------



## PCModderMike

Is it coil whine? I wrote up the RMA request saying I thought it was the pump for the water cooling failing.


----------



## AeroXbird

Quote:


> Originally Posted by *PCModderMike*
> 
> Is it coil whine? I wrote up the RMA request saying I thought it was the pump for the water cooling failing.


It's either the pump indeed or coil whine, when it starts it sounds a lot like coil whine, but now that you mention the pump, that is very likely too.


----------



## xer0h0ur

Is that a wind down sound?


----------



## PCModderMike

Quote:


> Originally Posted by *AeroXbird*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> Is it coil whine? I wrote up the RMA request saying I thought it was the pump for the water cooling failing.
> 
> 
> 
> It's either the pump indeed or coil whine, when it starts it sounds a lot like coil whine, but now that you mention the pump, that is very likely too.
Click to expand...

Yea, either way, it's gotta go.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Is that a wind down sound?


That's while idling.


----------



## wermad

I got lucky (knock on wood) with XFX. Not really my first choice, but there's a ton of these. Both still rocking.


----------



## PCModderMike

I'm hoping it was just a fluke. That's why I asked for a replacement and not a refund. The card itself is beast, and I really like it. Just an unfortunate inconvenience.


----------



## kcuestag

Quote:


> Originally Posted by *PCModderMike*
> 
> I'm hoping it was just a fluke. That's why I asked for a replacement and not a refund. The card itself is beast, and I really like it. Just an unfortunate inconvenience.


Can't tell without having the card next to me, but that does sound like a pump issue indeed.


----------



## wermad

Quote:


> Originally Posted by *kcuestag*
> 
> Can't tell without having the card next to me, but that does sound like a pump issue indeed.


This. Sounds like a pump. Coil whine is very screechy (think old-school dial-up) and changes as the load changes on the GPUs. This has a drone whine to it. Sounds like my old DDC when I dropped the voltage very low, right before it stopped. Either way, it's RMA time.


----------



## Mega Man

Meh, that's what happens with patent troll products. Wish they would just put a full block on it, or at least give me a no-cooler option.


----------



## xer0h0ur

So I need some help here. I wanted to put a personal touch on my build so I took the illuminated RADEON logo from the 295X2's shroud, made a cut out on my drive bay cover with a dremel and glued it on. So that part is done and ready:



My question then lies in powering it. Can I just get a mini 2-pin to Molex cable and go directly to the PSU, or do I need to connect this another way? I obviously don't just have another one laying around, so instead of frying it or something I figured I would defer to someone who has done this.


----------



## wermad

I would say it may be fed via a 12v line and stepped down. You can try a 3.3v line, and if that is no good, 5v would be the most I would try. Someone might know the voltage it gets. If you fry them, you can probably replace them tbh.


----------



## Mega Man

It is 12v. I have not tried lowering it, but PWM did not help (i.e. they don't seem to be LEDs...).


----------



## PCModderMike

Quote:


> Originally Posted by *Mega Man*
> 
> Meh that's what happens with patent troll products. Wish they would juat put full block or at least give me a no cooler option.


I have a full cover block on the way, so I'll be taking that stock cooler off anyway. But I paid for a working card, I should have it.


----------



## AeroXbird

Quote:


> Originally Posted by *PCModderMike*
> 
> I have a full cover block on the way, so I'll be taking that stock cooler off anyway. But I paid for a working card, I should have it.


That's awesome, I'm considering putting on a waterblock on my 295x2 myself.
But the initial costs just seem so steep, I'm also considering waiting for the 395x2 first before buying a water cooling loop.


----------



## wermad

Koolance has their block for $130.

Ppcs.com @$110 but oos.

Ebay, saw a koolance and AC block there for ~150-175.


----------



## PCModderMike

What can I say, I'm a total EK fanboy.














At the time I bought mine, getting the EK block from PPCs was more expensive than ordering it straight from EK. So mine ended up coming from EK.


----------



## Mega Man

I am not arguing at all. Just won't pass up a chance to slam patent trolls. Really wish asecrap had a rep on OCN... I would follow them around asking why they feel like they invented PC cooling.


----------



## Roaches

Quote:


> Originally Posted by *Mega Man*
> 
> I am not arguing at all. Just won't pass up a chance to slam patent trolls. Really wish asecrap had a rep on ocn... I would follow then asking why do they feel like they have invented pc cooling


Fortunately we do, but it seems like he's been inactive since 2011...

http://www.overclock.net/u/156004/asetek-stu

Same reason why I'll never get anything CLC. Too many failure points to warrant any real reliability, and I don't expect them to last more than a year or two.
Better off slapping a block on, or going air.


----------



## wermad

Quote:


> Originally Posted by *PCModderMike*
> 
> What can I say, I'm a total EK fanboy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At the time I bought mine, getting the EK block from PPCs was more expensive than ordering is straight from EK. So mine ended up coming from EK.


Once I saw AC price their block over $200, I knew others would follow. I was very surprised Koolance priced theirs at $180. Was gonna pick them up from Koolance directly (already had their bridge and 380i CPU block), but PPCs put them on sale for $107 and I picked up two asap! They're very lovely and the bridge system works perfectly (using a DIY block system for the middle ports).


----------



## xer0h0ur

I am a pretty big EK, Koolance and Alphacool fan myself. Always liked Koolance's CPU-380I block and as for GPU waterblocks I have always trusted EK for that. If I had to pick a block purely based on aesthetics then it would have been the Aquacomputer block. It just looks so damn sexy to me. I just had no idea if the rumors over its performance were true or not so I avoided buying those.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am a pretty big EK, Koolance and *Alphacool* fan myself. Always liked Koolance's CPU-380I block and as for GPU waterblocks I have always trusted EK for that. If I had to pick a block purely based on aesthetics then it would have been the Aquacomputer block. It just looks so damn sexy to me. I just had no idea if the rumors over its performance were true or not so I avoided buying those.


Avoid the WC gallery thread. It's pretty much a deathtrap for anyone who doesn't have a personal vendetta against them







.

Blocks are pretty heavy. Dry, w/ no port pieces or bridges, pad, TIM, and mounting screws, the Koolance block came in at 1kg. That's one reason I switched to a horizontal mb layout: to cope w/ the total weight and avoid sag.


----------



## xer0h0ur

Yeah, if I was going to be doing another build right now it would be within a similar style case with a horizontal layout for the motherboard as well since I would like to have the video card(s) visible from the side window(s) in their full glory.


----------



## AeroXbird

Correct me if I'm wrong, but doesn't the backplate with EK blocks help the rigidity of the card to reduce sag?


----------



## xer0h0ur

The block itself is already doing that job perfectly fine without the help of a backplate. The sagging issue is more related with the PCI-E slot itself than it is the PCB. Even the stock 295X2 is already rigid enough that the PCB doesn't sag. Its purely an issue of keeping the weight off the PCI-E slot.


----------



## wermad

This









Anyone using the Swiftech 6990 single slot bracket?


----------



## Agent Smith1984

Fishing line makes a nice "invisible" hanger...


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any using the Swiftech 6990 single slot bracket?


No, but with my EK block it will turn my card into a single slot version.


----------



## wermad

Quote:


> Originally Posted by *PCModderMike*
> 
> No, but with my EK block it will turn my card into a single slot version.


If I recall, derick or tiborr, said the 7990 single slot bracket was the same for the 295x2. Though you can't find any (I think their site still has em). From my memory archives, someone said the swiftech and ek 6990 single bracket was compatible w/ the 7990. So in turn, its plausible the swiftech (or the ek) 6990 single slot bracket will fit the 295x2. We had a good discussion on this but I never got one and don't know if anyone else went for it to try it.

Those who wanna get fancy or techie:

Single card users, the PC "power jack" if you can find them.



Dual or single card users, the Cooler master HAF-X vga support bracket is an option too:





Cooler master used to sell it separately, but now they include it in an "accessories kit" for the HAF-X for ~$15. Just go the the cooler master store to order.


----------



## electro2u

I got a Bitspower VGA support bracket but couldn't figure out how it would actually support a card, so it just sits there among my pile of useless computer crap.


----------



## wermad

Anyone on still on the stock cooler, this little bracket might come in handy (especially for crossfire):





http://www.frozencpu.com/products/17728/slf-13/Expansion_Slot_Side_Fan_Mounting_Kit_-_Black.html

When I was planning my quad setup, I was seriously considering leaving them stock as the cooler is so purrtty. I was gonna pick up one of these guys to use w/ two or three Corsair SP120 HPs to help cool the cards. There's a few more designs out there, but this one looks like it can handle more than one fan.

I know fcpu.com is closed but you should be able to search and find it elsewhere. Last i saw, ppcs.com had it but it was oos.


----------



## Mega Man

Quote:


> Originally Posted by *Roaches*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I am not arguing at all. Just won't pass up a chance to slam patent trolls. Really wish asecrap had a rep on ocn... I would follow then asking why do they feel like they have invented pc cooling
> 
> 
> 
> Fortunately we do but seems like he's been inactive since 2011...
> 
> http://www.overclock.net/u/156004/asetek-stu
> 
> Same reason why I'll never get anything CLC. Too many failure points to warrant any real reliability and don't expect them to last a year or two.
> Better off slapping a block or go air.
Click to expand...

Quote:


> Originally Posted by *PCModderMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any using the Swiftech 6990 single slot bracket?
> 
> 
> 
> No, but with my EK block it will turn my card into a single slot version.
Click to expand...

as does mine :/

I won't buy Koolance as their warranty is bs. I do have to admit I like the 295 block. Look though. I prefer my ek


----------



## wermad

I was gonna go w/ EK plexi/copper for the price (vs the acetal nickel) but then ppcs.com dropped the price on the koolance to $107 and I couldn't say no







.


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> No, but with my EK block it will turn my card into a single slot version.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I recall, derick or tiborr, said the 7990 single slot bracket was the same for the 295x2. Though you can't find any (I think their site still has em). From my memory archives, someone said the swiftech and ek 6990 single bracket was compatible w/ the 7990. So in turn, its plausible the swiftech (or the ek) 6990 single slot bracket will fit the 295x2. We had a good discussion on this but I never got one and don't know if anyone else went for it to try it.
> 
> Those who wanna get fancy or techie:
> 
> Single card users, the PC "power jack" if you can find them.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Dual or single card users, the Cooler master HAF-X vga support bracket is an option too:
> 
> 
> 
> 
> 
> 
> 
> Cooler master used to sell it separately, but now they include it in an "accessories kit" for the HAF-X for ~$15. Just go the the cooler master store to order.
Click to expand...

Love the horizontal motherboard layout on my S5. Honestly I would hate to have to use any of those options. None appear to be a clean option, and just look tacky.


----------



## wermad

Beauty is in the eye of the beholder







.

Most of the rigs are on air (check the owner's list) and some don't really care for the looks or may actually like these items. I think they're nice alternatives that have been proven to work. It's functional, and you can't take away from that. I tend to lean on function over form, which is why my rig isn't beautified to perfection. Just look at my last minute rig-up of my ssd







.


----------



## BootPirate

Hello there. I'm new here, joined up when I saw that there is a 295x2 owners club.

I have a question about the power connection that I haven't been able to find the answer to anywhere. I have a PCI-E power cable that runs from my PSU and splits into two 8-pin heads, which are connected to my 295x2.
My question is: should I actually have two separate PCI-E power cables running from my PSU to my card? My PSU has only one 12V rail, rated at 70 amps.


----------



## xer0h0ur

LOL

#UglyComputersStillMatter


----------



## xer0h0ur

Quote:


> Originally Posted by *BootPirate*
> 
> Hello there. I'm new here, joined up when I saw that there is a 295x2 owners club.
> 
> I have a question about the power connection that I haven't been able to find the answer to anywhere. I have a PCI-E power cable the runs from my PSU it splits into two 8 pin heads which are connected to my 295x2.
> My question is should I actually have two separate PCI-E power cables running from my PSU to my card? My PSU has only one 12V rail rated at 70 amps.


Scenario 2 would be the most ideal since it's not running 50A through a single cable. The cable itself can get quite hot in scenario 1. I presume it's not a modular power supply and you're talking about the way your wiring harness has the cable coming out of the power supply, right? If you have two separate PCI-E cables then run both; otherwise you really don't have any option.
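The per-cable load argument can be made concrete with rough numbers. This is a hedged sketch only: the ~500 W draw and three 12 V conductors per PCIe lead are my assumptions (check your card's real draw and your PSU's harness), not figures from the thread.

```python
# Rough per-conductor current when feeding a high-draw card from one
# PCIe cable (two 8-pin heads on a single lead) vs. two separate cables.
# Assumes all power arrives on the 12 V conductors of the PCIe cable(s).
def amps_per_wire(watts: float, cables: int, wires_per_cable: int = 3) -> float:
    total_amps = watts / 12.0                   # total 12 V current drawn
    return total_amps / (cables * wires_per_cable)

print(round(amps_per_wire(500, 1), 1))  # one cable:  13.9 A per wire
print(round(amps_per_wire(500, 2), 1))  # two cables:  6.9 A per wire
```

Halving the current per conductor also quarters the resistive heating in each wire (P = I²R), which is why the single lead gets noticeably warm.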


----------



## Alex132

Definitely situation 2. I have had issues with situation 1 with even a GTX690 - I would not want to imagine double that draw.

Also which PSU do you have?


----------



## BootPirate

It's a Thermaltake 850W. TPG 0850M to be exact. I believe it's listed in the spread sheet of PSUs on the first page.
It's a fully modular design.

The illustration is to show that one PCI-E cable has two 8 pin heads.


----------



## xer0h0ur

Okay, modular, always want to be running off two separate cables then. I figured it was a regular PSU and you were referencing how the cable was coming out of the PSU.


----------



## BootPirate

I'm going to switch it up right now.
Thanks for the quick reply guys.


----------



## wermad

16awg or 18awg cables? And what's the limit on 18?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> 16awg or 18 cables? i what's the limit on 18?


How do you find out btw?


----------



## wermad

It's typically printed on the cable's plastic cover.


----------



## BootPirate

The font on the cable is so tiny I can't tell if it's 10AWG or 16AWG.


----------



## Dagamus NM

It wouldn't be 10.


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> It's typically printed on the cable's plastic cover.


Not in my experience,

though most are 18ga.

Not all; the old Lepa cords were 16, but since they changed to the flat ones they are now 18.


----------



## BootPirate

Flat ones as in the ribbon style cables?


----------



## wermad

From what I can tell, most of the ribbon cables were 18awg. Though, the CM 1500w unit had some thick ribbon cables. These were a pain to route. I'm guessing they were thicker-coated 18s or some "thinner" 16awg. It is a heavy-duty unit, so I'm thinking it might have been the latter. They felt soft and not stiff, but were still a tad too thick to start origami cable management in tight quarters.


----------



## Alex132

My HX850 is definitely 16awg...

What do you guys think of this PSU?


----------



## Mega Man

Quote:


> Originally Posted by *BootPirate*
> 
> Flat ones as in the ribbon style cables?


yes

either way 18ga is plenty tbh


----------



## BootPirate

Quote:


> Originally Posted by *Mega Man*
> 
> yes
> 
> either way 18ga is plenty tbh


That's what mine are. I found a magnifying glass and took another look, they are 18awg.


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> My HX850 is definitely 16awg...
> 
> What do you guys think of this PSU?


I think it's better to look for a fully modular unit tbh. I sold my V1000 (gold, almost platinum, 83 amps) for ~$130, and it comes w/ ribbon cables. A V850 would be enough as well for a single card.


----------



## wermad




----------



## Tokuzi

Does anyone happen to have the BIOS for the Devil 13 R9 295x2 from Powercolor? I ordered an open box one and it would seem someone bricked it. I cannot find the original BIOS anywhere online.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> My HX850 is definitely 16awg...
> 
> What do you guys think of this PSU?
> 
> 
> 
> I think its better to look for a full modular unit tbh. I sold my v1000 (gold, almost platinum, 83amps) for ~$130 and it comes w/ ribbon cables. A V850 would be enough as well for a single card.
Click to expand...

My options are:

Sell my HX850 (doing already) and keep the TP1350W. Essentially "saving" ~$100.

Or sell both and get a good 850-1000w unit. Problem is the only ones I can find are too expensive, or worse than the TP1350W.

And semi-modular isn't the worst.


----------



## wermad

Hmmm, I could have shipped my V1000 to you. I've shipped to a few countries already. New Zealand is a popular destination, as their prices are very high; even the high cost of shipping plus preowned US prices still comes out less.

Quote:


> Originally Posted by *Tokuzi*
> 
> Does anyone happen to have the BIOS for the Devil 13 R9 295x2 from Powercolor? I ordered an open box one and it would seem someone bricked it. I cannot find the original BIOS anywhere online.


Any chance you can return or exchange it w/ the retailer/seller? Or try to rma with PC?


----------



## Roaches

Quote:


> Originally Posted by *Tokuzi*
> 
> Does anyone happen to have the BIOS for the Devil 13 R9 295x2 from Powercolor? I ordered an open box one and it would seem someone bricked it. I cannot find the original BIOS anywhere online.


Yeah I'll help.

Hit me with a PM for a dropbox link.


----------



## xer0h0ur

Roaches if you can you might as well just link it for everyone. I am pretty certain that he is not the first nor the last person to request the BIOS for the Devil 13 within this thread.


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> Roaches if you can you might as well just link it for everyone. I am pretty certain that he is not the first nor the last person to request the BIOS for the Devil 13 within this thread.


I guess why not then, well here ya go https://dl.dropboxusercontent.com/u/68747535/Hawaii.rom straight from GPUZ. Just uploaded it to TPU database which should take a day or two to show up there. Will delete the link tomorrow morning. Cheers.


----------



## Tokuzi

Quote:


> Originally Posted by *Roaches*
> 
> I guess why not then, well here ya go https://dl.dropboxusercontent.com/u/68747535/Hawaii.rom straight from GPUZ. Just uploaded it to TPU database which should take a day or two to show up there. Will delete the link tomorrow morning. Cheers.


Thanks! +REP


----------



## Mega Man

@EK-CEO if only there was a waterblock for the PowerColor Devil 13 Dual Core AXR9 290X

i bet it would be a beast card


----------



## wermad

hit up natemandoo


----------



## Roaches

Quote:


> Originally Posted by *Mega Man*
> 
> @EK-CEO if only there was a waterblock for the PowerColor Devil 13 Dual Core AXR9 290X
> 
> i bet it would be a beast card


If they did, I'd buy a third Devil 13 just to set up a mini-ITX build







Asrock's X99 board really tempts me to build one. But seriously, no complaints when you can get two for the price of one. Fantastic cards, upgrading from the green team side


----------



## xer0h0ur

Good man, I saved it in case anyone asks for it in the future.


----------



## xer0h0ur

Quote:


> Originally Posted by *Roaches*
> 
> If they did, I'd buy a third Devil 13 one just setup a mini-ITX build
> 
> 
> 
> 
> 
> 
> 
> Asrock's X99 board really temps me to build one. But seriously, no complaints when you can get 2 for the price of one. Fantastic cards upgrading from the green team side


For what it's worth, I had told the first Devil 13 owner in this thread about Alphacool 3D-scanning any video cards they have never made blocks for. The first people to submit their cards for scanning would get a free block. He claimed that he did put in to get his card scanned, so it may be worth asking Alphacool whether they ended up making one after all. Then there is of course the possibility of red-modding, aka making your own hybrid water-cooled setup.


----------



## wermad

You can go the AIO road like AMD. Maybe keep the center fan? Two H80 kits would be better than the single 120 rad AMD uses.

Custom, not sure if the EK Thermospheres are long enough. Would be cool to have two of them sticking out of the stock cooler


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> For what its worth, I had told the first Devil 13 owner from this thread about Alphacool 3D scanning any video cards they have never had to make blocks. The first people to submit their cards for scanning would get a free block. He claimed that he did put in to get his card scanned so it may be worth asking Alphacool if they ended up making one or not after all. Then there is of course the possibility of red-modding aka making your own hybrid water cooled setup.


Has there been any success in the community with people sending their cards in? Honestly, I feel pretty iffy about sending a brand new card overseas to AlphaCool just for them to scan it, when they may already have scanned the same card once before, if the owner you mentioned had any success and a block really came out of it. Also, I fear that when Fiji launches next month (if it does), the 295X2-tier cards might go EOL and not warrant any demand for a custom block.
Quote:


> Originally Posted by *wermad*
> 
> You can go the aio road like amd. Maybe keep the center fan? two h80 kits would be better then the single 120 rad amd uses.'
> 
> Custom, not sure if the EK thermospheres are long enough. Would be cool for two of them sticking out of the stock cooler


Not a fan of the whole AIO market scene, though it never hurts to get a reference card and block it on day one.


----------



## tagaxxl

Hey guys, I need your help with my graphics card. I don't know if my card has optimal performance. Also, my 2nd screen, a 17" (1280x1024) connected with a mini DisplayPort to VGA adapter, only works if I reconnect the adapter every time I boot my PC.

here are my specs:
motherboard: MSI X79A-GD45
CPU: i7-3820 3.6GHz, OC'd to 4.2GHz
RAM: Corsair 16GB 1600MHz
CPU cooling: Corsair H100i
graphics card: R9 295X2
power supply: Super Flower Leadex Platinum 1200W






benchmark with unigine valley: file:///C:/Unigine_Valley_Benchmark_1.0_20150506_1620.html

benchmark with Tomb Raider on the Ultimate preset: max 184 FPS, average 126 FPS

maximum temperature on GPU is 73°C


----------



## xer0h0ur

You're going to have to be a bit more specific about what you mean by "I dont know if my card has optimal performance."

If you're referring to your GPU temp(s), you can add a 2nd fan for push/pull to drop those temps a bit. Just looking at your pictures, I can see that you're pulling the hoses far too much in the opposite direction of the barbs on the radiator, essentially kinking the hoses and likely affecting coolant flow.

As for your issue with your monitor, are you using a passive adapter or an active adapter?


----------



## tagaxxl

I fixed the hoses a bit, but the adapter's box doesn't say whether it's active or passive. Where can I check which it is? Also, in most of the games I play, MSI Afterburner shows the GPU and CPU not working at 100%.


----------



## wermad

link to the adapter?


----------



## tagaxxl

http://bionic.com.cy/products/mini-display-port-to-vga-adapter-cable-for-mac

here you go


----------



## xer0h0ur

Do you have the model number for the adapter? As for GPU usage, it will depend entirely on the game you're playing and your settings for that game. For instance, I wouldn't expect GPU usage to be the same at 4K versus 1080p. People also say the CPU can affect GPU usage, but I can't really speak to that, and you're running yours at 4.2GHz, which seems fine to me.


----------



## Feyris

Swapped review video to linus, also considering removing the Quadfire video and just condensing it into the user submitted benchmark section, it will be an auto-form like for registration.

Maybe add this into op too?


----------



## tagaxxl

the website doesn't have any info

here are some pics




also, when I press Print Screen in game and then press Ctrl+V, the screen is black

didn't have that problem with the 706 GTX


----------



## xer0h0ur

Man, that is a rip-off price for a passive adapter, and for what it's worth, that link says it's for Mac use only. Amazon.com sells active adapters for much less than that.


----------



## tagaxxl

So an active adapter will solve my problem? If so, any suggestions on what to buy from eBay or Amazon?


----------



## xer0h0ur

To be entirely honest, I can't say with absolute certainty whether an active adapter will do the trick, since I haven't used a VGA connection in ages. However, when I was setting up my cousin's 6-monitor setup, the DisplayPorts needed active adapters to work properly, as they refused to work with passive adapters. He was going DisplayPort to HDMI, but regardless of what it was being converted to, active adapters were necessary.


----------



## tagaxxl

any good benchmark to see if my graphic card works ok?


----------



## xer0h0ur

3DMARK Firestrike/Firestrike Extreme/Firestrike Ultra

Heaven is another common benchmark that is used.

More than anything else you want to use a monitoring and logging application like gpu-z or afterburner so you can verify your gpu clocks are maintaining full speed throughout testing and not throttling down.
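If it helps, here's a rough sketch of the kind of check I mean, assuming you export the monitoring log as a CSV. The column name, the 1018MHz 295X2 boost clock, and the 2% tolerance below are just assumptions to adjust for whatever your logging tool actually writes:

```python
import csv

# Reference boost clock for the 295X2 is 1018 MHz; treat anything more than
# ~2% below that while under load as a possible throttle event. Both numbers
# are assumptions you should tune to your own card and logging tool.
BOOST_MHZ = 1018.0
TOLERANCE = 0.02

def throttle_samples(rows, clock_key="GPU Clock [MHz]",
                     boost_mhz=BOOST_MHZ, tolerance=TOLERANCE):
    """Return the logged samples whose core clock dipped below the floor."""
    floor = boost_mhz * (1.0 - tolerance)
    return [row for row in rows if float(row[clock_key]) < floor]

# usage (hypothetical file name):
# with open("gpu_log.csv", newline="") as f:
#     dips = throttle_samples(list(csv.DictReader(f)))
#     print(len(dips), "samples below the clock floor")
```

If that list comes back empty over a full benchmark run, the card held its clocks and throttling isn't your problem.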


----------



## tagaxxl

did the Heaven benchmark 'cause it's free, and the results are:

file:///C:/Users/Tagaxxl/Unigine_Heaven_Benchmark_4.0_20150506_2219.html

90%-100% GPUs most of the time... max 74°C

20%-70% CPU


----------



## wermad

sorry, busy w/ work. I'll check to see if you need an active adapter. I know Mega might have some input on this. Do you have the second monitor set up as an accessory monitor in Windows or CCC? Also, when you lose video, does Windows and CCC still see the second monitor (device manager as well)?


----------



## xer0h0ur

If I had to guess, I would bet that you were thermal throttling during Heaven if you registered 74C. You will rarely, if ever, see it show 75C, and if at all, only for a brief moment. Like I said before, though, you need to check your clock speeds when your GPU(s) are showing 74C. That is why I was saying you need to use an application that will log it for you over a period of time.


----------



## wermad

You can detach Afterburner's monitor and stretch it (I think you can also do this w/ Asus and Trixx).

So far, I'm not really seeing the need for active adapters unless it's going into Eyefinity. But, tbh, an active adapter would be ideal nonetheless.


----------



## wermad

double post


----------



## tagaxxl

What do you mean by "second monitor setup as an accessories monitor in windows or ccc"?
When I shut down and turn my PC back on, it loses the second screen (it can't be detected in Windows screen resolution or CCC), and I need to reconnect the mini DisplayPort to VGA adapter to get the 2nd screen working.

here some pictures of the settings at moment





good night for today....cya tomorrow


----------



## wermad

Accessory monitor refers to an extra monitor that isn't necessarily part of an Eyefinity group trying to create a single resolution. Your screenshot tells me it's correct. If you wanna order an active adapter, that's up to you. I would get it from Amazon (not one of their sellers) if possible, so you can return it in case you don't need it. Do you have any shops or stores that sell these? I'm sure somewhere that sells Mac stuff would carry active DP to VGA adapters.

btw, make the 1080 monitor your primary display.


----------



## xer0h0ur

Normally the shenanigans begin once you get to 3 monitors or more, with needing active adapters on the displayport connections. However he is connecting his better monitor through DVI and the lower quality screen through displayport so I don't know if that is playing a factor here.

I still don't really understand under what circumstances he ends up losing the video signal on the VGA connection though.


----------



## wermad

Maybe the GPU thinks that's the primary monitor and isn't getting a signal from it due to the passive adapter?

I have a spare 1080 I can try to replicate if I have similar adapters.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Normally the shenanigans begin once you get to 3 monitors or more, with needing active adapters on the displayport connections. However he is connecting his better monitor through DVI and the lower quality screen through displayport so I don't know if that is playing a factor here.
> 
> I still don't really understand under what circumstances he ends up losing the video signal on the VGA connection though.


This is correct. But those el cheapo adapters are known to fail.

If I am reading correctly, the VGA monitor is disconnecting? Sounds like adapter failure to me


----------



## Dagamus NM

So what do you hope to gain by continuing to use the low-res monitor? Might a better or matching monitor be a better option?

I am anxious to see how well my X2 performs in Adobe apps. Double precision on the card is better than the Titan X. Not really saying much there, as the TX is not a prosumer card, but the X2 should still be one.

I kind of want to get a second one just for the sake of doing it. I don't need it, but still, the emptiness in my case is nagging me.


----------



## xer0h0ur

If I were him and still wanted to keep using that monitor then I would get a cheapo dvi to vga adapter for that secondary monitor and I would run the 1080p panel off a displayport adapter.


----------



## tagaxxl

i use my old screen to be able to see all the extra stuff (skype, firefox and other chat programs) while i play on the main screen

i will try to see if your idea is cheaper to do xer0h0ur


----------



## Dagamus NM

Fair enough. I second the DVI to VGA and then mDP to whatever you use on the other monitors.


----------



## wermad

Quote:


> Originally Posted by *tagaxxl*
> 
> i use my old screen to be able to see all the extra stuff (skype, firefox and other chat programs) while i play on the main screen
> 
> i will try to see if your idea is cheaper to do xer0h0ur


did you make the 1080 monitor the primary one btw?


----------



## tagaxxl

yes, I made it my main screen

it's a shame that I need to waste so much money on adapters


----------



## xer0h0ur

Are displayport adapters really that expensive over there? These things can be had here for well under $20USD for active adapters and under $10USD for passive adapters.


----------



## tagaxxl

20 euro for a passive and 32-36 for an average active adapter here in Cyprus.

http://www.amazon.co.uk/gp/product/B004071ZXA/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=A3P5ROKL5A1OLE

http://www.amazon.co.uk/gp/product/B0041O7XHY/ref=ox_sc_act_title_2?ie=UTF8&psc=1&smid=A3P5ROKL5A1OLE

91 euro for both with shipping


----------



## xer0h0ur

Holy hell. They whack you guys over the head on prices over there.


----------



## Mega Man

You also gotta realize that on the UK site, due to local laws, prices are higher. Going from memory, the store has to handle warranty on PC equipment for 1-2 years.

That is one example of the things that are done that increase cost


----------



## xer0h0ur

By the way when I said a DVI to VGA adapter I was talking about like the free adapters they used to give with ATI/AMD video cards. I have loads of those things laying around.


----------



## wermad

My two xfx only came w/ dp to hdmi (passive) adapters.


----------



## xer0h0ur

Yeah my 295X2 only came with a dvi to hdmi passive adapter. 290X nothing lol


----------



## Mega Man

There is no such thing as an active DVI to HDMI adapter


----------



## BootPirate

I had to pay about the same amount for my DVI to mini DP adapters. They were about $25-$30 Canadian.


----------



## Orivaa

Wow, it's been a while since I was last here.

So I just tried to hit up Dragon Age Inquisition for the first time. Went very well for a while, until some lighting effects came into play, at which point it dropped to the 40s. I was recording, so I figured it was because of that, but when I turned off the recording, the framerate stayed the same - In the 40s.
So I hit up Afterburner, and it seems my second GPU is not being used at all. Why is this? I uninstalled the beta driver and reinstalled the Omega driver to no help. According to the driver notes, there _should_ be a DA:I CF profile in Omega.


----------



## xer0h0ur

Try the 15.4.1 driver. Even though I have zero idea what was changed from 15.4 to 15.4.1, it fixed my BSODing and my overclocked stability is back.


----------



## Orivaa

I think that was the one I was using before, but I'll give it a bash.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Try the 15.4.1 driver. Even though I have zero idea what was changed from 15.4 to 15.4.1, it fixed my BSODing and my overclocked stability is back.


Didn't make any difference. Second GPU won't even go over 430-ish mhz in the game.


----------



## xer0h0ur

Hmm. Well that is odd. At least it fixed my Dying Light tri-fire performance. I now have both cores on the 295X2 pegged nearly 100% and the 3rd GPU fluctuates. Sometimes they alternate on which ones are at high usage but for the most part I get full usage out of two GPUs and the 3rd is up and down. I finally started playing it as I can maintain good framerate @ 4K.


----------



## Orivaa

Maybe you could test Inquisition for me, if you own it. See what results you get.


----------



## Orivaa

Well, now it says it's being used, but the MHz stays at 400-ish, even though the usage goes up and down.


----------



## xer0h0ur

Would if I could but I am at work and don't own the game


----------



## Mega Man

just a quick run shows over 100 fps and all 4 gpus being properly used but this is in eyefinity and ultra all but MSAA

have you moved the power slider to max ?


----------



## Orivaa

Quote:


> Originally Posted by *Mega Man*
> 
> just a quick run shows over 100 fps and all 4 gpus being properly used but this is in eyefinity and ultra all but MSAA
> 
> have you moved the power slider to max ?


Power slider? What power slider?


----------



## xer0h0ur

I presume he means the power limit slider in Afterburner.


----------



## Orivaa

Didn't change a thing.


----------



## xer0h0ur

So have you verified it's not thermal throttling on you? As long as you're able to keep the temps in check, you can try increasing the power limit.


----------



## Orivaa

How would it be throttling? It's not like it gets any time to actually heat up. At any rate, I did up the power limit. Made no difference.


----------



## xer0h0ur

Yeah I knew it wouldn't likely be temp throttling that quickly, just figured I would ask. Could also throttle because of a hot PLX chip or overheating VRMs but again not likely if clocks are fluctuating from the get go. Sounds like a driver/game issue after all. I get the same problem with AC:U ever since the Omega but have zero idea if its a problem introduced by a game patch or if its the driver as no other game does that to me.

For what its worth, the workaround fix I was using in AC:U to force steady clocks was to create an application profile and force crossfire into AFR Friendly. That in itself worked but then would give me flashing textures for water.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I knew it wouldn't likely be temp throttling that quickly, just figured I would ask. Could also throttle because of a hot PLX chip or overheating VRMs but again not likely if clocks are fluctuating from the get go. Sounds like a driver/game issue after all. I get the same problem with AC:U ever since the Omega but have zero idea if its a problem introduced by a game patch or if its the driver as no other game does that to me.
> 
> For what its worth, the workaround fix I was using in AC:U to force steady clocks was to create an application profile and force crossfire into AFR Friendly. That in itself worked but then would give me flashing textures for water.


Already tried AFR and it didn't work.
I really do not understand why it isn't working. The game is decently optimized and I'm using the latest version. I also have one of the latest drivers, which include a profile for DA:I. It _should_ work.


----------



## Mega Man

let me ask a few things

1. how much page file do you have? in my experience you need it; I used to run 1GB, now I run far more.

2. what settings and what resolution? (graphics of course)

3. how much AA are you running?


----------



## Orivaa

Quote:


> Originally Posted by *Mega Man*
> 
> let me ask a few things
> 
> 1. how much page file do you have? in my experience you need it; I used to run 1GB, now I run far more.
> 
> 2. what settings and what resolution? (graphics of course)
> 
> 3. how much AA are you running?


1. Does page file affect GPU usage? Regardless, I don't know how to answer. Where can I find page file, and how can I allocate it?

2.
Mesh - Ultra
Tessellation - High
Textures - Ultra
Shadows - High
Terrain - High
Vegetation - High
Water - High
Post-process - Low
Ambient occlusion - SSAO
Effects - Ultra
Post-process AA - High
AA - Off
Shader - High
Resolution - 1080p

Mind you, those are tweaked. If I were to choose automatic, the game'd set most of that to Ultra or higher. (Like textures can be "fade-touched")

I fail to see how any of that would affect the fact that my 2nd GPU is not being used, however.


----------



## xer0h0ur

Does that game have any Nvidia GameWorks options? If so don't use them. Bad performance on AMD cards for the majority of GameWorks titles.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> Does that game have any Nvidia GameWorks options? If so don't use them. Bad performance on AMD cards for the majority of GameWorks titles.


I think it has HBAO and HBAO+, but otherwise, no. Unless I don't know what Nvidia GameWorks options are.


----------



## Orivaa

I also fired up Crysis 3, and it utilizes both GPUs well, and at the correct core clocks.

I've been holding off on DA:I for a long time, and I do not want to wait until I get a 390X before actually playing the game simply because I can't get CF to work.


----------



## Mega Man

I'll load up my settings next time I can for you.

The page file can affect usage although usually you get kicked outta the game when you don't have enough. I have seen some performance suffering too.

I literally just installed the game though and it works fine with all 4 gpus


----------



## Tizzie

I'm looking to change the stock radiator fan to Noctua fans; does anyone have experience with this? Noctua uses 4 pins, so it doesn't fit the 3-pin connector that's on the card...

Is it okay to just plug it into the motherboard?


----------



## xer0h0ur

Quote:


> Originally Posted by *Tizzie*
> 
> I'm looking to change the stock radiator fan to Noctua fans; does anyone have experience with this? Noctua uses 4 pins, so it doesn't fit the 3-pin connector that's on the card...
> 
> Is it okay to just plug it into the motherboard?


Yeah, you can connect it up however you please, really, as long as you're not trying to draw the power of two Noctua fans from the connector on the video card; that would more than likely be too much power draw. Can you manually control the fans if you connect them to your motherboard?
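For a sanity check on the power-draw point: motherboard fan headers are commonly rated around 1A at 12V, while a typical 120mm Noctua pulls on the order of 0.05-0.1A. Those figures are rough datasheet assumptions, not specs for the 295X2's own connector, but the arithmetic below shows why two fans on a motherboard header are usually fine:

```python
# Rough headroom check: sum the fans' rated currents and compare against
# the header's rating. All figures are assumed typical values, not specs
# for any particular board or card.
HEADER_RATING_A = 1.0  # common motherboard fan-header rating (~12 W at 12 V)

def header_has_headroom(fan_currents_a, rating_a=HEADER_RATING_A):
    """True if the combined rated draw stays below the header's rating."""
    return sum(fan_currents_a) < rating_a

print(header_has_headroom([0.05, 0.05]))  # two typical 120 mm fans -> True
print(header_has_headroom([0.6, 0.6]))    # two high-draw fans -> False
```

The same arithmetic is why splitters are fine for low-draw fans but risky for high-draw ones: the header sees the sum.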


----------



## smoggysky

I just finished building a new rig around the R9 295X2. I have a question. I was using the DVI rear connector with the HDMI converter that came with the card and was running the video and audio through my SC05 receiver (no problems). I just purchased 2 mini DisplayPort 1.2 cables. They work fine. The problem is I cannot get the m/b BIOS screen to appear through either port. The m/b UEFI BIOS screen will only appear through the DVI/HDMI adapter. Is there a setting I am missing? After I installed the mini DisplayPort cables, I didn't remove the DVI/HDMI connector. Could the card still see the adapter in the port and refuse to switch to the mini DisplayPorts? Or is this an issue that I have to live with? The DisplayPorts load into Windows fine. There appears to be a marked improvement in audio and video through the mini DisplayPorts. Any and all advice is appreciated.
system configuration:
Rampage V Extreme
EVGA 1600 PSU
5960X @ 4.00GHz 1.2v
G.Skill [email protected]
Samsung XP941 M.2 ultra drive w/ 7 64-bit Ultimate
(2) Sapphire R9 295X2's
(2) custom water loops (1 video, 1 CPU)


----------



## professionnal

Is my old Corsair TX750W OK for a 295X2?

i5 4670k oc @ 4ghz
2 ssd
2 hdd


----------



## Orivaa

Quote:


> Originally Posted by *Mega Man*
> 
> I'll load up my settings next time I can for you.
> 
> The page file can affect usage although usually you get kicked outta the game when you don't have enough. I have seen some performance suffering too.
> 
> I literally just installed the game though and it works fine with all 4 gpus


I solved it, and it was the stupidest freaking thing. So apparently, despite having set the damn thing to Fullscreen, it decided to change itself to Windowed Fullscreen.


----------



## xer0h0ur

Quote:


> Originally Posted by *Orivaa*
> 
> I solved it, and it was the stupidest freaking thing. So apparently, despite having set the damn thing to Fullscreen, it decided to change itself to Windowed Fullscreen.


----------



## xer0h0ur

Quote:


> Originally Posted by *smoggysky*
> 
> I just finished building a new rig around the r9 295x2. I have a question. I was using the dvi rear connector with the hdmi converter that came with the card and was running the video and audio through my sc05 receiver (no problems). I just purchased 2 mini displayport 1.2 cables. they work fine. the problem is I cannot get the m/b bios screen to appear through either port. the m/b uefi bios screen will only appear through the dvi/hdmi adapter. is there a setting I am missing? after I installed the mini displayport cables, I didn't remove the dvi/hdmi connector. could the card still see the adapter in the port and will not switch to the mini displayports? or is this a issue that I have to live with? the displayports load to windows fine. there appears to be a marked improvement in audio and video through the mini displayports. any and all advice is appreciated
> system configuration:
> rampage v extreme
> egva 1600 psu
> x5960 @ 4.00ghz 1.2v
> gskill [email protected]
> Samsung xp941 m.2 ultra drive w/7 64bit ultim.
> (2) sapphire r9 295x2's
> (2) custom waterloops (1-video,1-cpu)


How many total monitors are you using?


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*


I KNOW, RIGHT?!


----------



## xer0h0ur

Yeah I completely forgot about crossfire disabling itself when windowed since its not running on Mantle.


----------



## BradleyW

What do you all think of this?



Look at the mouse input graph. FreeSync ON + Vsync OFF and FreeSync ON + Vsync ON can sometimes have the same amount of input delay.


----------



## smoggysky

xer0h0ur, I have one plasma TV. My original setup was card 1 using the DVI/HDMI connector through my Pioneer SC05 receiver to the TV, and I had a second input from the second card through its DVI/HDMI to a second HDMI on the TV (this input was mostly disabled unless I wanted to use my receiver in "amplifier mode" for music or games from my PC). I would switch back and forth occasionally (lol). Now I have both inputs on the primary card. My goal is to use a single mini DisplayPort for my total PC usage. No more switching. The audio and video signal through the mini DisplayPort is far superior to any signal I have seen before. I am blown away by the improvement in quality of sound and graphics.
please advise..


----------



## xer0h0ur

Well, as far as I understand it, you can't mix video connections from both cards when crossfired. If I am correct, you can only use the video outputs on one of the cards when crossfired, so I don't know if this is part of your problem. Someone else may be able to clarify.

I stick to using the video outputs on the card in the primary PCI-E slot.


----------



## smoggysky

Right now I am using only the primary card. I have one mini DisplayPort cable/HDMI going through my receiver to the TV, and the DVI/HDMI port going directly to the TV. My UEFI BIOS screen does appear on the DVI/HDMI cable without problem. I just switch the TV input from "game" (DVI/HDMI) to "dvd" (mini DisplayPort) to get the ultra-rich audio/video. It works, it's just an inconvenience. I guess later I'll try unplugging the DVI/HDMI adapter and see if the total signal goes through the mini DisplayPort. This I have not tried yet.


----------



## smoggysky

the catalyst center shows both. I just pick the mini displayport as the primary default.


----------



## xer0h0ur

I don't know off hand if the video card is interpreting something being connected by having the DVI to HDMI adapter connected. If you're not using that output there is no reason to keep it connected though. You're using a unique setup I have no experience with though. I don't know why you're connecting two outputs for use on a single TV.


----------



## rakesh27

I think I understand what he's doing.

1) One cable is going from PC to audio receiver - sound (HDMI)
2) Another cable is going from PC to HDTV - picture (DVI to HDMI)

I would have thought that if your audio receiver has HDMI pass-through, you could do away with the second cable; instead you could stick with PC to audio receiver, then audio receiver to HDTV....

The way I think you should do it is like this:

1) PC to HDTV - picture
2) PC to audio receiver via a Toslink optical cable for the best sound setup.

The advantage of doing it like the above is you get the best picture quality and resolutions, and sound is at its best via the optical cable.

This is the way I do it; it gives you the best results. I could be entirely wrong in what I've explained; if so, my bad....


----------



## glenn37216

Well, I finally decided to join the club. Loving this card so far. (Although I am a little disappointed with how AMD dropped the ball with the current drivers in Project Cars.) I haven't had any overheating/thermal throttling in games, but I have noticed I can't overclock this specific card at all. No idea why that is. Setting clocks just defaults back to stock once a stress test is running. I think this could be an MSI Afterburner issue... don't know really, but I have to do some more digging later.

I managed 17305 in Firestrike at the default settings. (PC specs in sig.) http://www.3dmark.com/fs/4743494 Temps after benchmark: 58C max GPU @ 68F ambient room temperature. I'm impressed that the little rad on this card is so efficient.

For the best possible temps I ended up mounting the rad in a push/pull config (Corsair SP120's) as an exhaust in the front of the case. I did have it mounted in the rear, but I moved it because temps were 7-10C higher...

A few pics;






Overall I'm happy . Glad I found this thread... It was good encouragement on deciding if the time was right in buying a dual gpu solution.







Just hope this one lasts me as long as my others have. I still have my 4870x2 and 5970 lying around..


----------



## xer0h0ur

Did you go into the CCC and accept the overdrive disclaimer then enable overdrive? It may be conflicting with Afterburner as the two don't play nice together. I keep it disabled in the CCC and control clocks strictly through Afterburner.


----------



## glenn37216

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you go into the CCC and accept the overdrive disclaimer then enable overdrive? It may be conflicting with Afterburner as the two don't play nice together. I keep it disabled in the CCC and control clocks strictly through Afterburner.


...I think you're right. I accidentally left CCC clocks on. Crap. Hours of tweaking settings and it's the obvious thing that I end up overlooking.


----------



## xer0h0ur

No doubt, usually seems to be the simplest things we forget about. Like the guy having issues with getting crossfire to work in DA:I. No full screen = no crossfire. Simple enough however easy to forget.


----------



## Orivaa

Quote:


> Originally Posted by *xer0h0ur*
> 
> No doubt, usually seems to be the simplest things we forget about. Like the guy having issues with getting crossfire to work in DA:I. No full screen = no crossfire. Simple enough however easy to forget.


It changed on its own. I set it to normal Fullscreen when I first booted it up, so I figured it'd stay that way.


----------



## xer0h0ur

Not placing blame on anyone or anything. Just saying that we went through multiple possibilities before you even considered one of the simplest ones which I forgot about altogether. Which turned out to be the reason why crossfire was not enabling.


----------



## GreenGoblinGHz

Build is coming along nicely (Intel build logs - "Blue Beast")

i7-4790K
R9 290 + R9 295X2
ASRock Z97 Extreme9
1600W Super Flower Leadex PSU

Some pics... will add moar later:




Backplate is for the 295X2 (painted cloudy blue)

Sincerely:
Druizza

FYI: Test configuration. Using 2x loops, so I have tested everything b4 final assembly. Also waiting for a Plextor 128GB ultra M.2 SSD, gonna be my boot drive, + if I can stretch my funds, also gettin a basic M.2 SSD (500GB)


----------



## PCModderMike

Seeing all these posts about people having fun with their 295X2's......and I'm over here waiting on mine to return from RMA like....


----------



## Alex132

Quote:


> Originally Posted by *PCModderMike*
> 
> Seeing all these posts about people having fun with their 295X2's......and I'm over here waiting on mine to return from RMA like....


Seeing all these posts about people having fun with their 295X2's......and I'm over here waiting on mine to return ship since 3rd March like....


----------



## xer0h0ur

The feels are legit


----------



## Roaches

I've run into a bit of an issue today: running any 3D-related applications and games causes my PC to lock up. I suspect my second card might be faulty, but it may not be, since the onboard USB header port at the bottom, next to the 7th PCI-E slot of the board, has issues detecting connected devices. The first card is fine, since I can run anything in 2-way CFX, but with anything 3- or 4-way CFX enabled, the PC locks up instantly when starting games. I'm currently swapping hardware and testing these cards in another rig to verify whether the second card is faulty. I'd hate to RMA one of my Devil 13s since it's never been touched since being installed day one in my main system


----------



## Roaches

Yep, she's dead, Jim. No display signal on the primary DVI port, but the rest of the ports work and output a signal. The card locks up the system when CFX is enabled while running any 3D applications. The second GPU core on the second card is faulty, but the primary GPU core works fine when CFX is disabled.









Now how to explain a half dead card to Powercolor.....


----------



## xer0h0ur

Your avatar makes me lol so hard. I end up staring at it for a dozen loops every time.

So, back on topic: when you were testing the card you deemed faulty, were you doing so in the PCI-E slot you were using for the fully functioning card? I only ask for the sake of eliminating the possibility you mentioned of PCI-E slot problems.

I have twice had issues with certain drivers causing BSODs immediately upon trying to load games. When disabling crossfire and only leaving the 295X2 crossfired, it worked fine, but tri-fire would go emo on me. The first time, I was able to limit it by not running the Afterburner OSD at all, or flat out closing Afterburner so RivaTuner doesn't run. The 2nd time was with the 15.4 driver, and nothing stopped the BSODing with select games until I DDUed and installed the 15.4.1 driver.


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> Your avatar makes me lol so hard. I end up staring at it for a dozen loops every time.
> 
> So back on topic, when you were testing the card you deemed faulty were you doing so in the PCI-E slot you were using for the fully functioning card? I only say for the sake of eliminating the possibility you were stating of PCI-E slot problems.


PCI-E slots work fine, since I swapped positions with the other card and used the 4th PCI-E slot as the primary GPU display to test the fully functioning primary card, with no issues to be found... I've already narrowed it down to the second card being faulty, but interestingly the card is half dead: the second GPU core hard locks the system when CFX is enabled while running any games or 3D-accelerated programs like SolidWorks.

I'm currently doing a fresh driver install on another rig to double-verify that my second card is truly faulty before contacting Powercolor.
But I don't have high hopes as of now, since doing a full driver wipe and reinstall on my main system hasn't remedied the issue my second card has.


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have twice over had issues with certain drivers causing BSODs immediately upon trying to load games in the past. When disabling crossfire and only leaving the 295X2 crossfired it worked fine but tri-fire would go emo on me. First time I was able to limit it by not running the Afterburner OSD at all or flat out closing Afterburner so RivaTuner doesn't run. The 2nd time was with the 15.4 driver and nothing stoped the BSODing with select games until I DDUed and installed the 15.4.1 driver.


Have not run into any BSODs yet with my 290X2 Quadfire since day one; just the black screen on wake issue that pretty much most encountered on this forum, other than that solid experience until now that happens to relate to hardware instead of drivers.

Update: Yep, it's half dead. The primary DVI port still has no signal after POST, and after installing drivers on the other PC, running anything that uses 3D acceleration locks the system up instantly. I'm gonna have to fill out an RMA form for this one.

Wish me luck folks.


----------



## xer0h0ur

Well at least you still have one working fine. Hopefully you get sorted out quickly.


----------



## BradleyW

How do I enable Crossfire in Windows 10? No option in CCC. Both cards detected in device manager. No issues in Win 7/8/8.1.

Thank you.


----------



## smoggysky

xer0h0ur, I solved my BIOS screen loading problem with the mini DisplayPorts. My BIOS was set to UEFI/Legacy boot; when I set it to UEFI-only, the BIOS screen can now be seen through the mini DisplayPorts. Who knew. Just sharing my discovery in case any of our members have a similar issue.

I appreciate your help, thank you.


----------



## Alex132

Quote:


> Originally Posted by *BradleyW*
> 
> How do I enable Crossfire in Windows 10? No option in CCC. Both cards detected in device manager. No issues in Win 7/8/8.1.
> 
> Thank you.


Should be automatically enabled when you install the drivers. Test in a game/benchmark to see if it is.


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well at least you still have one working fine. Hopefully you get sorted out quickly.


Just submitted an RMA form and finally got an RMA number after many attempts at refilling the form; I kept getting "Serial Number does not exist in our database".
Their website is pretty wonky at times, as if their servers are overloaded by traffic or something...

Well, here's hoping they'll let me send in my card. Right now I have a freaking five-pound paperweight sitting right next to me lol.


----------



## wermad

That sucks. I had similar issues w/ MSI and getting a Lightning Tahiti added to the form. After several days of trying, I was about to call, but it went through eventually. Maybe it was a system issue? Well, at least you're one step closer.


----------



## xer0h0ur

Quote:


> Originally Posted by *smoggysky*
> 
> xer0h0ur, I solved my BIOS screen loading problem with the mini DisplayPorts. My BIOS was set to UEFI/Legacy boot; when I set it to UEFI-only, the BIOS screen can now be seen through the mini DisplayPorts. Who knew. Just sharing my discovery in case any of our members have a similar issue.
> 
> I appreciate your help, thank you.


Good to know, thanks for the update. Cheers


----------



## Dagamus NM

So at the moment I have my X2 hooked up through the mini DisplayPort to HDMI adapter. I also have the DVI to HDMI adapter. Is there any difference in running one adapter vs the other?


----------



## BradleyW

Quote:


> Originally Posted by *Alex132*
> 
> Should be automatically enabled when you install the drivers. Test in a game/benchmark to see if it is.


Does not appear to be enabled.


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> So at the moment I have my x2 hooked through the mini display port to hdmi adapter. I also have the DVI to HDMI adapter. Is there any difference in running one adapter vs the other?


What res are you running? If you're not doing Eyefinity or 4k, you should be fine. You have both displays working properly?


----------



## xer0h0ur

The main difference between the dual-link DVI port and the DisplayPort is the bandwidth available. A DisplayPort 1.2a output on the 295X2 is capable of driving up to four 1080p monitors, two 1440p monitors, or one 4K monitor.


----------



## PCModderMike

I recall trying to use the mini DisplayPort to HDMI adapter when I first received my card, but at 3440x1440 it only allowed me to run at a 30Hz refresh rate and it looked terrible. Eventually I bought a mini DisplayPort to regular DisplayPort adapter.


----------



## wermad

my 4k sammy goes to t-t-t-t-thirty hertz using the mini dp to hdmi


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> That sucks. I had similar issues w/ MSI and getting a Lightning Tahiti added to the form. After several days of trying, I was about to call, but it went through eventually. Maybe it was a system issue? Well, at least you're one step closer.


They were quick to respond; after coming back from a break I already had an email with an RMA number and shipping address.

At the moment, I've decided to back up everything important and do a format on my main drive, as my primary card would sometimes lock up the system at random times today. Something tells me I should RMA both of my cards, though I'm doing further testing to see if I can recreate the issue on a freshly installed Windows.

Since this morning I couldn't run anything 3D-related: SolidWorks, 3ds Max, Skyrim, or any other game. It would lock up my system instantly unless I disable Crossfire. The day before, everything was working like a charm.


----------



## xer0h0ur

Well that is an HDMI 1.4 port limitation to 30Hz. That is why people were using dual HDMI cables to get a 60Hz refresh rate on some 4K monitors. HDMI 2.0 supports 60Hz 4K. I am curious to see when we will get DP 1.3 for 120Hz.


----------



## wermad

Quote:


> Originally Posted by *Roaches*
> 
> They were quick to respond; after coming back from a break I already had an email with an RMA number and shipping address.
> 
> At the moment, I decided to backup everything important and do a format on my main drive as my primary card would sometimes lock up the system at random times today. Something tells me I should RMA both of my cards though I'm doing further testing to see if I can recreate it in a newly installed windows.
> 
> Since the morning I couldn't run anything 3D related, from Solidworks, 3ds max, Skyrim and any other game. It would lock up my system instantly unless I disable Crossfire. The day before, everything was working like a charm.


Possible clashes w/ the PLX chip on your mb and the D13 cards? I've heard of something like this before, but tbh, my board has a PLX and I never ran into issues with one or two cards. I actually tested everything on a Sniper Z87 w/ PLX as well. But it doesn't hurt to check if your mb needs a bios update. Any heavy use prior to the issues? The heat these beasts put out must be insane at full speed. I can barely tolerate the heat of my wc setup w/ the fans at 40%. If the other one is giving you issues, I would wait for the rma return. At least you can test the returned one while the other goes off to rma.


----------



## Dagamus NM

It did strike me as a little odd that the x2 does not come with a mini display port to display port adapter. I have a bunch from my 7950's and 7970's.

Anyhow, I am using a Sony TV in my living room at 1080p, so it probably won't make any difference; but as far as bandwidth goes, is HDMI just more restrictive than either of the two?

I wish there was a TV that could use displayport. Anybody know of a 55-65" 4k TV that does displayport? Preferably one with a thin bezel.


----------



## xer0h0ur

As far as I know, the best you can do on TVs right now is HDMI 2.0, and I don't believe anyone has put DP into a TV. So 60Hz on a 4K TV is as good as it gets (connected to a PC). HDMI 2.0 carries up to 18Gbps, HDMI 1.4 carries 10.2Gbps, DP 1.2a I believe is 21.6Gbps, and DP 1.3 will be 32.4Gbps.
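To put rough numbers on why 4K/60 needs HDMI 2.0 or DP 1.2 while HDMI 1.4 tops out at 30Hz: an uncompressed 4K/60 feed at 24 bits per pixel needs roughly 14Gbps once blanking intervals are included. Here's a back-of-the-envelope sanity check using the link rates quoted above; the ~20% blanking overhead and the 8b/10b encoding efficiency are ballpark assumptions, not exact per-spec timings:

```python
def required_gbps(width, height, hz, bits_per_pixel=24, blanking_overhead=1.2):
    """Uncompressed video data rate in Gbps, with ~20% added for blanking intervals."""
    return width * height * hz * bits_per_pixel * blanking_overhead / 1e9

# Raw link rates (Gbps) from the post above.
links = {
    "HDMI 1.4": 10.2,
    "HDMI 2.0": 18.0,
    "DP 1.2a": 21.6,
    "DP 1.3": 32.4,
}

need = required_gbps(3840, 2160, 60)  # 4K at 60 Hz
for name, raw in links.items():
    effective = raw * 0.8  # assume 8b/10b encoding (~80% efficiency)
    verdict = "OK" if effective >= need else "too slow"
    print(f"{name}: {effective:.2f} Gbps usable vs {need:.2f} needed -> {verdict}")
```

HDMI 1.4's usable ~8.2Gbps falls well short of the ~14.3Gbps a 4K/60 signal needs, which is exactly the 30Hz wall people keep hitting with the mini-DP-to-HDMI adapters.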


----------



## Dagamus NM

Technology is annoying sometimes. I wouldn't mind using a monitor in place of a TV, as I don't care about the smart-app BS that comes with TVs, but I don't want to give up using a remote to select the desired input. I also want to retain a large enough screen size that I can sit 20' away and still be able to see what I am doing.

It is one thing to split a computer display across several screens, but I have no idea how to make my stupid DirecTV box do that. Probably can't.


----------



## Roaches

Quote:


> Originally Posted by *wermad*
> 
> Possible clashes w/ the PLX chip on your mb and the D13 cards? I've heard of something like this before, but tbh, my board has a PLX and I never ran into issues with one or two cards. I actually tested everything on a Sniper Z87 w/ PLX as well. But it doesn't hurt to check if your mb needs a bios update. Any heavy use prior to the issues? The heat these beasts put out must be insane at full speed. I can barely tolerate the heat of my wc setup w/ the fans at 40%. If the other one is giving you issues, I would wait for the rma return. At least you can test the returned one while the other goes off to rma.


I hope not, and if it is, why did it all of a sudden start crapping out on me when it had been working fine for about two months out of the box?
X79 boards are going the way of the dodo, like X58; used X79 boards are already commanding a price premium on eBay.

Another update: After a clean run with my primary Devil 13 following the format and Windows reinstall on the main system SSD, I decided to put my second card back in. At first it bluescreened during the hardware driver install; after a hard reboot I ran a game and it instantly froze. Rebooted again and now everything runs fine, all games etc. I kinda wanna pull my hair out over this if it turns out the drivers corrupted themselves out of nowhere this whole time.

I'm gonna keep running tests until late night... I just hope I really don't have to send the card in, though honestly I'm not hesitant to do so.


----------



## Mega Man

Quote:


> Originally Posted by *Dagamus NM*
> 
> It did strike me as a little odd that the x2 does not come with a mini display port to display port adapter. I have a bunch from my 7950's and 7970's.
> 
> Anyhow, I am using a Sony TV in my living room at 1080p so it probably won't make any difference but as far as bandwidth goes, is HDMI just more restrictive than either of the two?
> 
> I wish there was a TV that could use displayport. Anybody know of a 55-65" 4k TV that does displayport? Preferably one with a thin bezel.


There is one, but idk which; iirc @tsm106 mentioned it in the 290x thread.


----------



## xer0h0ur

It would be nice to see DP 1.3 make its way into TVs. 120Hz 4K glory ftw.


----------



## rakesh27

I think I've seen a few LGs or Samsungs that have DP 1.2, only a few though; I presume only one or two of the next generation of TVs will have DP 1.3.

It's a shame, since the whole point of these new 4K HDTVs is that they're more versatile.

They should be able to produce all 4K TVs with a DP 1.3 connection; the technology is there. Why not get rid of the VGA connector and replace it with DP 1.3? Hardly anyone would use the VGA connector, since even simple DVI-D to HDMI is 10x better than standard VGA.

Get with the times, Samsung, LG, Panasonic, Philips, Sony etc...

It would be great if they had this along with PIP (picture-in-picture) on every 4K HDTV; PIP allows you to watch another input source, whereas at the moment you can only watch normal digital TV...


----------






## xer0h0ur

Pfffffffffffffffft. Well, that did not last very long. The illuminated RADEON logo I took from the 295X2's shroud already burned out, connected directly to the PSU with a molex to 2mm 2-pin connector cable. Anyone know where I can buy another one? Or does anyone who waterblocked their 295X2 have their logo for sale?


----------



## wermad

Did you try the 3.3v or 5v? I suspected the 12v would have been too much.


----------



## xer0h0ur

Well, a Molex connector is 5V (red), GND (black), GND (black), 12V (yellow), and I didn't pay attention to which pin it was connected to, so I'm pretty sure it was running off 12V. I should have flipped the pin over to 5V. Whoops. Either way, it doesn't matter if I can't get another illuminated logo from someone. I've already lost my patience googling around trying to find them for sale.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well a molex connector is 5V(RED) GND (BLK) GND (BLK) 12V(YELLOW) and I didn't pay attention which it was connected to so I am pretty sure it was connecting through 12V. I should have flipped the pin over to the 5V. Whoops. Either way doesn't matter if I can't get another illuminated logo from someone. I already lost my patience googling around trying to find them for sale.


I couldn't find them either, and they are 12v fyi.


----------



## xer0h0ur

So if its 12V then I just had bad luck with it dying?


----------



## Mega Man

I think so; I tested the card at the output.
Quote:


> Originally Posted by *xer0h0ur*
> 
> So I need some help here. I wanted to put a personal touch on my build so I took the illuminated RADEON logo from the 295X2's shroud, made a cut out on my drive bay cover with a dremel and glued it on. So that part is done and ready:
> 
> 
> 
> My question then lies in powering it. Can I just get a mini 2 pin to molex cable and go directly to the PSU or do I need to connect this another way? I obviously don't just have another one laying around so instead of frying it or something I figured I would differ to someone who has done this.


Quote:


> Originally Posted by *wermad*
> 
> I would say it may be fed via a 12v line and reduced. You can try a 3.3v and if that is no good, 5v would be the most i would try. Someone might now the voltage it gets. IF you fry them, you can probably replace them tbh.


Quote:


> Originally Posted by *Mega Man*
> 
> It is 12v. I have not tried lowering. But pwm down not help. ( ie doesn't seem to be leds.... )


This is from when I did it.


----------



## wermad

Can't you mod a new LED into it?


----------



## Mega Man

I don't think it's an LED.


----------



## Dagamus NM

So the Panasonic AX800 is a 4K display that has DP 1.2; it will do 4K@60Hz. The 65" unit is about $2200. Looks like I found my next TV.

So now the question is: will a single X2 run top titles with optimal frame rates and settings, or should I pick up another one since they are super cheap right now? The XFX unit is under $700.


----------



## tsm106

Quote:


> Originally Posted by *Dagamus NM*
> 
> So the Panasonic AX800 is a 4K display that has DP 1.2; it will do 4K@60Hz. The 65" unit is about $2200. Looks like I found my next TV.
> 
> So now the question is: will a single X2 run top titles with optimal frame rates and settings, or should I pick up another one since they are super cheap right now? The XFX unit is under $700.


The only caveat with the Pannys is how they are constructed. The previous model had a design flaw where you would see lines/banding on the screen with certain backgrounds. They were from manufacturing, so there was no fix. I haven't kept up with this refreshed model, so I dunno if they fixed it. Check it out though, because once you realize it's there, it's impossible to ignore.

http://www.avsforum.com/forum/166-lcd-flat-panel-displays/1524183-2014-panasonic-ax800-ax900-4k-fald-led-lcd-lineup-10.html#post26039034


----------



## blarty

Anyone had any problems with the VRM fan? Playing The Witcher 2 with everything dialled up, after a couple of minutes I hear a clicking noise; I come out of the game and 15-20 seconds later it's gone. Go into something like Valley or 3DMark and soon enough, there it is again.

It's not there all the time while in game, though, so I don't believe it's a wire or something else touching the fan; it only kicks in under load, and even then it's not there 100% of the time.

I'm pretty sure it's the VRM fan, though I have considered it might be my be quiet! PSU as well, only because it was bought at the same time. Has anyone else encountered this issue? I'm considering just popping the shroud and tightening the screws on the VRM fan to see if that makes a difference, but before I do that I thought I might ask if anyone else had any ideas to throw in.

Cheers


----------



## joeh4384

For the people running trifire with a 290x, which card do you put on top or run inputs from? If you run a 290x on top, do you have the option to setup crossfire to use 2 GPUs?


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> For the people running trifire with a 290x, which card do you put on top or run inputs from? If you run a 290x on top, do you have the option to setup crossfire to use 2 GPUs?


I run the 295X2 below the 290X purely because my primary PCI-E slot is the bottom one. I only connect my monitor to the primary slotted card. If I disable crossfire in the CCC it will turn off tri-fire but leave the 295X2 crossfired.


----------



## GavenBylander

I have a question for the community. I just got an R9 295X2 from XFX. Would I gain any benefit from adding a 290X to my rig and putting it in crossfire? Is it even possible? Also, what's a good starting overclock for an out-of-the-box XFX R9 295X2?


----------



## ColeriaX

Been a while since I posted, but here's my new loop with quadfire 295X2s.

Case is frankensteined til I decide on a new one. Blocks are Aquacomputer Vesuvius with active backplates; temps don't break 50c.


----------



## wermad

Quote:


> Originally Posted by *GavenBylander*
> 
> I have a question for the community. I just got a r9 295X2 via XFX I want to know would I gain any benefit to adding a 290x to my rig and putting it in crossfire? Is it even possible? Also whats a good stock overclocking setup for a straight out the box XFX R9 295x2?


Welcome!

-you can crossfire a 290x with a 295x2. If you run triple or more monitors or 4k, it will help. If not, it's a waste imho.

-I've been hearing 1100 is really obtainable. Though make sure you're not hitting the thermal limit of 75c. If you're well under it, proceed w/ minor increments. If you plan to go custom water, a good loop should keep you in the 40s to 50s.

Quote:


> Originally Posted by *ColeriaX*
> 
> Been a while since I posted but here's my new loop with quadfire 295x2s
> 
> 
> 
> Case is frankensteined til I decide on a need one. Blocks are aquacomputer Vesuvius with active backplate temps don't break 50c


Both cards running 8x 2.0? How's your performance?


----------



## xer0h0ur

Quote:


> Originally Posted by *ColeriaX*
> 
> Been a while since I posted but here's my new loop with quadfire 295x2s
> 
> Case is frankensteined til I decide on a need one. Blocks are aquacomputer Vesuvius with active backplate temps don't break 50c


Would you be willing to sell me one or both of your illuminated RADEON logos from the original cooler's shroud? Need dat!


----------



## GavenBylander

Quote:


> Originally Posted by *wermad*
> 
> Welcome!
> 
> -you can crossfire a 290x with a 295x2. If you run triple or more monitors or 4k, it will help. If not, it's a waste imho.
> 
> -I've been hearing 1100 is really obtainable. Though make sure you're not hitting the thermal limit of 75c. If you're well under it, proceed w/ minor increments. If you plan to go custom water, a good loop should keep you in the 40s to 50s.


Thanks! I'll give it a shot; my temps are very low right now. I'll let you know how it goes.


----------



## Dagamus NM

Quote:


> Originally Posted by *ColeriaX*
> 
> Been a while since I posted but here's my new loop with quadfire 295x2s
> 
> Case is frankensteined til I decide on a need one. Blocks are aquacomputer Vesuvius with active backplate temps don't break 50c


What is going on with the tube from your pump to the block? It looks almost flat going into the block.

Anyhow, couldn't help myself. Ordered a second xfx x2 today.


----------



## ColeriaX

Quote:


> Originally Posted by *Dagamus NM*
> 
> What is going on with the tube from your pump to the block? It looks almost flat going into the block.
> 
> Anyhow, couldn't help myself. Ordered a second xfx x2 today.


Just the way the photo looks. I know the tubing and fittings look bad, but this is all getting transplanted when I upgrade to Skylake (or whatever is faster) and buy my new case. Anyone have any advice for compression fittings not going over the tubing to tighten down? Fittings are 1/2" ID 5/8" OD and tubing is 7/16" ID 5/8" OD Primochill LRT. I tried hot water and hulking it onto the threads but it just won't go any further, so I zip-tied it for now. Could the tube wall be too thick on the Primochill tubing? Fittings are all Bitspower.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Would you be willing to sell me one or both of your illuminated RADEON logos from the original cooler's shroud? Need dat!


Sorry bud, I'm keeping them for when I sell 'em :/
Quote:


> Originally Posted by *wermad*
> 
> Welcome!
> 
> -you can crossfire a 290x with a 295x2. If you run triple or more monitors or 4k, it will help. If not, it's a waste imho.
> 
> -I've been hearing 1100 is really obtainable. Though make sure you're not hitting the thermal limit of 75c. If you're well under it, proceed w/ minor increments. If you plan to go custom water, a good loop should keep you in the 40s to 50s.
> Both cards running 8x 2.0? How's your performance?


Haven't found anything I can't run at 60fps at 4K yet. Monitor is a UD590D @ 72Hz.


----------



## ColeriaX

Btw, for anyone thinking of watercooling theirs with aftermarket blocks: the Aquacomputer blocks with active backplates are fantastic. I bought some Fujipoly 17 W/mK and used it on the VRMs. With active cooling I can now run much higher daily frequencies with more mV while still being stable. Very worth it IMO.


----------



## xer0h0ur

Quote:


> Originally Posted by *ColeriaX*
> 
> Just the way the photo looks. I know the tubing and fittings look bad but this is just going to be transplanted when i upgrade to skylake or whatever is faster whe ln i buy my new case. Anyone have any advice for the compression fittings not going over the tubing to tighten down? Fittings are 1/2 id 5/8 OD and tubing is 7/16 id 5/8 od primochill lrt. I tried hot water and hulking it onto the threads but it just wont go any further so i ziptied it for now. Could the tube wall be to thick on the primochill tubing? Fittings are all bitspower.
> Sorry bud im keeping them for when i sell em :/.
> Havent found something i cant run at 60 fps 4k yet. Monitor is UD590D @72hz


Dang, well. Had to ask.


----------



## Dagamus NM

Yes, it is the thick wall of the Primochill LRT. My Bitspower fittings were tough, so I started using the EK compressions (not the cheapest variant but the next one up) and they work pretty well with it. At least I can get them down to the threads and mostly screwed down. I just got a set of rubber-jawed plumbing pliers and will use those to finish the job without separating my fingernails from my skin.

What is the story with the sticker to the right of your computer?


----------



## gopheer

Quote:


> So I just put in my XFX 295X2 and the VRM fan is really f'ing loud. Is there any reason why, or are they all this loud? I did install Noctua 2000rpm fans and the temps are not too bad.


Same issue for me, and I replaced it with the same fan as you. Did you find a solution? GPU-Z says the VRM fan is spinning at 2300rpm; is that normal?

While playing GTA V for a bit at 1440p, frames are pretty horrible.


----------



## ColeriaX

Quote:


> Originally Posted by *Dagamus NM*
> 
> Yes it is the thick wall of the primochill lrt. My bits power fittings were tough so I started using the ek compressions, not the cheapest variant but the next one up and they work pretty well with it. At least I can get them down to the threads and mostly screwed down. I just got a set of rubber jawed plumbing pliers and will use those to finish the job without separating my fingernails from my skin.
> 
> What is the story with the sticker to the right of your computer?


If you knew my wife you'd understand, let's just say that. So what's another brand of tubing that's plasticizer-free and will work with the Bitspower fittings?


----------



## fat4l

Here is a screenshot of temps with the Aquacomputer Kryographics Vesuvius + active backplate:

30 mins of FurMark v1.15.
Max temp *40C* on both cores.
MSI 295X2 with Sapphire OC BIOS (1030/5200MHz).
Water temp 25C (ΔT=3C).
CPU was at 5GHz, 1.38v.


----------



## xer0h0ur

Christ on a cracker. What radiators are you using to get your GPUs that cool?


----------



## electro2u

MoRAs


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Christ on a cracker. What radiators are you using to get your GPUs that cool?


Mora 3 420, 8x 230mm Bitfenix Spectre Pro push/pull.
2x EK Coolstream PE 240mm, Corsair SP120 Performance, push/pull.

Has anyone found a way to add more voltage (more than +100mV) to the card with MSI Afterburner? Modify Afterburner or something?


----------



## xer0h0ur

I vaguely remember someone doing so using both Afterburner and Trixx was it?


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> I vaguely remember someone doing so using both Afterburner and Trixx was it?


Yes, I did it.

But I'm looking for a more convenient way, just using one program.


----------



## Alex132

Aw yea!

Also, even with ULPS disabled I can only monitor 1 GPU in HWMonitor.

Need to get a new PSU too; this one doesn't fit.

edit - I can monitor 2 GPUs in MSI AB though.

Oh, and good info on this PSU: it's 2 rails, each 750w, but only 1 rail can feed the GPU. So 750w for all the GPUs, what crap from a 1350w PSU.

Gonna get a new one anyway, since this doesn't fit. Looking at the EVGA G2 1300. Sexy and powerful.


----------



## Alex132

wth, okay, so since I got a new GPU and PSU my G700S will not work in wireless mode.

Like... it just freezes up and goes stutter-crazy in wireless mode, but in wired mode it's fine...

What do?

Plans are to test with the same drivers and the 7990 tomorrow, then ???

edit - trying to see if it was the aluminum radiator causing issues somehow, since it basically covers the receiver. All seems good so far...


----------



## Feyris

If someone cared to message me. Could help.


----------



## wermad

I had some whacky issues w/ my Razer keyboard and mouse. Stopping the software restored them, but I lost any special features/functions. In the end, when I was ready to toss them out the door, it turned out to be some stupid macro auto-setting-and-saving feature causing the issues. I got rid of those and all was fine. So first start w/ disabling or turning off the factory software; try a restart, let Windows install the basic mouse driver, and see what it does then. Make sure you're on the latest driver/software for your mouse.

If you still have no luck, and to rule out any gpu correlation, remove the amd drivers and sort out the mouse first. I doubt it, but if things changed after you put in your 295x2, it's worth a shot.


----------



## Alex132

Okay, so I got the fans working off of a mobo header; the only problem is they seem to go to 100% when the temp hits 50'c no matter what.

Here are my settings:




The fans are Cougar Vortex.
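For what it's worth, most board fan controllers do two separate things: interpolate between the curve points you set, and slam to 100% at a "full speed" temperature threshold that is configured separately from the curve. If that threshold is sitting at 50C, the curve below it is irrelevant once you hit 50. A hypothetical sketch of that logic (the curve points and threshold here are made up for illustration, not taken from any specific board's firmware):

```python
def fan_duty(temp_c, points, full_speed_temp=50):
    """points: sorted list of (temperature C, duty %) pairs."""
    if temp_c >= full_speed_temp:
        return 100  # full-speed threshold overrides the curve entirely
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between neighboring curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

curve = [(30, 30), (45, 60), (60, 80)]  # made-up example curve
print(fan_duty(40, curve))  # 50.0 — interpolated between 30% and 60%
print(fan_duty(50, curve))  # 100 — pinned by the threshold, curve ignored
```

If your BIOS or fan software exposes that full-speed threshold separately (often labeled something like "temperature limit" or "critical temperature"), raising it above 50C is what should stop the fans pinning, not reshaping the curve.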


----------



## y4h3ll

Quote:


> Originally Posted by *y4h3ll*
> 
> Here is what happens when I apply full screen.
> if you look at the top right you will notice it isn't in full screen.
> 
> Also here is a pic of what my gpu are doing "r9-295x2"
> Xfire isn't working I have icons enabled aswell. New beta driver everything.


The R9 295X2 runs fine. Just disable MSI Afterburner, TeamViewer, Fraps, and anything else you have for an overlay. Wait for the game to boot fully to where you can play, alt-tab, then reopen Fraps and MSI AB. Done, fixed. If not, try Alt+Enter, then re-enable the programs.

**** I'm stupid. Hahahaha, that's my post; maybe I should read first, bahahahha. Well, if anyone else didn't figure it out, I did, so there you go.


----------



## Alex132

This graphics card seems to have a ton of problems.

I am getting loads of slow-downs in games. I can't measure it (will try now) but it feels VERY stuttery. GPU usage keeps shifting from 100%/0% to 50%/50%. Game is Diablo 3.

edit - now all of a sudden it's fine and not trying to use both GPUs in D3 (no crossfire/sli support).









GTA V had similar issues, but I never checked if it was the same cause.

edit 2 - it seems to happen if the 2nd GPU is in use at all. Watching YT videos while playing D3 yields drops too, though much less frequently and noticeably.

Oh, and temps never get beyond 50'c, thanks to my borked fan profile where the fans go HAM at 50'c even though I tell them not to.


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> This graphics card seems to be a ton of problems.
> 
> I am getting loads of slow-downs in games. I can't measure it (will try now) but it feels VERY stuttery in games. GPU usage keeps shifting from 100% / 0% to 50% / 50%. Game is Diablo 3.
> 
> edit - now all of a sudden its fine and not trying to use both GPUs in D3 (no crossfire / sli support).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTA V had similar issues, but I never saw if it was the same cause.
> 
> edit 2 - it seems to be if the 2nd GPU is in use at all. Watching YT videos while playing D3 yields drops too, however it is much less frequent and noticeable.
> 
> Oh and temps never get beyond 50'c thanks to my borked fan profile where my fans seem to go HAM at 50'c even though I tell them not to


Ugh, simple. If the game was never designed for SLI or Crossfire, create an application profile for that game and manually set Crossfire to disabled. You shouldn't get that problem ever again; that is, until you change the driver, at which point you need to make the profile again.


----------



## wermad

I've been perfect since I got mine. I luv em to death.

Test using games that use both cores (and avoid 1080p, as I've seen that's just asking for more trouble). If you have any concerns, at least these games can tell you if the card is working to its max. I typically launch Metro LL, Crysis 3, or BF3/BF4.


----------



## seblura

Can anyone with an EKWB block mounted to the card measure how tall the card is where the ports are?
Wanna try to fit it in a Corsair 240 Air.

Thanks in advance
Cheers


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> This graphics card seems to be a ton of problems.
> 
> I am getting loads of slow-downs in games. I can't measure it (will try now) but it feels VERY stuttery in games. GPU usage keeps shifting from 100% / 0% to 50% / 50%. Game is Diablo 3.
> 
> edit - now all of a sudden its fine and not trying to use both GPUs in D3 (no crossfire / sli support).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTA V had similar issues, but I never saw if it was the same cause.
> 
> edit 2 - it seems to be if the 2nd GPU is in use at all. Watching YT videos while playing D3 yields drops too, however it is much less frequent and noticeable.
> 
> Oh and temps never get beyond 50'c thanks to my borked fan profile where my fans seem to go HAM at 50'c even though I tell them not to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ugh, simple. If the game was never designed to be SLIed or Crossfired then create an application profile for that game and manually set Crossfire to disabled. You shouldn't get that problem ever again. That is until you change the driver then you need to make the profile again.
Click to expand...

I am running windowed mode; I thought Crossfire couldn't work in windowed mode. Also, if it isn't designed to work, surely it just won't use it? Nvidia handled this perfectly fine...


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> I've been perfect since I got mine. I luv em to death.
> 
> Test using games that use both cores (and avoid 1080p, as I've seen that's just asking for more trouble). If you have any concerns, at least these games can tell you if the card is working to its max. I typically launch Metro LL, Crysis 3, or BF3/BF4.


Tried GTA V and it was stuttery too, gonna wait for new PSU before judging anything.


----------



## wermad

GTA V I'm avoiding like the plague. I've heard lots of moans and groans, so I'll just wait until the bugs are worked out. Look for a slightly older game that puts a healthy load on the cores.


----------



## joeh4384

Quote:


> Originally Posted by *wermad*
> 
> GTA V I'm avoiding like the plague. I've heard lots of moans and groans, so I'll just wait until the bugs are worked out. Look for a slightly older game that puts a healthy load on the cores.


GTA V works great for me on my 295x2 at 1440p. Pretty smooth with both GPUs being utilized nearly fully.


----------



## vonalka

Quote:


> Originally Posted by *wermad*
> 
> GTA V I'm avoiding like the plague. I've heard lots of moans and groans, so I'll just wait until the bugs are worked out. Look for a slightly older game that puts a healthy load on the cores.


There have been quite a few bugs with GTA V PC, but it seems like there have been solutions to all of them (at least the ones I have experienced).

I agree with your point about trying the older games that put a good load on the GPU - Metro Last light was good for that, and of course BF4 as you mentioned.


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> I am running Windowed mode, thought Crossfire couldn't work in windowed mode. Also if it isn't designed to work, surely then it just won't use it? Nvidia did this perfectly fine...


The point is that you're seeing it jump from one GPU to the other, and that's causing your problems. Manually disabling it altogether should not allow that to happen.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> I am running Windowed mode, thought Crossfire couldn't work in windowed mode. Also if it isn't designed to work, surely then it just won't use it? Nvidia did this perfectly fine...
> 
> 
> 
> The point is that you're seeing it jump from one GPU to the other, and that's causing your problems. Manually disabling it altogether should not allow that to happen.
Click to expand...

It seems to be okay so far for D3; I haven't tested other games yet (ones that will use both GPUs).

Also, any idea why I don't have the performance/Overdrive tab in CCC? I did an express install (then removed the Raptr PoS) and there's no performance tab.


----------



## wermad

tempting to get Project cars though. Anyone running it yet?

Good to know GTA V is being ironed out. I'll keep it on my radar for now


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> It seems to be okay so far for D3, haven't tested out other games though. (games that will use both).
> 
> And any idea why I don't have the performance tab / overdrive in CCC? I did an express install (then remove Raptr PoS) and no performance tab?


I have actually had that happen to me multiple times. The vast majority of the time I just restart the PC and it's back. Once or twice, though, I had to DDU the installation and re-install Catalyst to finally get the Overdrive tab back.


----------



## Alex132

I've done both of those, well I didn't use DDU. I'll do that tomorrow.

Is there anything I'd need in there anyway?


----------



## xer0h0ur

Not at all, just download the latest DDU and run it. It will prompt you to restart so it can do its job in safe mode. The only thing I'd say is to make sure AMD is selected in the drop-down box at the top; a few older versions of DDU have not automatically selected AMD for you, so double-check that.


----------



## PCModderMike

Quote:


> Originally Posted by *vonalka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> GTA V, I'm avoiding like a plague. I've heard lots of moans and groans to just wait until the bugs are worked out. Look for a slightly older game that does put a healthy load on the cores.
> 
> 
> 
> There have been quite a few bugs with GTA V PC, but it seems like there have been solutions to all of them (at least the ones I have experienced).
> 
> I agree with your point about trying the older games that put a good load on the GPU - Metro Last light was good for that, and of course BF4 as you mentioned.
Click to expand...

This. GTA V seemed to be optimized quickly and it ran well for me.

Quote:


> Originally Posted by *wermad*
> 
> tempting to get Project cars though. Anyone running it yet?
> 
> Good to know GTA V is being ironed out. I'll keep it on my radar for now


I would love to pick up Project Cars eventually. Visually it looks like a great game.


----------



## glenn37216

Selling my 295x2... got a good deal on 2 Titans.
If anyone is interested, PM me. If it's wrong to post this here, mods please delete this post.

Used (6 months old) XFX Core Edition 295x2 for sale: $570.00 firm, via PayPal. Lower US 48 only, please. Firestrike benchmark: 17305
Validation: http://www.3dmark.com/fs/4743494
Comes with all original accessories, plus 2 Corsair SP120 fans for a push/pull config.


----------



## xer0h0ur

Nope, you shouldn't have any problems posting that you're selling it here.


----------



## joztdarb

Quote:


> Originally Posted by *seblura*
> 
> Can anyone with an EKWB block fitted to the card measure how tall the card is where the ports are?
> I want to try to fit it in a Corsair Air 240.
> 
> Thanks in advance
> Cheers


Hey @seblura

The 295X2 is 300mm long and 20mm thick with an EKWB fitted. The ports are about 65mm and 95mm in.


----------



## seblura

Quote:


> Originally Posted by *joztdarb*
> 
> Hey @seblura
> 
> The 295X2 is 300mm long and 20mm thick with an EKWB fitted. The ports are about 65mm and 95mm in.


Cheers for measuring.

What I still don't get: how tall is the card from the top of the port bracket to the bottom of the card?

Cheers.


----------



## joztdarb

Quote:


> Originally Posted by *seblura*
> 
> Cheers for measuring.
> 
> Dont understand how tall is the card from the top of the port bracket to the bottom og the card?
> 
> Cheers.


Ah do you mean including the EK-FC Terminal ?



It's 125mm total (not including the actual PCI bracket); the terminal adds 25mm.


----------



## seblura

Quote:


> Originally Posted by *joztdarb*
> 
> Ah do you mean including the EK-FC Terminal ?
> 
> 
> 
> Its 125mm total (not including the actual PCI slot), the terminal adds 25mm


Cheers man, that was the measurement I was looking for.








Thanks a lot, champ


----------



## Roaches

It's been over 2 days since my package arrived at PowerColor's RMA center in City of Industry. I wonder what's taking them so long to get back to my emails. This is my first time RMAing with them, so I wonder if it's like this for other people who own their cards and had problems. :/


----------



## xer0h0ur

These are large corporations we're talking about here. I presume that, like at most other businesses, packages are received, unpacked, and processed in the order they come in. If you want to expedite it, call them to inquire about your RMA and mention that they received the package X days ago.


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> These are large corporations were talking about here. I presume that like most other businesses packages are received, unpacked and processed in the order they come in. If you want to expedite it then you can call them to inquire about your RMA and mention they got the package X days ago.


I figured they're as big as any other GPU AIB. I'm willing to give them a week before I call. EVGA, in comparison, was surprisingly fast to respond and ship out replacements the few times I've had to return cards.

I already miss my quadfire Eyefinity experience.


----------



## xer0h0ur

Yeah I can imagine, I would be missing one of those bad boys too.


----------



## electro2u

Quote:


> Originally Posted by *Roaches*
> 
> Its been over 2 days since my package arrived at PowerColor's RMA center at City of Industries. I wonder whats taking them so long to get back to my emails. First time RMAing with them so I wonder if its like this with other people that own their cards and had problems. :/


I've done RMA with PowerColor previously and they took care of me after a week or so. The normal thing is 10-15 *business days* turn around for RMAs.

MSI actually states exactly that. My 295x2 has been at MSI for 2 weeks. If I don't hear anything by the end of the 3rd week, then I'll be pushing for some answers.


----------



## TooManyAlpacas

I currently have an R9 295x2 with a 1000W PSU and an Intel i7-4790K. I was wondering if anyone knows whether I could add an R9 290X for trifire without exceeding my PSU's wattage limit.


----------



## xer0h0ur

Every Firestrike run draws more than 1000 watts for me. I wouldn't do it.


----------



## Dagamus NM

Quote:


> Originally Posted by *glenn37216*
> 
> Selling my 295x2... Got a good deal on 2 titans..
> If anyone is interested.. Pm me. If this is wrong to post this here, please mods delete this post.
> 
> Used (6 months old )XFX core edition 295x2 for sale . $570.00 firm.via paypal -Lower US 48 only please. Firestrike Benchmark - 17305
> Validation: http://www.3dmark.com/fs/4743494
> Comes with all original accessories, 2- sp120 corsair fans for push/pull config


Just a suggestion, you might want to visit tiger direct and newegg to get an idea of what the XFX card is selling for new. How firm are you on your price?


----------



## DividebyZERO

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> I currently have an R9 295x2 with a 1000W PSU and an Intel i7-4790K. I was wondering if anyone knows whether I could add an R9 290X for trifire without exceeding my PSU's wattage limit.


You will exceed 1kW in trifire for sure. Even at stock, you will push your PSU to OCP or an early grave.
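As a sanity check, here's a rough power budget using the ballpark figures quoted elsewhere in this thread (~500W for a 295X2 at stock, ~300W for a single 290X, ~250W for CPU/board/fans/pump); all of these are estimates from posts here, not measured specs:

```python
# Back-of-envelope PSU budget for adding a 290X to a 295X2 rig.
# All wattages are rough estimates taken from this thread, not specs.
GPU_295X2_W = 500   # dual-Hawaii card at stock, under gaming load
GPU_290X_W = 300    # single 290X at stock
SYSTEM_W = 250      # CPU, motherboard, fans, pump, drives

def estimated_draw(with_290x: bool) -> int:
    """Estimate peak system draw in watts."""
    total = GPU_295X2_W + SYSTEM_W
    if with_290x:
        total += GPU_290X_W
    return total

print(estimated_draw(False))  # ~750 W: fine on a 1000 W PSU
print(estimated_draw(True))   # ~1050 W: over the 1000 W limit
```

Even with these conservative stock numbers the trifire total lands above 1000W, before any overclocking headroom.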


----------



## TooManyAlpacas

Thanks for the help. I will hold off on buying it until I get a higher-wattage PSU.


----------



## glenn37216

Quote:


> Originally Posted by *Dagamus NM*
> 
> Just a suggestion, you might want to visit tiger direct and newegg to get an idea of what the XFX card is selling for new. How firm are you on your price?


Yup.. done that. Appreciate the suggestion.







Just didn't post it because the prices are fluctuating on this card right now.

Cards current sale prices...
Card + CorsairSP Performance fans
Tigerdirect - $659.00 shipped
NewEgg (Free 2 Day Shipping) $663.98
Ebay - Cheapest Used Sale as of 5/22/15- $560.00 (Stock fan)

-Sale is pending [email protected] [H]ard Forum ... If it doesn't sell I'll post back if someone else is interested.


----------



## kayan

Did anybody grab the Witcher 3? If so, what does performance look like at 3440x1440 everything ultra with nVidia Hairworks off? I'm sitting at 30-33fps in the first area after the tutorial.


----------



## DividebyZERO

Quote:


> Originally Posted by *kayan*
> 
> Did anybody grab the Witcher 3? If so, what does performance look like at 3440x1440 everything ultra with nVidia Hairworks off? I'm sitting at 30-33fps in the first area after the tutorial.


I don't have it; I'm going to get it when it gets Crossfire support, which I hope is soon.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> Did anybody grab the Witcher 3? If so, what does performance look like at 3440x1440 everything ultra with nVidia Hairworks off? I'm sitting at 30-33fps in the first area after the tutorial.


My Witcher 3 settings:
No HairWorks, no ambient occlusion, High settings, medium grass density, and OpenGL triple buffering set to Off in CCC, on one core at 1150/1500 = a solid 60-75fps so far.

That's at 2560 x 1440 though; with Crossfire support I imagine I'll be hitting the 90-100fps mark.


----------



## kayan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> My Witcher 3 settings:
> No hairworks, No ambient occlusion, High Settings, Medium Grass Density and OpenGL Triple Buffering set to Off in CCC, on one core at 1150/1500 = 60-75fps solid so far
> 
> Thats at 2560 x 1440 though, with Crossfire support i imagine i'll be hitting the 90-100fps mark.


Hmm, honestly it's smooth and not jumpy. I was surprised when I reinstalled FRAPS and saw only around 30; if I had guessed, I would have said 40-45.

I haven't had any luck overclocking my card at all, so this is at stock. The card isn't thermal throttling either; bad luck I guess.


----------



## rakesh27

Quote: "I currently have an R9 295x2 with a 1000W PSU and an Intel i7-4790K. I was wondering if anyone knows whether I could add an R9 290X for trifire without exceeding my PSU's wattage limit."

I suggest you Google "PSU calculator"; it will give you a rough estimate of what you need.

Really, get the Corsair AX1500i. You'd be good to go for years to come, and you could run SLI/dual-fire or even trifire... maybe quad at a push...

Check my specs: I'm running trifire with an AX1500i, no problem...


----------



## Coppy

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> I currently have an R9 295x2 with a 1000W PSU and an Intel i7-4790K. I was wondering if anyone knows whether I could add an R9 290X for trifire without exceeding my PSU's wattage limit.


I'm running trifire (Sapphire 295x2 plus a PowerColor 290X LCS) with a Corsair HX1050 (+ 4790K, 17 fans, one pump) and I've never had problems!
Cards are running stock speeds. The 4790K is overclocked @ 4.6GHz, 1.25V.
The maximum power consumption I've ever seen (Firestrike) was ~950W.

BUT....

I still want to upgrade my PSU, because there are more things to come


----------



## Sgt Bilko

Quote:


> Originally Posted by *rakesh27*
> 
> Quote "I currently have a R9 295x2 with a 1000w PSU and intel i7-4790k I was wondering if anyone knows if I could put a R9 290x in and do trifire without exceeding my PSU wattage limit"
> 
> i suggest you google PSU calculator, it will give you a rough estimate of what you need.
> 
> Really get the Corsair AXi1500, you would be good to go for years to come, and you can sli/dual fire or even trifire... maybe quad at a push...
> 
> Check my specs im runnig trifire with a AXi1500, not a problem...


I've done quadfire at stock speeds on my AX1200i, but I wouldn't recommend it.

For Crossfire = 850w-1000w,
Trifire = 1200w
Quad = 1300w+

That gives you room in each setup to allow for overclocking.

Just my 2c
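Those rules of thumb can be jotted down as a tiny helper. The wattages below are just this post's recommendations (with overclocking headroom baked in), not official AMD requirements, and the function name is purely illustrative:

```python
# Rule-of-thumb PSU sizing for 295X2 setups, per the recommendations
# in this thread (with OC headroom), not official AMD figures.
RECOMMENDED_PSU_W = {
    1: 1000,  # one 295X2 (Crossfire on-card): 850-1000 W
    2: 1300,  # two 295X2s (quadfire): 1300 W+
}
TRIFIRE_PSU_W = 1200  # one 295X2 plus a single 290X

def recommended_psu(num_295x2: int, extra_290x: bool = False) -> int:
    """Return the suggested minimum PSU wattage for a given setup."""
    if num_295x2 == 1 and extra_290x:
        return TRIFIRE_PSU_W
    return RECOMMENDED_PSU_W[num_295x2]

print(recommended_psu(1))        # 1000
print(recommended_psu(1, True))  # 1200
print(recommended_psu(2))        # 1300
```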


----------



## Alex132

How much does the 295X2 draw (by itself) at stock / overclocked, anyway?

My 2500K draws a surprisingly large amount under load. I wouldn't be surprised if it were around 200w.


----------



## wermad

I saw around 450w-500w per card @ stock. My system (CPU, MB, OC, 32x fans, pump, etc.) is ~200-250w. Hawaii is known to get a bit drunk once you dial up the clocks. I've yet to OC mine, but once I finish my next mod/upgrade I'll see if I can hit 1100.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> How much could/does the 295X2 draw (by itself) when stock / overclocked anyway?
> 
> My 2500k draws a large amount of load, surprisingly. I wouldn't be surprised if it were around 200w.


At stock I'd work off 500w or so, but it can peak at 550-600w for a couple of seconds.

I've got a Kill-A-Watt here, but I haven't used it since my entire PC is plugged into a power board, and AFAIK that could throw the readings off?


----------



## Intelligents

Just ordered a 295x2 today on the cheap from the egg. I'm stoked to get started with this beast!


----------



## Alex132

Folding on the R9 295X2 with a very old and abused 1350w TP1350 with '748w' on the 12v rail. I wonder how much of the PSU I'm using.


----------



## Alex132

Quote:


> Originally Posted by *Intelligents*
> 
> Just ordered a 295x2 today on the cheap from the egg. I'm stoked to get started with this beast!


Nice









My favourite graphics card by far. Loading at 55-56°C while FOLDING on this GPU with stock cooling, and it being rather quiet (not silent, just not awfully loud), still leaves me a bit in awe.

Software is, yeah, well, eh

edit - and the VRMs really do not get hot? I don't know what the reviewers were complaining about. I'd say they were in the 60-70s range. Much cooler than my GTX 690.


----------



## Intelligents

Nice! This card can't be any louder than my CF'd 7950s when they're begging for mercy. I'm sure this card will be just fine


----------



## Dagamus NM

Woot woot, just got my second x2 from UPS. Quad fire mine sweeper here I come.


----------



## Alex132

Quote:


> Originally Posted by *Intelligents*
> 
> Nice! This card can't be any louder than my CF'd 7950s when they're begging for mercy. I'm sure this card will be just fine


I highly recommend using custom fans on it that can start at ~5v.

I am using 2 Cougar Vortex fans with a fan profile set via SpeedFan, controlled by my motherboard.


----------



## Intelligents

That's my plan


----------



## xer0h0ur

Quote:


> Originally Posted by *Dagamus NM*
> 
> Woot woot, just got my second x2 from UPS. Quad fire mine sweeper here I come.


10,000 FPS solid in Solitaire. LETS GO!


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> Woot woot, just got my second x2 from UPS. Quad fire mine sweeper here I come.
> 
> 
> 
> 10,000 FPS solid in Solitaire. LETS GO!
Click to expand...

nah....only 2500fps, no crossfire support


----------



## wermad

i'm over 9000!









I'm probably gonna go with a standard or reverse ATX setup. I'll see how badly these guys droop/sag. I may pick up the CM VGA bracket.


----------



## Alex132

Never had a sagging issue with any GPUs


----------



## BradleyW

Quote:


> Originally Posted by *xer0h0ur*
> 
> 10,000 FPS solid in Solitaire. LETS GO!


Lock it to 30 for a more cinematic experience.


----------



## xer0h0ur

Quote:


> Originally Posted by *BradleyW*
> 
> Lock it to 30 for a more cinematic experience.


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> Never had sagging issue with any GPUs


I'm on blocks, so more weight. Just the block (no water, no fittings, no adapters, no PCB) is 1kg.....

FYI, the HAF X and 690 II accessory kits have the bracket (not sold separately anymore), ~$15 + s/h.

I may take off the mesh if I don't like it


----------



## Dagamus NM

Well, I cannot say that I didn't see this coming. My living room is clearly much, much warmer. My UPS is rated for 1750W, but damn is the fan noisy on that thing at load. I need to figure out how to watercool my UPS.

Without tweaking anything I started up Dying Light for the first time ever and it was terrible. I tried all of the game settings and it was awful. I imagine there will be a decent profile for it eventually. I will have to play a game I am used to and see how it goes.

I am using a Rosewill 1300W PSU and I think that is not enough. I will move to one of my 1600W units and report back.

So, after watching a couple of attempts at RealBench fail at the beginning of the multitask portion, I know that I am power limited. Everything in the bench is memory-, CPU-, or GPU-intensive (or a combination of two/three); those three tests all scored about 115K, but when it gets to the multitask portion it just stops. No driver crash, no computer crash; it just says the mouse has moved when it has not.

So yeah, Lepa 1600W going in now. Everything is stock for the two X2s right now. I am kind of nervous about pushing them hard with the puny water system. I should have some Aquacomputer Vesuvius units in the next couple of weeks, and I will tear loose after that. If at stock they are pulling over 500W each, then the CPU (3930K) is pulling over 300W. It is running at 4.2GHz, so it is likely these GPUs run crazy high power.


----------



## DividebyZERO

Quote:


> Originally Posted by *Dagamus NM*
> 
> Well, I cannot say that I didn't see this coming. My living room is clearly much, much warmer. My UPS is rated for 1750W, but damn is the fan noisy on that thing at load. I need to figure out how to watercool my UPS.
> 
> Without tweaking anything I started up Dying Light for the first time ever and it was terrible. I tried all of the game settings and it was awful. I imagine there will be a decent profile for it eventually. I will have to play a game I am used to and see how it goes.
> 
> I am using a Rosewill 1300W PSU and I think that is not enough. I will move to one of my 1600W units and report back.


Dying Light needs the 15.4 beta for Crossfire, I think?


----------



## Dagamus NM

Thank you much. I have been playing Borderlands 2 with Crossfire on and it is perfect. I will give that beta a shot.


----------



## atnadeb

Hey guys, has anyone been able to get Crossfire working with Steam In-Home Streaming? I would love to use the full power of my 295x2, but no games I have tested use Crossfire while streaming. Any ideas?


----------



## xer0h0ur

Honestly I have no experience with streaming or else I would lend a hand.


----------



## F4ze0ne

Quote:


> Originally Posted by *Dagamus NM*
> 
> Without tweaking anything I started up dying light for the first time ever and it was terrible. I tried all of the game settings and it was awful. I imagine that there wild be a decent profile for it.


Try optimize 1x1 instead of the profile. I find this to work better from my experience.


----------



## xer0h0ur

I am playing Dying Light tri-fired right now using the 15.4.1 beta and I get good performance. The only thing that bothers me is the light from the sky. It's as if it flickers, mimicking the blinding effect of your eyes adjusting to light, then stops. I don't know if it's intentional or a bug.


----------



## Dagamus NM

Are you talking about in game or CCC?


----------



## The EX1

I am interested in the Devil 13. Ya ya, I know it isn't practical and blah blah, but that card is just so unique and has always captured my attention. I saw some open-box specials on Newegg for cheap, but they sold out quickly. Anyone have an idea where I can pick one up cheap for a quadfire setup? (No luck finding a used one anywhere.) Keeping two of them cool is going to be a huge task, but my TJ11 with the dual 180mm fans blowing on that 90-degree MB has worked wonders for air cooling.


----------



## F4ze0ne

Quote:


> Originally Posted by *Dagamus NM*
> 
> Are you talking about in game or CCC?


In CCC.


----------



## Roaches

Quote:


> Originally Posted by *The EX1*
> 
> I am interested in the Devil 13. Ya ya I know it isn't practical and blah blah but that card is just so unique and has always captured my attention. I saw some open box specials on newegg for cheap but they sold out quick. Anyone have an idea where I can pick one up cheap for a quad fire setup? (no luck trying to find a used one anywhere). Keeping two of them cool is going to be a huge task, but my TJ11 with the dual 180mm fans blowing on that 90 degree MB has worked wonders for air cooling.


Replied to your PM.


----------



## SAFX

How common is this configuration at the front of the case, blowing IN air?

Link to video:


----------



## Feyris

Quote:


> Originally Posted by *SAFX*
> 
> How common is this configuration at the front of the case, blowing IN air?
> 
> Link to video:


If it is level with or above the GPU, fine. Below it is a big no-no. I've had to do it and temps were still fine, so if you must, go for it.


----------



## SAFX

Ok, thank you.

Btw, I'm looking for a quality mini-DP/DP VESA-compliant cable for this card. After some quick research I zeroed in on this one; it appears to be the best with the fewest issues. Can you confirm?
http://amzn.com/B00A7R9I22


----------



## Dagamus NM

Yep, in that configuration you will be dumping 500W into your case and straight up into your VRMs. Better to flip that fan, and the one in front of it too if your case has one; exhausting out the front won't hurt, since you already have those two below as intake.

Alright, changed power supplies and gave ROG RealBench another run. It finished the process, but only netted me 92,126. My best with quadfire 6GB 280Xs was 99,224. Interesting; I should be able to get over 100K with my 3930K. I expect more out of my 5960Xs whenever my cases come in from CaseLabs. Scheduled for June 5th, so over halfway there.


----------



## Orivaa

Quote:


> Originally Posted by *Dagamus NM*
> 
> Yep, in that configuration you will be dumping 500W into your case and straight up into your VRMs. Better to flip that fan and if you have one in your case in front of that too. Exhaust out the front won't be as you have those two below as intake already.
> 
> Alright, changed power supply and gave it another run on rog real bench. It finished the process. It only netted me a 92126. My best with quadfire 6GB 280x's was 99224. Interesting, I should be able to get over 100K with my 3930K. I will expect more out of my 5960X's whenever my cases come in from caselabs. Scheduled for June 5th so over halfway there.


Ha, I'm also waiting for a CaseLabs case, also scheduled for June 5th. Which one are you getting?


----------



## Mega Man

D: kinda mad i never saw this https://www.visiontek.com/graphics-cards/liquid-cooled-series/visiontek-cryovenom-r9-295x2-detail.html


----------



## wermad

Is EK still selling that block?

So, my current setup is just too hot. I'm hitting the low sixties under load and idling in the low 40s. Even though it's been a bit chilly in SoCal, my temps are not the same. Also, my Lepa is churning its fan non-stop, as it's obvious the entire cooling system isn't optimized with my current layout. I think I pushed the case's design past its water-cooling thermal capability. I got a surprise offer for a different case and hope to take delivery next week. The ATX layout is something I'll solve with either a CM VGA bracket or something DIY.


----------



## xer0h0ur

I remember listing that card hundreds of pages ago, when it was still for sale and the 295X2 was well over 1000 bucks. I walked someone through installing the 240mm rad combo.


----------



## electro2u

Quote:


> Originally Posted by *wermad*
> 
> So, my current setup is just too hot. I'm hitting the low sixties and idling in the low 40s. Even if it's been a bit chilly in SoCal, my temps are not the same. Also, my lepa is churning it's fan non stop as it obvious the entire cooling system isn't optimized with my current layout. I think I pushed the case's design in terms of wc thermal capability. I got a surprise when I got an offer for a different case. I hope to take delivery next week. Atx layout is something I'll solve with either a CM vga bracket or something diy.


I was doing trifire with the 295x2 that MSI is still deliberating over. Had a 360 and a 280 rad with a 4820K. Was near 60°C with AC blocks, and I hated the way it heated up the room. Not PC gaming much anymore. What software is it that's getting your system so hot?


----------



## Mega Man

Sorry wermad, mine never breaks 40°C :/

Maybe you need more rads? Or pumps?

And as far as I know they are, but I thought you were already blocked? Koolance, wasn't it?


----------



## $k1||z_r0k

When will this card get FreeSync? Sad that it's not working with dual-GPU cards yet.


----------



## Dagamus NM

Quote:


> Originally Posted by *Orivaa*
> 
> Ha, I'm also waiting for a Caselab's case, and also scheduled for June 5th. Which one you getting?


I have two Merlin SM8s coming, one standard and one reverse, for a pair of 5960X workstations.

I opted for the pedestal as well as the 120mm top part for a horizontal rad and 3x 140 fans. I have the pedestals and boy, can you tell they are nice. I am still trying to decide what to do about paint; probably follow hanoverfist's guides on Daz and try to follow his method closely until I either develop my own or give up and pay somebody else to do it. I have the tools, just need to decide on the colors. I assume the rest of the case will get the same primer base, but I think I'll wait until they are in hand before finalizing anything. At least I will have the motherboard tray and frame to start planning and test-assembling everything while making my decision on paint. Yes, cannot wait for the 5th of June.


----------



## Dagamus NM

Quote:


> Originally Posted by *Mega Man*
> 
> D: kinda mad i never saw this https://www.visiontek.com/graphics-cards/liquid-cooled-series/visiontek-cryovenom-r9-295x2-detail.html


At $1200? No thanks; the XFX units are sub-$700 new. I can add a waterblock with a backplate and still be under $1K. The VisionTek blocks EK made do look pretty sweet, but they're not worth that much more, especially when you multiply it by two.


----------



## Mega Man

Did you also not see them OOS? That was in the $1k pricing time frame, which actually made it a good deal ($200 more for the block).


----------



## wermad

Quote:


> Originally Posted by *electro2u*
> 
> I was doing trifire with the 295x2 msi is deliberating about. Had a 360 and a 280 with a 4820k. Was near 60 with AC blocks and I hated the way it heated up the room. Not PC gaming much anymore. What software is it that's getting your system so hot?
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> sorry wearemad mine never break 40c :/
> 
> maybe you need more rad ? or pumps ?
> 
> and as far as i know they are, but i thought you already were blocked? koolance wasnt it ?
Click to expand...

I think my issue is that the heat is circulating inside rather than being properly exhausted. My case is too cramped, as I went beyond common sense with this case. The new one has a more efficient layout. It was either revise my water cooling drastically or get a new case.

Games are just punching out a lot of heat, and my PSU is not happy in its new spot. I blame myself for taking it too far, along with some poor design choices in the case. In the end it was stacked vs. DIY vs. a different case, and I went with a different case for more flexibility on the water side.

Even at idle with a few Chrome windows open, the system felt a tad hotter than what I was used to. At cold startup it's good, but after 10 minutes of browsing I can feel a good amount of heat pouring out of the case. Overall, just not efficient to my liking. With summer fast approaching, the boss doesn't want an oven, tbh







.


----------



## Mega Man

what case then ?


----------



## SAFX

What are the standard post-installation steps/settings for this card?

Just got done setting up a Sapphire 295x2 and ran Valley, but the numbers are basically the same as with my previous XFX 290 DD.

*What I've done so far:*

Installed 295x2 using separate 8pin VGA connectors to ports VGA1 and VGA3 on 1000w G2 SuperNova
Rad/fan installed in rear (push)
Driver - installed latest for W7 64, *14.501.1003.0*
MSI Afterburner - disabled *ULPS*
CCC - enabled *Frame Pacing*, set *Tessellation/Max Tessellation* to "Use application settings"
Ran GTA V benchmark, saw very high FPS compared to my previous 290, but crashed before scene 2 loaded.

*Questions*

GPU BIOS is outdated, can this impact FPS; should I update it?
Does AMD's beta driver work well with this gpu?
Did I miss anything else important?


----------



## xer0h0ur

I don't know which driver that is but I am currently using the latest Beta Catalyst 15.4.1 and its fine.


----------



## Elmy

Quote:


> Originally Posted by *SAFX*
> 
> What are the standard post-installation steps/settings for this card?
> 
> Just got done setting up Sapphire 295x2, ran Valley, but numbers basically the same as my previous XFX 290 DD.
> 
> *What I've done so far:*
> 
> Installed 295x2 using separate 8pin VGA connectors to ports VGA1 and VGA3 on 1000w G2 SuperNova
> Rad/fan installed in rear (push)
> Driver - installed latest for W7 64, *14.501.1003.0*
> MSIAFB - disabled *ULPS*
> CCC - enabled *Frame Pacing*, set *Tesselation/Max Tesselation* to "Use application settings"
> Ran GTA V benchmark, saw very high FPS compared to my previous 290, but crashed before scene 2 loaded.
> 
> *Questions*
> 
> GPU BIOS is outdated, can this impact FPS; should I update it?
> Does AMD's beta driver work well with this gpu?
> Did I miss anything else important?


Don't run the benchmark in windowed mode. Fullscreen only. Crossfire doesn't work in windowed mode.


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> what case then ?


I've yet to get it but I'll post pics once I take delivery.


----------



## Gambit74

Hello,

I got my 295x2 a couple of weeks ago and I pretty much built a whole new rig around it.
The rad is located in my Corsair 780T at the bottom front in push blowing air out.

Is this OK? I saw on one site that it should be above the card with the closed end tank at the highest point.

My case.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> 
> CCC - enabled *Frame Pacing*, set *Tesselation/Max Tesselation* to "Use application settings"


Should this be done? I haven't done this.

I have Frame Pacing enabled by default, and just changed Tessellation to "Use Application Settings" - what's the difference in the Tessellation? Does the AMD Tessellation look poopy or something?

Quote:


> Originally Posted by *Gambit74*
> 
> Hello,
> 
> I got my 295x2 a couple of weeks ago and I pretty much built a whole new rig around it.
> The rad is located in my Corsair 780T at the bottom front in push blowing air out.
> 
> Is this OK? I saw on one site that it should be above the card with the closed end tank at the highest point.
> 
> My case.
> 
> 
> Spoiler: Warning: Spoiler!


You want the end tank to be at the lowest point, but higher than the graphics card (just like you want a res above a pump). This is the ideal setup for the 295X2s radiator:


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know which driver that is but I am currently using the latest Beta Catalyst 15.4.1 and its fine.


I was using the beta 15 driver with my 290; it was way more stable than 14. The GTA V benchmark always crashed on 14.
I started with 14 on the 295x2 to avoid introducing too many variables, but it seems the beta driver is in my future again.

What BIOS is your GPU using? 15.044, or 15.045?


----------



## SAFX

Quote:


> Originally Posted by *Elmy*
> 
> Don't run the benchmark in windowed mode. Fullscreen only. Crossfire doesn't work in windowed mode.


295x2 runs natively in CF mode? did not know that

Valley test performance was likely skewed in windowed mode, I'll try again with fullscreen


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> Should this be done? I haven't done this.
> 
> I have Frame Pacing enabled by default, and just changed Tesselation to "Use Application Settings" - whats the difference in the Tesselation? Does the AMD Tesselation look poopy or something?
> You want the end tank to be at the lowest point, but higher than the graphics card (just like you want a res above a pump). This is the ideal setup for the 295X2s radiator:


As far as I know there is no such thing as AMD tessellation or Nvidia tessellation. It's all the same, with the only difference being the magnitude being employed. Nvidia's cards handle tessellation better, so they intentionally employ 64X tessellation in Nvidia-sponsored games when there is no visual difference between 8X and 64X tessellation. It's done to bring AMD's hardware, and coincidentally even their own 700 and 600 series, to its knees.
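To put rough numbers on that magnitude point: the geometry cost of tessellation grows roughly with the square of the factor. A back-of-the-envelope sketch (the base mesh size and the function itself are made up for illustration, not how any engine actually counts triangles):

```python
def approx_triangles(base_tris: int, factor: int) -> int:
    """Very rough: triangle count after tessellating at a given factor."""
    return base_tris * factor * factor

base = 1_000                          # hypothetical un-tessellated mesh
at_8x = approx_triangles(base, 8)     # 64,000 triangles
at_64x = approx_triangles(base, 64)   # 4,096,000 triangles

# 64X pushes ~64x more geometry than 8X for no visible gain
print(at_64x // at_8x)
```

That extra load lands on every card's tessellator, which is why forcing 64X hurts the weaker ones most.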
Quote:


> Originally Posted by *SAFX*
> 
> I was using the beta 15 driver with my 290, it was way more stable than 14. GTA V benchmark always crashed on 14.
> I started with 14 on the 295x2 to avoid introducing too many variables, but it seems the beta driver is in my future again.
> 
> What BIOS is your GPU using? 15.044, or 15.045?


Can't say off hand. I am at work so I will need to check my 295X2's BIOS version later.
Quote:


> Originally Posted by *SAFX*
> 
> 295x2 runs natively in CF mode? did not know that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Valley test performance was likely skewed in windowed mode, I'll try again with fullscreen


Yes, the 295X2 explicitly runs in crossfire at all times unless you go into the Catalyst Control Center, create an application profile for any given piece of software, and disable crossfire within said profile.

For what it's worth, this is convenient for tri-fire users, as I can uncheck crossfire in the CCC and it will disable tri-fire yet leave the 295X2 crossfired.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> As far as I know there is no such thing as AMD tessellation or Nvidia tessellation. Its all the same with the only difference being the magnitude being employed. Nvidia's cards handle tessellation better so they intentionally employ 64X tessellation in Nvidia sponsored games when there is no visual difference between 8X and 64X tessellation. Its done to bring AMD's hardware, and coincidentally even their own 700 and 600 series, to its knees.


Ah lame. Gameworks is nothing but scummy practices really.

And I was more referring to this:


----------



## BradleyW

Quote:


> Originally Posted by *Alex132*
> 
> Ah lame. Gameworks is nothing but scummy practices really.
> 
> And I was more referring to this:


Don't think of it as "AMD's tessellation". No such thing. It is simply a gauge to control the level of tessellation in a game. AMD Optimized means it will use the profile default; often there is no specified value, so it will just use the game's default tessellation level.

Example:
The only "game" that has a tessellation value in its CCC gaming profile is CRYSIS 2, to my knowledge. So, in effect, using AMD Optimized will just call upon the value specified in the CRYSIS 2 profile.


----------



## Alex132

Ah okay cool, got it! Thanks


----------



## BradleyW

Quote:


> Originally Posted by *Alex132*
> 
> Ah okay cool, got it! Thanks


No problem buddy!


----------



## Gambit74

Quote:


> Originally Posted by *Alex132*
> 
> You want the end tank to be at the lowest point, but higher than the graphics card (just like you want a res above a pump). This is the ideal setup for the 295X2s radiator:


Well I moved the rad to the back exhaust position and ran AIDA64 Extreme's stability test for the GPU, and it hit 74c and throttled faster than it did in the original position....

Anyway, I played Project Cars for two hours this morning; after exiting the game I could see in Open Hardware Monitor that GPU 1 had hit 74c. So throttling must have occurred.

Deciding the stock fan was not sufficient, I put the stock fans from my H100i in push/pull on the rad and wired them to the Corsair 780T's fan controller... On max settings, yes, it was a cool 62c, but OMG it was noisy.

What I've done now is take two of the Noctua NF-F12's off my H100i and use them in push/pull going out the back, much better.

Project cars test time.


----------



## kayan

So, I finally applied to join the club. I could really use some help overclocking my card. It just refuses to do so. I tried on my old AMD system with Trixx and MSI AB and got a blue screen before a full run of 3DMark. My card will run at stock, no problem. It hasn't throttled due to temp while at stock.

I don't get artifacts, it just blue screens. Here's my 3dMark benchmark:
http://www.3dmark.com/3dm/7083366?

Everything runs fine, except Witcher3 (as I posted a few days ago), and that's why I want to overclock the card. Any help is appreciated.


----------



## electro2u

These cards do not overclock well. They are too light on power delivery. Are you adding voltage and power %? Anyway, I doubt an overclock would help much with a specific game running badly.

Try completely uninstalling Trixx and AB (don't save settings). Reinstall AB and do the restart, then turn ULPS off in AB settings and restart again.


----------



## SAFX

Just replaced the stock rad fan with a Corsair SP120 Performance Edition (no downvolting).
I ran the GPU-Z bench for a few minutes while temps increased, but never heard an increase in fan RPM.
GPU-Z and MSI AB are showing mixed results, not sure if this is even accurate.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Just replaced stock rad fan with Corsair SP120 Performance Edition (not down-volted).
> Anything special that needs to be done? I ran gpuz bench for a few minutes while temps increased, but never heard increase in fan RPM.
> GPUZ and MISAFB are showing mixed results, not sure if this is even accurate


GPUZ shows the VRM fan while MSI AB should show the attached fan connector that can be used to power a fan for the radiator.

I have mine attached to my motherboard actually.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> I have mine attached to my motherboard actually.


Is the rad fan variable speed? I'm assuming with your setup it's constant RPM?


----------



## kayan

Quote:


> Originally Posted by *electro2u*
> 
> These cards do not overclock well. They are too light on power delivery. Are you adding voltage and power %? Anyway I doubt an overclock would help much with a specific game running badly.
> 
> Try completely uninstalling Trixx and AB (don't save settings). Reinstall AB and do the restart, then turn ULPS off in AB settings and restart again.


Yeah, I've tried adding 20-50% extra power, also various voltage steps, but still no dice. I just did as you suggested, so I'll play around with AB some now.

I bumped up core to 1030, PL% was 25, and the PC just shut down, no blue screen, no artifacts, this was during the 2nd part of 3dMark.


----------



## electro2u

Quote:


> Originally Posted by *kayan*
> 
> Yeah, I've tried adding 20-50% extra power, also various voltage steps, but still no dice. I just did as you suggested, so I'll play around with AB some now.
> 
> I bumped up core to 1030, PL% was 25, and the PC just shut down, no blue screen, no artifacts, this was during the 2nd part of 3dMark.


Hmmm. What PSU? Sounds like over-current protection went off. It's often the case that PSUs are multi-rail even when marketed as single rail, and with this one card it can be a problem. People with PSUs like this (my Seasonic 1250 is an example) have to use separate PCIE cables for each 8-pin connector on the 295x2. Each connector needs a minimum of 30 amps to itself, or 50 amps total on a single rail.
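The arithmetic behind those amperage figures, sketched out (12 V rail, watts = volts x amps; the 20 A per-rail OCP limit below is a made-up example of a multi-rail unit, not any specific PSU):

```python
RAIL_VOLTAGE = 12.0

def rail_watts(amps: float) -> float:
    """Power a 12 V rail can deliver at the given current."""
    return amps * RAIL_VOLTAGE

per_connector = rail_watts(30)  # 360 W per 8-pin, per the figure above
single_rail = rail_watts(50)    # 600 W if both connectors share one rail

# A hypothetical multi-rail PSU with 20 A OCP per rail caps out at 240 W,
# well under what one 8-pin on this card can ask for, so OCP trips and
# the PC shuts off mid-benchmark with no BSOD.
ocp_cap = rail_watts(20)
print(per_connector, single_rail, ocp_cap)
```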


----------



## kayan

Quote:


> Originally Posted by *electro2u*
> 
> Hmmm. What PSU? Sounds like over-current protection went off. It's often the case that PSUs are multi-rail even when marketed as single rail, and with this one card it can be a problem. People with PSUs like this (my Seasonic 1250 is an example) have to use separate PCIE cables for each 8-pin connector on the 295x2. Each connector needs a minimum of 30 amps to itself, or 50 amps total on a single rail.


I updated my rig in my signature, but I have a CoolerMaster v1000. It is single rail. It's not even two years old. :/ I'm really unsure of what to do.

Edit: And yes, I've got 2 separate PCI-E 8pins going into the card.


----------



## wermad

I ran a V1000 on a single stock 295x2 (tested both of my cards one at a time) with no issues.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> I ran a V1000 on a single stock 295x2 (tested both of my cards one at a time) with no issues.


Do you think it could be an issue with my PSU, or maybe GPU? As long as the GPU is stock, everything is fine. Nothing is overclocked on my system currently.


----------



## xer0h0ur

To be honest I don't really even get very much of a performance boost from overclocking to 1100/1700. I have gone back to 24/7 clocks of 1060/1500.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> I have mine attached to my motherboard actually.
> 
> 
> 
> Is the rad fan variable speed? I'm assuming with your setup it's constant RPM?
Click to expand...

800RPM till it hits 50'c, then it goes to 1500rpm. I can't seem to change this 50'c threshold with SpeedFan, even if the profile is set to not ramp up.
Not sure why, probably motherboard related.
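That behaviour matches a simple two-step fan table rather than a smooth curve. A minimal sketch of the logic described above (the 50c breakpoint and RPM values are from the post; the function itself is illustrative, not the motherboard's actual firmware):

```python
def fan_rpm(temp_c: float) -> int:
    """Two-step curve: 800 RPM below the 50c breakpoint, 1500 RPM at or above."""
    return 800 if temp_c < 50 else 1500

print(fan_rpm(42))  # idle browsing: stays quiet
print(fan_rpm(50))  # breakpoint hit: fan jumps straight to full step
```

A BIOS-level step like this usually overrides whatever SpeedFan asks for, which would explain why the profile change has no effect.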


----------



## Alex132

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *electro2u*
> 
> Hmmm. What PSU? Sounds like over-current protection went off. It's often the case that PSUs are multi-rail even when marketed as single rail, and with this one card it can be a problem. People with PSUs like this (my Seasonic 1250 is an example) have to use separate PCIE cables for each 8-pin connector on the 295x2. Each connector needs a minimum of 30 amps to itself, or 50 amps total on a single rail.
> 
> 
> 
> I updated my rig in my signature, but I have a CoolerMaster v1000. It is single rail. It's not even two years old. :/ I'm really unsure of what to do.
> 
> Edit: And yes, I've got 2 separate PCI-E 8pins going into the card.
Click to expand...

Try different connectors / not using extensions / different ports on the PSU.

I had something similar on my HX850 with my GTX690 - I switched from the hardwired PCIE cables to the modular ones, and it went away...


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> Do you think it could be an issue with my PSU, or maybe GPU? As long as the GPU is stock, everything is fine. Nothing is overclocked on my system currently.


So it only bsods after oc'ing? What bsod code do you get? A power issue would have shut down your psu. Vga issues would have gone to a blank screen.

Edit: try your cpu @ stock and then do a slight oc on the gpu.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> So it only bsods after oc'ing? What bsod code do you get? A power issue would have shut down your psu. Vga issues would have gone to a blank screen.
> 
> Edit: try your cpu @ stock and then do a slight oc on the gpu.


Nope, no bsod this time around, sometimes it blue screens sometimes it just shuts down. My CPU is NOT overclocked at all. My last attempt I bumped core clock to 1030, and then just bam...shut down, no warning.


----------



## Gambit74

Quote:


> Originally Posted by *kayan*
> 
> Everything runs fine, except Witcher3 (as I posted a few days ago), and that's why I want to overclock the card. Any help is appreciated.


I wouldn't bother overclocking your GPU. Witcher 3 runs poorly at the moment on AMD cards according to this.
http://attackofthefanboy.com/news/amd-promises-better-performance-for-the-witcher-3-and-project-cars/

Are you running the latest beta driver? That might help if AMD have made improvements for Witcher3.

Is your CPU at the stock 3.3ghz? I ask because I did the same Firestrike Extreme test and finished with 8777; my physics score off the CPU was 15691.


----------



## Alex132

PCars needs a boost, just got that game and I dont want it to run like poop


----------



## Gambit74

Quote:


> Originally Posted by *kayan*
> 
> Nope, no bsod this time around, sometimes it blue screens sometimes it just shuts down. My CPU is NOT overclocked at all. My last attempt I bumped core clock to 1030, and then just bam...shut down, no warning.


I typed the cpu oc bit before your last post. 

Why haven't you oc'd your cpu?
It seems strange to me to try to oc what is probably one of the most powerful gaming graphics cards and run the cpu at stock.


----------



## kayan

Quote:


> Originally Posted by *Gambit74*
> 
> I wouldn't bother overclocking your GPU. Witcher 3 runs poorly at the moment on AMD cards according to this.
> http://attackofthefanboy.com/news/amd-promises-better-performance-for-the-witcher-3-and-project-cars/
> 
> Are you running the latest beta driver? That might help if AMD have made improvements for Witcher3.
> 
> Is your CPU at stock 3.3ghz, I ask because I did the same firestrike extreme test and finished with 8777 my physics score off the CPU was 15691


Yeah, the only games I care about this year are all NVidia optimized, bah (with the exception of Battlefront). Makes me feel like switching, but know that there's no point. I saw that article a couple days back, I'm using beta driver 15.4.1.

My CPU is at stock. Thought about OC'ing it, but don't really need to right now. May try to take it up to 4 though, since I have a custom loop for just my CPU now (360mm of rad space for nothing, lol).


----------



## Gambit74

Quote:


> Originally Posted by *Alex132*
> 
> PCars needs a boost, just got that game and I dont want it to run like poop


Pcars runs fine for me, I did 30 laps of Le Mans today in LMP1 with 55 bots on the track and settings set to max. No idea of the frame rate but it wasn't jerky.


----------



## Gambit74

Quote:


> Originally Posted by *kayan*
> 
> Yeah, the only games I care about this year are all NVidia optimized, bah (with the exception of Battlefront). Makes me feel like switching, but know that there's no point. I saw that article a couple days back, I'm using beta driver 15.4.1.
> 
> My CPU is at stock. Thought about OC'ing it, but don't really need to right now. May try to take it up to 4 though, since I have a custom loop for just my CPU now (360mm of rad space for nothing, lol).


My CPU is at 4ghz with 1.09 volts,
with the H100i and two Noctua NF-F12's in push.

The current value is with the fans at a walking pace; the max was while playing Pcars, with the fans at about 50% lol.
Sneaky Pam.


----------



## SAFX

What's the difference between 15.044 and 15.045 BIOS?


----------



## xer0h0ur

I have never changed the vBIOS on any video cards so really I wouldn't even know where to begin with finding changelogs for a vBIOS.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have never changed the vBIOS on any video cards so really I wouldn't even know where to begin with finding changelogs for a vBIOS.


thanks!


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> thanks!


I suppose if I had to begin somewhere I would check techpowerup since gpu-z uploads to them I believe.


----------



## wermad

@kayan

Try running a few looped sessions of 3DMark 11 or Firestrike at stock. Make sure you have a monitoring program to graph your GPU's activity during the runs. Ensure both cores are loading about the same. Roll back the drivers and make sure your board's BIOS is up to date.
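The "both cores loading about the same" check is easy to script from a logged run. A sketch in Python (the sample numbers and the 10% tolerance are hypothetical; feed it whatever your monitoring tool exports):

```python
def cores_balanced(core0, core1, tol_pct=10.0):
    """True if the average load (%) of the two GPU cores differs by less than tol_pct."""
    avg0 = sum(core0) / len(core0)
    avg1 = sum(core1) / len(core1)
    return abs(avg0 - avg1) < tol_pct

# Hypothetical GPU-usage samples (%) logged during a benchmark loop:
print(cores_balanced([98, 97, 99], [96, 98, 97]))  # healthy crossfire scaling
print(cores_balanced([99, 98, 99], [40, 35, 42]))  # second core slacking: investigate
```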


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> What's the difference between 15.044 and 15.045 BIOS?


States it right there.

Quote:


> Originally Posted by *15.044*
> 
> Manufacturer: Sapphire
> Model: R9 295X2
> Device Id: 1002 67B9
> Subsystem Id: 1002 0B2A
> Interface: PCI-E
> Memory Size: 4096 MB
> *GPU Clock: 1018 MHz
> Memory Clock: 1250 MHz*
> Memory Type: GDDR5


Quote:


> Originally Posted by *15.045*
> 
> Manufacturer: Sapphire
> Model: R9 295X2
> Device Id: 1002 67B9
> Subsystem Id: 1002 0B2A
> Interface: PCI-E
> Memory Size: 4096 MB
> *GPU Clock: 1030 MHz
> Memory Clock: 1300 MHz*
> Memory Type: GDDR5


15.045 is the Sapphire OC edition BIOS.
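For what it's worth, the gap between the two BIOSes is small; a quick calculation from the clocks quoted above:

```python
def pct_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage clock increase of the OC BIOS over the stock one."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

core_gain = pct_gain(1018, 1030)  # roughly 1.2% on the core
mem_gain = pct_gain(1250, 1300)   # 4% on the memory
print(round(core_gain, 1), round(mem_gain, 1))
```

A software overclock in Afterburner or Trixx covers the same ground without touching the vBIOS.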


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> States it right there.
> 
> 15.045 is the Sapphire OC edition BIOS.


I must be blind...

It doesn't seem worth the risk for such a mild OC, right?


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> @kayan
> 
> Try running a few loop sessions of 3d11 or 3dFs at stock. Make sure you have a monitoring program to graph your gpu's activity during the runs. Ensure both cores are loading about the same. Roll back on the drivers and make sure your bios for board are up to date.


Yup, everything loads as it should at stock. Did it on omega and both beta drivers since. On two different systems, my old AMD and the Intel now.


----------



## SAFX

Is it advisable to enable Overdrive --> Power Limit 50% in CCC without OC'ing?

I just ran a few tests in 3DMark with Default, Extreme, and Ultra settings, 50% Power Limit did not seem to make much difference.
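That result makes sense if the card never hits its stock power cap in those tests. A PowerTune-style sketch of why raising the limit alone changes nothing (the 500 W cap figure and the proportional-throttle model are assumptions for illustration, not AMD's actual algorithm):

```python
def effective_clock(target_mhz: float, draw_w: float, cap_w: float) -> float:
    """Clocks only drop when the draw exceeds the power cap; otherwise unchanged."""
    if draw_w <= cap_w:
        return target_mhz
    return target_mhz * (cap_w / draw_w)

STOCK_CAP = 500.0             # assumed stock board power limit
raised_cap = STOCK_CAP * 1.5  # +50% in Overdrive

# At stock clocks the card stays under the stock cap, so the raised cap is moot:
print(effective_clock(1018, 450, STOCK_CAP) == effective_clock(1018, 450, raised_cap))
```

The extra headroom only starts to matter once an overclock pushes the draw past the stock cap.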


----------



## wermad

Delete


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> Yup, everything loads as it should at stock. Did it on omega and both beta drivers since. On two different systems, my old AMD and the Intel now.


Try 1020mhz. Maybe your core(s) is touchy. If you wanna try the ultimate torture test at stock (at your own risk): run prime + furmark simultaneously. I honestly don't recommend this unless you have no other testing methods left to try. If it fails, I would send it in for rma. If it survives then it could be just one or both cores don't oc at all. You running the stock 295x bios?

I ran this test with quad 6970s btw with a p67 board @4.9. Massive amounts of power, close to 1500w @ the kill-a-watt. This is one of the toughest test imho out there. And really risky. If you don't wanna run this, send in for rma.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Try 1020mhz. Maybe your core(s) is touchy. If you wanna try the ultimate torture test at stock (at your own risk): run prime + furmark simultaneously. I honestly don't recommend this unless you have no other testing methods left to try. If it fails, I would send it in for rma. If it survives then it could be just one or both cores don't oc at all. You running the stock 295x bios?
> 
> I ran this test with quad 6970s btw with a p67 board @4.9. Massive amounts of power, close to 1500w @ the kill-a-watt. This is one of the toughest test imho out there. And really risky. If you don't wanna run this, send in for rma.


Bios is stock. I've never messed with flashing before. Also, don't think I'll be running prime and Furmark. Sounds like my PC may implode, and I don't want that. It's brand new.


----------



## wermad

Lol, it's a crazy test. Very risky and not for the cautious one. I wouldn't try tbh. I've already spent enough dough.

I would rma it if it's new. 295x cores should be high binned and you may have a poor core or something else wrong. You also using the auxiliary gpu power on your mb (if it has one)?


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Lol, it's a crazy test. Very risky and not for the cautious one. I wouldn't try tbh. I've already spent enough dough.
> 
> I would rma it if it's new. 295x cores should be high binned and you may have a poor core or something else wrong. You also using the auxiliary gpu power on your mb (if it has one)?


Yeah, the card is new as of about 4 months ago, and it wasn't used for 2 months of that. Yeah, I am using the auxiliary power too. What do I tell them when I RMA it? It won't overclock, or something else?


----------



## wermad

Stability issue. I would refrain from saying oc, as that could raise a red flag for them. There's a chance you may get the same card back or a refurbished unit. Up to you


----------



## Alex132

Yeah just fold on both GPUs + 75% of your CPU power if you wanna test your PSU.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Stability issue. I would refrain from saying oc, as that could raise a red flag for them. There's a chance you may get the same card back or a refurbished unit. Up to you


Well, I tried what you said, 1020 was fine, as was 1025, no increase in volts or power settings.

I looked into my mobo, and it does have an auxiliary power via molex right next to the top pci-e slot. Kinda odd, but it says only to use if using more than 3 GPUs.

I updated my bios on the mobo, with no help, and then my system refused to boot, whee.


----------



## Juris

Hey guys sorry to bug with yet another "my 295x2 is throttling" post but wondering if your combined experiences can figure out a way out of this. My 295x2 doesn't have an easy life. Its driving this thing https://forums.frontier.co.uk/showthread.php?t=19065&page=45 (post 663 as Juris again) and hitting the magical 74c within a few minutes of play. Still get pretty good fps at 5680x2160.

Per the PC pic I'm running the 295x2 with standard rad & fan as a rear exhaust with the standard NZXT S340 non-PWM case fan at 1000rpm at the top also as exhaust. As intake/cpu cooling I'm running an NZXT Kraken x60 with standard loud fans, CPU temps idling at 33c going to 40c max (bloody amazing cooler).

The 295x2 rad has the tubes at the top, idling at 44c odd. When I had them at the bottom it was 46c odd, which is odd (am I causing long term problems with this?). I fitted the ModDiy cable to control the vrm fan via PWM but it literally had no effect whatsoever, not even 1c.

My last hope is to replace all the fans (2x X60 140mm, the 295x2 fan and the nzxt standard case fan), so just wondering what you think is the best option. I was looking at a combination of Corsair SP 140, SP 120 & AF140 as they will suit the look of the build, but I'm open to options, and as the 2 front rad fans are hidden, looks aren't that important there. Sorry for the long post but going slightly nuts trying to find a way to sort this. Cheers.


----------



## joeh4384

Quote:


> Originally Posted by *Juris*
> 
> Hey guys sorry to bug with yet another "my 295x2 is throttling" post but wondering if your combined experiences can figure out a way out of this. My 295x2 doesn't have an easy life. Its driving this thing https://forums.frontier.co.uk/showthread.php?t=19065&page=45 (post 663 as Juris again) and hitting the magical 74c within a few minutes of play. Still get pretty good fps at 5680x2160.
> 
> Per the PC pic I'm running the 295x2 with standard rad & fan as a rear exhaust with the standard NZXT S340 non-PWM case fan at 1000rpm at the top also as exhaust. As intake/cpu cooling I'm running an NZXT Kraken x60 with standard loud fans, CPU temps idling at 33c going to 40c max (bloody amazing cooler).
> 
> The 295x2 rad has the tubes at the top idling at 44c odd. When I had them at the bottom it was 46c odd, which is odd
> 
> 
> 
> 
> 
> 
> 
> (am I causing long term problems with this?). I fitted the ModDiy cable to control the vrm fan via PWM but it literally had no effect whatsoever, not even 1c.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My last hope is to replace all the fans, 2x X60 140mm, the 295x2 fan and the nzxt standard case fan so just wondering what you think is the best option. I was looking at a combination of Corsair SP 140, SP 120 & AF140 as they will suit the look of the build but I'm open to options and as the 2 front rad fans are hidden looks aren't that important there. Sorry for the long post but going slightly nuts trying to find a way to sort this. Cheers.


I run Corsair SP120 Performance Editions but they are pretty loud. My temps max at 72 in Sleeping Dogs, which for some reason makes the card hotter than anything else; most other games are 66-69, if I let the card control one SP120 and keep the other on medium with my fan controller. This setup is a tad warmer than maxing the 2nd fan, but it is pretty quiet. I get temps maxed in the 60s with both SP120s ramped, but it is loud.


----------



## Alex132

I don't get why everyone has such high temps, I settle out at around ~49 / 50'c _while folding_



I don't have A/C or anything, room temperature was 20.0 -> 20.2'c during that hour period.


----------



## SAFX

After several hours of benchmarking with varying settings and drivers, here's my results:
Quote:


> Originally Posted by *joeh4384*
> 
> I run Corsair SP120 performance editions but they are pretty loud..... I get temps maxed in 60s with both sp 120s ramped but it is loud.


How did you configure the SP120?
Quote:


> Originally Posted by *wermad*
> 
> Lol, it's a crazy test. Very risky and not for the cautious one. I wouldn't try tbh. I've already spent enough dough.
> 
> I would rma it if it's new. 295x cores should be high binned and you may have a poor core or something else wrong. You also using the auxiliary gpu power on your mb (if it has one)?


Hey man, you selling your rig? I'll fly to Cali to buy it


----------



## SAFX

After several hours of benchmarking with varying settings and drivers, here's my results:
Quote:


> Originally Posted by *joeh4384*
> 
> I run Corsair SP120 performance editions but they are pretty loud..... I get temps maxed in 60s with both sp 120s ramped but it is loud.


How did you configure the SP120? I've got the same fan on the rad but it spins at constant speed, never adjusts for temps.


----------



## xer0h0ur

Quote:


> Originally Posted by *Juris*
> 
> Hey guys sorry to bug with yet another "my 295x2 is throttling" post but wondering if your combined experiences can figure out a way out of this. My 295x2 doesn't have an easy life. Its driving this thing https://forums.frontier.co.uk/showthread.php?t=19065&page=45 (post 663 as Juris again) and hitting the magical 74c within a few minutes of play. Still get pretty good fps at 5680x2160.
> 
> Per the PC pic I'm running the 295x2 with standard rad & fan as a rear exhaust with the standard NZXT S340 non-PWM case fan at 1000rpm at the top also as exhaust. As intake/cpu cooling I'm running an NZXT Kraken x60 with standard loud fans, CPU temps idling at 33c going to 40c max (bloody amazing cooler).
> 
> The 295x2 rad has the tubes at the top idling at 44c odd. When I had them at the bottom it was 46c odd, which is odd
> 
> 
> 
> 
> 
> 
> 
> (am I causing long term problems with this?). I fitted the ModDiy cable to control the vrm fan via PWM but it literally had no effect whatsoever, not even 1c.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My last hope is to replace all the fans, 2x X60 140mm, the 295x2 fan and the nzxt standard case fan so just wondering what you think is the best option. I was looking at a combination of Corsair SP 140, SP 120 & AF140 as they will suit the look of the build but I'm open to options and as the 2 front rad fans are hidden looks aren't that important there. Sorry for the long post but going slightly nuts trying to find a way to sort this. Cheers.


You may want to instead flip the fan so its drawing in fresh air from outside to push through the radiator instead of drawing hot air from inside the case to pull through the radiator. I would also go ahead and put a 2nd fan for push/pull. Those two changes alone should make a significant difference for you.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Stability issue. I would refrain from saying oc as that could throw a red light for them. There's a chance you may get the same card back or a refurbished unit. Up to you


I decided to give the variant BIOS a shot. Will see how that goes and post back.


----------



## Juris

Quote:


> Originally Posted by *xer0h0ur*
> 
> You may want to instead flip the fan so its drawing in fresh air from outside to push through the radiator instead of drawing hot air from inside the case to pull through the radiator. I would also go ahead and put a 2nd fan for push/pull. Those two changes alone should make a significant difference for you.


Cheers for the suggestion. Just wondering though, wouldn't that mean I've got the 295x2 exhausting into the case, as well as the Kraken x60 doing the same, with only the top single fan to remove the heat? Like a hot air pincer movement on the GPU VRMs. I suppose I could set the front-mount Kraken as exhaust.

Either way, I guess it can't hurt to try for an hour or so.


----------



## SAFX

Quote:


> Originally Posted by *kayan*
> 
> I decided to give the variant BIOS a shot. Will see how that goes and post back.


How do you intend to flash the bios? I want to flash to 045 (currently on 044).


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> How do you intend to flash the bios? I want to flash to 045 (currently on 044).


Oh, I have no idea about flashing the bios, I'm just going to try the onboard switch.


----------



## Alex132

How to flash an ATI/AMD vBIOS.

I did it with my 5870s, it's not that hard. Just wouldn't do it for the sake of going 1018/1250 -> 1030/1300. These cards output enough heat as is.


----------



## xer0h0ur

Quote:


> Originally Posted by *Juris*
> 
> Cheers for the suggestion. Just wondering though wouldn't that mean I've got the 295x2 exhausting into the case as well as the Kraken x60 doing the same with only the top single fan to remove the heat?. Like a hot air pincer movement on the GPU VRMS. I suppose I could set the front mount Kraken as exhaust.
> 
> Either way I guess it can't hurt to try for a an hour or so.


You may be surprised once you try it. I also thought it wouldn't work well in my micro-ATX case, but back when I was running air cooled, adding a second fan and flipping the airflow dropped my temps enough to hold off throttling. I also didn't have the advantage you do of manually controlling your VRM fan. By the way, you won't be able to see the difference the VRM fan makes, since you can't monitor VRM temperatures on the 295X2, but it is making a difference for you.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> How to flash an ATI/AMD vBIOS.
> 
> I did it with my 5870s, it's not that hard. Just wouldn't do it for the sake of going 1018/1250 -> 1030/1300. These cards output enough heat as is.


I hear ya, just can't resist the urge to try and see what happens.

btw, what's the max temp before this card throttles, 75C?


----------



## xer0h0ur

It's supposed to throttle at 75C, but mine started throttling from 74C when it was on the original cooler.


----------



## Alex132

It throttles at 75C. But just stay below 73C really.


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> Well, I tried what you said, 1020 was fine, as was 1025, no increase in volts or power settings.
> 
> I looked into my mobo, and it does have an auxiliary power via molex right next to the top pci-e slot. Kinda odd, but it says only to use if using more than 3 GPUs.
> 
> I updated my bios on the mobo, with no help, and then my system refused to boot, whee.


Damn, your board has dual bios?
Quote:


> Originally Posted by *SAFX*
> 
> After several hours of benchmarking with varying settings and drivers, here's my results:
> How did you configure the SP120?
> Hey man, you selling your rig? I'll fly to Cali to buy it


I'm switching cases and upgrading/revising the cooling. I'm still loving my quads







. Can't let them go. I've yet to get the new case though.
Quote:


> Originally Posted by *kayan*
> 
> I decided to give the variant BIOS a shot. Will see how that goes and post back.


Cool, let us know what you find. You did try both bios on the card I assume?


----------



## xer0h0ur

I was under the impression all 295X2s had the two-way BIOS switch.


----------



## Mega Man

all 295x2s should have dual bios !~ afaik they are all ref cards which have it


----------



## kayan

Quote:


> Originally Posted by *xer0h0ur*
> 
> I was under the impression all 295X2's had the 2 way BIOS switch.


I think he meant the dual BIOS on my motherboard. I'm trying the alternate BIOSes on both the GPU and the mobo.


----------



## xer0h0ur

Ohhhh, brain fart. Yeah he was talking about the motherboard on that one.


----------



## Juris

Ok results. Some good and some very very bad.
Original: 295x2 push exhaust + Kraken Intake (silent or performance mode) = Throttle
1: 295x2 pull intake + Kraken intake = Throttle + tons of internal heat
2: 295x2 pull intake + Kraken exhaust (silent or performance mode) = throttle with less internal heat
3: 295x2 push intake + Kraken exhaust in performance mode (ie jumbo jet) = 68c - 69c with one huge problem...

My 1250w XFX Black Edition PSU (really a rebranded Seasonic) sounded like this after a few minutes (this isn't me obviously but it sounds the same) 



I turned the system off and repeated the process. Each time the 295x2 reached its 68c - 69c stable temps mark a few minutes later the PSU would do the same thing. Think I'll go back to looking at some replacement fans as a possible solution. Hopefully won't need to RMA the PSU but going to let the system cool down for a while. Scary.


----------



## electro2u

Quote:


> Originally Posted by *Juris*
> 
> Ok results. Some good and some very very bad.
> Original: 295x2 push exhaust + Kraken Intake (silent or performance mode) = Throttle
> 1: 295x2 pull intake + Kraken intake = Throttle + tons of internal heat
> 2: 295x2 pull intake + Kraken exhaust (silent or performance mode) = throttle with less internal heat
> 3: 295x2 push intake + Kraken exhaust in performance mode (ie jumbo jet) = 68c - 69c with one huge problem...
> 
> My 1250w XFX Black Edition PSU (really a rebranded Seasonic) sounded like this after a few minutes (this isn't me obviously but it sounds the same)
> 
> 
> 
> I turned the system off and repeated the process. Each time the 295x2 reached its 68c - 69c stable temps mark a few minutes later the PSU would do the same thing. Think I'll go back to looking at some replacement fans as a possible solution. Hopefully won't need to RMA the PSU but going to let the system cool down for a while. Scary.


Do you have the card powered by two separate PCIe cables, one running to each of the two 8-pin PCIe connectors on the card? That PSU (I have the Seasonic 1250 it's based on) is actually multi-rail, and running a 295x2 off a single PCIe output from the PSU is improper.


----------



## Juris

Quote:


> Originally Posted by *electro2u*
> 
> Do you have the card powered off of 2 separate PCIE cables running 1 to each of the 2 8-pin PCIE connectors on the card? That PSU (I have the Seasonic 1250 it's based on) is actually multirail and running a 295x2 off a single PCIE output from the PSU is improper.


Yep 2 separate 8-pin connectors, no splitters involved so this really shouldn't happen. Not good.


----------



## SAFX

Asked this question yesterday,

What setting is most typical for *Power Limit* in Overdrive/MSI, assuming no intent to OC? 25%, 50%, don't change, etc?


----------



## Alex132

Lol those capacitors. It's probably fine, but I'd RMA it.

My V1000 doesn't even get warm while folding on the R9 295X2. Like honestly, the air coming out of it is cold. Gotta love a platinum PSU on 230v though, essentially titanium rated









And with regards to temperatures, I have 0 intakes and no side panel on (yay 800D). The 295X2 radiator is exposed to open air.


Spoiler: bad pic







Push/pull greatly helps this dense radiator, and I would look for good static-pressure-oriented fans like EK Vardar, GTs, Cougar, Noiseblocker, be quiet! fans (in that order).

~1500 peak RPM on both should be more than enough, and allows you to throttle them down to ~800 RPM when idle too.

Quote:


> Originally Posted by *SAFX*
> 
> Asked this question yesterday,
> 
> What setting is most typical for *Power Limit* in Overdrive/MSI, assuming no intent to OC? 25%, 50%, don't change, etc?


Stock.

You'd only want to raise it if it was downclocking due to it trying to stay within power constraints and not temperatures - which I have never seen this card do.


----------



## xer0h0ur

I don't increase the power limit unless I am overclocking. Hell I don't even need to increase the power limit to overclock to 1060/1700.


----------



## Mega Man

i do on any and all my cards first thing, but as i have enough psu and watercooling, it just does not matter,


----------



## xer0h0ur

Yeah, now that it's all properly watercooled I see no difference in temperatures from raising the power limit all the way and adding +100 mV.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't increase the power limit unless I am overclocking. Hell I don't even need to increase the power limit to overclock to 1060/1700.


Quote:


> Originally Posted by *Alex132*
> 
> You'd only want to raise it if it was downclocking due to it trying to stay within power constraints and not temperatures - which I have never seen this card do.


Got it, and based on recent benchmarking, doesn't seem to make any difference, I'll set it back to stock.


----------



## MIGhunter

Good evening. So I decided to build a new PC so that I could move my current PC into my son's room. I ordered the 295x2 first, before the other parts. Then something came up, so I didn't get a chance to order the other stuff and might have to wait. Now I have a 295x2 sitting on my desk teasing me! I'm half tempted to put it in my current system, but IDK if it will be a real improvement over my 290x, or if my i5 OC'd to 4.2GHz would bottleneck it?

Anyway, I'll add a better picture later. You can add me to the list?


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah now that its all properly watercooled for me I see no difference in temperatures with raising the power limit all the way and the +100 mV.


Are you saying that's where you keep it, +100mv?


----------



## xer0h0ur

No I do not run any voltage increase or any power limit increase at all as a 24/7 overclock. I am just using 1060/1500 clocks right now.


----------



## SAFX

Why is the EVGA 1000W SuperNOVA G2 Gold not listed as a compatible PSU for a single-card setup?
I'm currently using that PSU, no issues so far.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> No I do not run any voltage increase or any power limit increase at all as a 24/7 overclock. I am just using 1060/1500 clocks right now.


That's a nice little OC with no voltage increase!








What's your performance increase, 5-10%?


----------



## PCModderMike

Quote:


> Originally Posted by *MIGhunter*
> 
> 
> 
> 
> 
> 
> 
> 
> Good evening, so I decided to build a new PC so that I could move my current PC into my Son's room. I ordered the 295x2 1st before the other parts. Then something came up so I didn't get a chance to order the other stuff and might have to wait. Now I have a 295x2 sitting on my desk teasing me! I'm half tempted to put it in my current system but IDK if it will be a real improvement over my 290x or if my i5 OC 4.2ghz would bottleneck it?
> 
> Anyway, I'll add a better picture later. You can add me to the list?


HTC1?


----------



## xer0h0ur

I can't say that I actually compared framerates at stock versus those clocks in any games. I will give that a shot later. Watching the Blackhawks making a comeback after getting a 3 goal shellacking in the first period.


----------



## SAFX

Quote:


> Originally Posted by *MIGhunter*
> 
> 
> 
> 
> 
> 
> 
> 
> Good evening, so I decided to build a new PC so that I could move my current PC into my Son's room. I ordered the 295x2 1st before the other parts. Then something came up so I didn't get a chance to order the other stuff and might have to wait. Now I have a 295x2 sitting on my desk teasing me! I'm half tempted to put it in my current system but IDK if it will be a real improvement over my 290x or if my i5 OC 4.2ghz would bottleneck it?
> 
> Anyway, I'll add a better picture later. You can add me to the list?


lol, photo looks like you're at a concert or something,

Actually, more like pyrotechnics going off for WWE...lol


Welcome!


----------



## MIGhunter

Quote:


> Originally Posted by *PCModderMike*
> 
> HTC1?


Ya, my camera has some stupid red tinge to it. How'd you know? It only does it with the camera; if I use the video camera function, it's fine.


----------



## Alex132

Quote:


> Originally Posted by *MIGhunter*
> 
> IDK if it will be a real improvement over my 290x or if my i5 OC 4.2ghz would bottleneck it?


It won't bottleneck it.
Quote:


> Originally Posted by *PCModderMike*
> 
> HTC1?


Lol HTC One broken sensors unite!


----------



## rdr09

Quote:


> Originally Posted by *Alex132*
> 
> It won't bottleneck it.
> Lol HTC One broken sensors unite!


some games might. here was BF4 with a single 290 stock with an i7 4.5 HT off . . .


----------



## Alex132

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> It won't bottleneck it.
> Lol HTC One broken sensors unite!
> 
> 
> 
> some games might. here was BF4 with a single 290 stock with an i7 4.5 HT off . . .
Click to expand...

That's not bottlenecking the GPU.

Bottlenecking is where the CPU is maxed out and cannot feed the graphics card fast enough because of it.
Bottlenecking would look like a pegged CPU utilization graph, with GPU utilization dipping and unable to reach 99%.

So just because an i7 5960X gets more FPS in a game than a 2500K with the same GPU doesn't mean the CPU is bottlenecking the GPU, unless the above conditions are met. It just means the CPU is faster.
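That definition can be turned into a quick heuristic. A hypothetical sketch — the utilization samples and thresholds below are invented for illustration, not taken from any real monitoring log:

```python
# Heuristic for the CPU-bottleneck definition above: a pegged CPU
# alongside a GPU that can't hold ~99% utilization.
# Thresholds (95% / 97% / sample fractions) are illustrative assumptions.

def is_cpu_bottleneck(samples, cpu_max=95.0, gpu_ok=97.0):
    """samples: list of (cpu_util_pct, gpu_util_pct) taken while gaming."""
    if not samples:
        return False
    cpu_pegged = sum(1 for c, _ in samples if c >= cpu_max)
    gpu_starved = sum(1 for _, g in samples if g < gpu_ok)
    # Bottlenecked only if the CPU is pegged AND the GPU dips most of the time.
    return cpu_pegged / len(samples) > 0.8 and gpu_starved / len(samples) > 0.5

# CPU pinned at ~100%, GPU dipping: bottleneck
print(is_cpu_bottleneck([(100, 80), (99, 75), (100, 85), (98, 90)]))  # True
# CPU has headroom, GPU at 99%: just a slower CPU, not a bottleneck
print(is_cpu_bottleneck([(70, 99), (65, 99), (72, 99), (68, 99)]))    # False
```

The second case is the "faster CPU gets more FPS" situation: the GPU is still fully fed, so no bottleneck.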


----------



## rdr09

Quote:


> Originally Posted by *Alex132*
> 
> That's not bottlenecking the GPU.
> 
> Bottlenecking is where the CPU is being over-utilized and cannot fully utilize the graphics card because of this.
> Bottlenecking would look like a maxed out CPU utilization graph, and then the GPU utilization dipping and not being able to reach 99%.
> 
> So just because an i7 5960X gets more FPS in a game than a 2500k with the same GPU - doesn't mean that the CPU is bottlenecking the GPU. Unless the above conditions are met. It just means the CPU is faster


never tried turning off HT with 2 290s, but in BF3 MP with only 2 7900-series cards, my minimum fps drops during huge fights with HT off. so i know not to turn off HT in BF4.

maxing out all cores at some points. that's with 2 7900-series cards.


----------



## Gambit74

Just drove 30 laps of Bathurst in the GT3 Porsche, 32 cars on the circuit, settings on Ultra. After exiting to Windows: GPU1 62C, GPU2 64C.

Noctua NF-F12s in push pull do the job nicely, venting out the back.

Running them off the case fan controller works. Shame they didn't think of something better to begin with.


----------



## Alex132

Quote:


> Originally Posted by *Gambit74*
> 
> Just drove 30 laps of Bathurst in the GT3 Porsche, 32 cars on the circuit, settings on Ultra. After exiting to windows GPU1 62c, GPU2 64%
> 
> Noctua NF-F12s in push pull do the job nicely, venting out the back.
> 
> Running them off the case fan controller works. Shame they didn't think of something better to begin with.
> 
> 
> Spoiler: Warning: Spoiler!


PCars? Wheel?


----------



## Gambit74

Quote:


> Originally Posted by *Alex132*
> 
> PCars? Wheel?


Yes and yes.


----------



## Alex132

Just sold my wheel, but got gifted PCars... wanted to know how it was with a controller


----------



## Gambit74

Quote:


> Originally Posted by *Alex132*
> 
> Just sold my wheel, but got gifted PCars... wanted to know how it was with a controller


I can answer that for you.... SUCKS!!!

Actually I've not tried a controller, but if you've driven with a wheel a controller is going to suck.


----------



## Alex132

Quote:


> Originally Posted by *Gambit74*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Just sold my wheel, but got gifted PCars... wanted to know how it was with a controller
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can answer that for you.... SUCKS!!!
> 
> Actually I've not tried a controller, but if you've driven with a wheel a controller is going to suck.
Click to expand...

Yeah probably gonna look into getting a DF 2 GT when that comes out


----------



## PCModderMike

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MIGhunter*
> 
> IDK if it will be a real improvement over my 290x or if my i5 OC 4.2ghz would bottleneck it?
> 
> 
> 
> It won't bottleneck it.
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> HTC1?
> 
> Click to expand...
> 
> Lol HTC One broken sensors unite!
Click to expand...


----------



## The EX1

Talk about coming aboard the AMD train with a punch. I have THREE Devil 13s heading my way haha









Two for a quad fire build and another to put in a friends computer. I'll be posting up some FPS charts, benches, and nice write ups for you guys. I know the card doesn't have a strong presence on here but I like to be wayyyy different and quad fire Devils is a good way to stand out *cough Roaches*.

EDIT: spelling


----------



## DividebyZERO

Quote:


> Originally Posted by *The EX1*
> 
> Talk about coming aboard the AMD train with a punch. I have THREE Devil 13s heading my way haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Two for a quad fire build and another to put in a friends computer. I'll be posting up some FPS charts, benches, and nice write ups for you guys. I know the card doesn't have a strong presence on here but I like to be wayyyy different and quad fire Devils is a good way to stand out *cough Roaches*.
> 
> EDIT: spelling


I look forward to hearing about them, as we haven't seen much news on them. I almost thought about getting an open-box one; the deals were pretty good.


----------



## The EX1

Quote:


> Originally Posted by *DividebyZERO*
> 
> I look forward to hearing about them as we haven't seen much news on them. I almost though about getting an open box one the deals were pretty good


I had my eye on a Devil even when I bought and water-blocked my 780 Classifieds. The card was $1599 though, which was just too much for me to spend at the time. When I saw an open-box Devil for $519 I had to buy it. That card just intrigues me, and I love companies like PowerColor who just do something different. Will the Devil be a big upgrade in GPU power? Probably not, it may even be a downgrade. I run my 780 Classies at 1306 core and over 8GHz on memory, and I'm G-Sync pegged at 144Hz on my Swift 95% of the time while gaming. I decided that if I was going to take the possible graphics "side grade", I was going to go big and just do a nasty quad-fire setup







. Now I have an excuse to sell my Swift and pick up a 4K FreeSync monitor when they come out. I am rebuilding my entire system for this dual-Devil build, and the cards, new motherboard, and new water gear will be here on Friday. I'll try to knock out as much of the build as I can over the weekend. I will likely just create a build thread and link it here for present and future readers. It should be a nasty build.


----------



## Mega Man

Quote:


> Originally Posted by *The EX1*
> 
> Talk about coming aboard the AMD train with a punch. I have THREE Devil 13s heading my way haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Two for a quad fire build and another to put in a friends computer. I'll be posting up some FPS charts, benches, and nice write ups for you guys. I know the card doesn't have a strong presence on here but I like to be wayyyy different and quad fire Devils is a good way to stand out *cough Roaches*.
> 
> EDIT: spelling


if onry they made waterblock :/


----------



## wermad

Has anyone tried putting some uni-blocks on the devil? (wondering if the pcb is too wide???)


----------



## Mega Man

i am sure you can, with one of the myriad of blocks out, but i hate uniblocks !~


----------



## wermad

They're ok. If you can integrate them w/ the stock shroud, it looks very sharp!

So I took delivery of more cooling stuff. I'm not gaming right now, as just 15 minutes warms up the room excessively imho. It's obvious my case is not circulating the air properly, and I'm taking action. I'll have a new setup going soon. I blew most of my cash on this upgrade, so it will be something half done; I'll have to settle for slow but progressive upgrades. Time to hit the overtime button @ work


----------



## TooManyAlpacas

Has anyone done or seen any guilds on modding the color of the Radeon logo on the front of the card like people do with 980 and other Nvidia cards. I have a the NZXT H440 Designed by Razer and I want the color of the Radeon logo to be green. If it comes down to it I will do it without and guild, but I would prefer one


----------



## Alex132

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> Has anyone done or seen any guilds on modding the color of the Radeon logo on the front of the card like people do with 980 and other Nvidia cards. I have a the NZXT H440 Designed by Razer and I want the color of the Radeon logo to be green. If it comes down to it I will do it without and guild, but I would prefer one


guild? You mean guide?

Just remove the LED so you have no clashing colors, easy to do and you keep the warranty.


----------



## TooManyAlpacas

Yes, I do mean guide, sorry, messed up while typing. And I know I can remove the LED, but I really want a light on my GPU to match my case.


----------



## The EX1

Quote:


> Originally Posted by *Mega Man*
> 
> if onry they made waterblock :/


This is where the limited production acts as a double-edged sword. The card is exclusive, but aftermarket support is non-existent







. The massive triple-slot cooler is what gives the Devil part of its personality though. I am moving to an E-ATX board to give as much room between the cards as possible. I'll also have them mounted 90 degrees with twin 180mms blowing straight up at them in my case. I have a portable A/C unit I use for benching, so if things get out of control I can always rely on that.

Quote:


> Originally Posted by *wermad*
> 
> They're ok. If you can integrate them w/ the stock shroud, it looks very sharp!


Ya, the aluminum heatsinks and copper blocks can be removed while keeping the two-piece shroud on the card. If the setup is just unplayable I will be doing this. Glad I have 1,200mm of radiator space


----------



## joeh4384

Quote:


> Originally Posted by *Mega Man*
> 
> if onry they made waterblock :/


I always wondered what kind of overclocking you could get with the improved power delivery on those under water.


----------



## electro2u

Quote:


> Originally Posted by *joeh4384*
> 
> I always wondered what kind of overclocking you could get with the improved power delivery on those under water.


It's still just three 8-pin connectors. 290s have a minimum of 2x8 each. I bet it makes almost no difference.

Heck, the 290X Lightning has 2x8-pin and a 6-pin. =)


----------



## The EX1

Quote:


> Originally Posted by *electro2u*
> 
> It's still just 3 8-pin conns. 290s have minimum 2x8 each. I bet it makes almost no difference.
> 
> Heck the 290x Lightning has 2x8 pin and a 6-pin. =)


Devils have 4 eight pins. They can pull over 700 watts just by themselves when overclocked.



EDIT: pic


----------



## electro2u

I looked up a pic just before I posted... apparently I'm blind. Yes, that should help considerably.


----------



## Mega Man

scary huh ?


----------



## Mega Man

soon we will need 3 1600w psus to run 1 card (







)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> soon we will need 3 1600w psus to run 1 card (
> 
> 
> 
> 
> 
> 
> 
> )


So long as the performance is there, power consumption means little to me


----------



## Mega Man

agreed, quad gpu card anyone >.>


----------



## wermad

Better hit Homes for 30amp breaker and 240 outlets







.

I haven't seen anything above [email protected] the kill-a-watt and my cards are still stock. Though, from what tsm gets once he's cranked the clocks, these Hawaiian cores start guzzling power like an H1 drinks diesel.


----------



## XCruzzerX89

So I have an R9 295X2, and I think we can all agree that it isn't enough GPU power (because we can never have enough GPU power). So I'm thinking of getting an R9 290X for tri-CrossFire. Will my RM 1000W be able to hold its ground, or will I have to get an EVGA SuperNOVA 1200 P2? I play at 4K, which is why I want more GPU power. Also, I want to get an MSI Lightning R9 290X.


----------



## wermad

1200 if you plan to oc.


----------



## rakesh27

I've got a trifire setup, and I remember before I changed out my PSU I was running an Enermax Galaxy SLI/CrossFire 1000W PSU...

It was ok, not a problem, but you need a quality PSU to pull this off. I don't see any problems, it should be ok...

You could always test it, and if you experience problems then upgrade your PSU...

Good luck


----------



## Alex132

Quote:


> Originally Posted by *XCruzzerX89*
> 
> So I have a r9 295x2 and I think we all agree that, it isn't enough gpu power (because we can never have enough gpu power). So i'm thinking of getting an r9 290x for tri crossfire. will my Rm 1000w be able to hold it's ground, or will I have to get an EVGA Supernova 1200 P2 Power Supply? Also I play in 4k that is why I want more gpu power. Also I want to get a msi lightning r9 290x


Even if you ignore how overpriced it is, the RM 1000 is still a very bad PSU, especially at higher loads. I would not trust it with a single R9 295X2, let alone tri-fire.









Quote:


> This is some of the worst ripple i have ever seen on a big brand named unit its worse then even the HighPower made Thermaltake Tough Power Grand and the HEC made EVGA 430/500 and thats pretty shocking.
> 
> Quote from techpowerup
> 
> The PSU couldn't deliver its full power at or above 44°C-45°C ambient (OTP triggered) over prolonged periods
> Ripple suppression at +12V and 3.3V rails was bad
> Less than 16 ms hold-up time
> Short distance between peripheral connectors
> The fan should spin during start-up to make sure it is working properly


Cooler Master V1200 or EVGA 1200 G2 should be very good choices for 1200w PSUs for tri-fire.
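To put rough numbers behind the tri-fire PSU advice, here's a back-of-the-envelope budget. The wattage figures are ballpark assumptions on my part, not measurements, so treat the output as a sanity check only:

```python
# Back-of-the-envelope PSU sizing for a 295X2 + 290X tri-fire rig.
# All wattages are rough assumptions for illustration, not measured values.

LOADS_W = {
    "R9 295X2": 500,            # approximate board power
    "R9 290X": 290,             # approximate, more when overclocked
    "CPU (overclocked)": 200,
    "Board/RAM/drives/fans": 100,
}

def recommended_psu(loads, headroom=0.20):
    """Sum the draw and add a safety margin so the PSU isn't run at its limit."""
    total = sum(loads.values())
    return total, total * (1 + headroom)

total, rec = recommended_psu(LOADS_W)
print(f"Estimated draw: {total} W, recommended PSU: ~{rec:.0f} W")
```

With these assumed loads the estimate lands around 1100 W drawn and ~1300 W recommended, which is why a quality 1200 W unit is about the floor for tri-fire, in line with the picks above.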


----------



## kayan

So, I got a chance to test out the secondary BIOSes on both my 295 and the motherboard. After updating the BIOS on the second switch on the mobo, at least my PC is booting every time. My benchmark score went down by about 500-600 points in 3DMark with the GPU overclocked to 1050. Is this due to not enough power, or something else?

Honestly I'm not all that concerned with scores, but rather just game performance, hence the attempted overclocks.


----------



## The EX1

So I started a build log for my quad-fire Devils. Don't want to clutter this thread with my setup.









My Build

I have some nice preliminary OCs with this Devil so far! Staying around 74c while gaming too.


----------



## xer0h0ur

After overclocking my cards, playing around with power limit and adding voltage I never once saw elevated temps while gaming or benchmarking because of it. Now that I began folding I definitely noticed my temps spike up with +50 power limit and +100 mV. Temps go from high 40's thru mid 50's to mid 50's thru low 60's. Folding has shown me there is an entirely different type of system stability that benchmarking and gaming doesn't reveal. While folding I get stability with no added voltage, lower overclocks and +50 power limit.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> After overclocking my cards, playing around with power limit and adding voltage I never once saw elevated temps while gaming or benchmarking because of it. Now that I began folding I definitely noticed my temps spike up with +50 power limit and +100 mV. Temps go from high 40's thru mid 50's to mid 50's thru low 60's. Folding has shown me there is an entirely different type of system stability that benchmarking and gaming doesn't reveal. While folding I get stability with no added voltage, lower overclocks and +50 power limit.


"My rig is 100% stable, I've tested with benchmarks and games!"

Okay, but is it folding stable?









With folding I can sit 1040 core (memory does nothing for folding IIRC, so haven't OC'ed that yet) with no power increase and be fine. I'm guessing most OCs taper out around 1100?


----------



## xer0h0ur

Well that is good to know. I will need to set up a 2nd profile within afterburner for folding with stock vRAM clocks, +50 power limit and light clocks of 1030MHz. Those settings were stable for me for 24 hours+. I just found out yesterday that disabling SLI/Crossfire will boost PPD a little. Still haven't tried that yet.


----------



## joeh4384

Quote:


> Originally Posted by *xer0h0ur*
> 
> After overclocking my cards, playing around with power limit and adding voltage I never once saw elevated temps while gaming or benchmarking because of it. Now that I began folding I definitely noticed my temps spike up with +50 power limit and +100 mV. Temps go from high 40's thru mid 50's to mid 50's thru low 60's. Folding has shown me there is an entirely different type of system stability that benchmarking and gaming doesn't reveal. While folding I get stability with no added voltage, lower overclocks and +50 power limit.


That is odd, my temps seem lower with folding compared to gaming.


----------



## xer0h0ur

Quote:


> Originally Posted by *joeh4384*
> 
> That is odd, my temps seem lower with folding compared to gaming.


Oh don't get me wrong. Generally speaking you're absolutely right. If its a taxing DX11 game @ 4K like say AC:U, Dying Light, Hitman Absolution, Tombraider etc. etc. then my temps are going to be low 50's to mid 60's regardless depending on the load. If I am not boosting the voltage then my folding temps are in the mid to high 40's. It was only with the added voltage that my temps spiked while I had not observed this before with added voltage while gaming. It was just a result I didn't expect.


----------



## Mega Man

Quote:


> Originally Posted by *XCruzzerX89*
> 
> So I have a r9 295x2 and I think we all agree that, it isn't enough gpu power (because we can never have enough gpu power). So i'm thinking of getting an r9 290x for tri crossfire. will my Rm 1000w be able to hold it's ground, or will I have to get an EVGA Supernova 1200 P2 Power Supply? Also I play in 4k that is why I want more gpu power. Also I want to get a msi lightning r9 290x


I think a 290 is a bad idea, do another 295x2!

In all seriousness, I recommend budgeting 300w for the CPU and 300w (min) per GPU.
Quote:


> Originally Posted by *The EX1*
> 
> So I started a build log for my quad-fire Devils. Don't want to clutter this thread with my setup.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Build
> 
> I have some nice preliminary OCs with this Devil so far! Staying around 74c while gaming too.


Dear god that is hot... I could never let mine get that warm


----------



## Sgt Bilko

And we have the 15.5 driver now

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Crossfire Profile for The Witcher 3 and updates for Project Cars being the highlights


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And we have the 15.5 driver now
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Crossfire Profile for The Witcher 3 and updates for Project Cars being the highlights


Quote:


> Known Issues:
> 
> The Witcher 3 - Wild Hunt : To enable the best performance and experience in Crossfire, users must disable Anti-Aliasing from the games video-post processing options. Some random flickering may occur when using Crossfire. If the issue is affecting the game experience, as a work around we suggest disabling Crossfire while we continue to work with CD Projekt Red to resolve this issue
> 
> Project CARS: Corruption may be observed if Anti-Aliasing is set to DS2M when run in Crossfire mode. As a workaround we suggest using a different Anti-Aliasing method.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> And we have the 15.5 driver now
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Crossfire Profile for The Witcher 3 and updates for Project Cars being the highlights
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Known Issues:
> 
> The Witcher 3 - Wild Hunt : To enable the best performance and experience in Crossfire, users must disable Anti-Aliasing from the games video-post processing options. Some random flickering may occur when using Crossfire. If the issue is affecting the game experience, as a work around we suggest disabling Crossfire while we continue to work with CD Projekt Red to resolve this issue
> 
> Project CARS: Corruption may be observed if Anti-Aliasing is set to DS2M when run in Crossfire mode. As a workaround we suggest using a different Anti-Aliasing method.
> 

Thank you, copy pasta is a pain on phones


----------



## kayan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thank you, copy pasta is a pain on phones


Why are you copying Pasta? ;-)


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Thank you, copy pasta is a pain on phones
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why are you copying Pasta? ;-)

Because spaghetti


----------



## DividebyZERO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Because spaghetti


----------



## kayan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And we have the 15.5 driver now
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Crossfire Profile for The Witcher 3 and updates for Project Cars being the highlights


Installed 15.5; my frames in W3 jumped from 30 to 55-60 with everything on Ultra, except Hairworks OFF and AA OFF. If Hairworks is turned on it drops down to around 35ish. I haven't played out in the world yet, but that is the FPS in the Baron's keep.

Can't wait to play more over the weekend!


----------



## xer0h0ur

I don't know if you guys are just stubborn or what, but you can lower the tessellation from 64x to 16x for a good performance hike, and you can edit a file to lower the AA on Geralt's hair so you can run Hairworks without the huge drop. This has been floating around the net for days.
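For reference, the two tweaks being passed around are a tessellation cap in Catalyst Control Center (Gaming > 3D Application Settings > Tessellation Mode: Override application settings, max level 16x) and an edit to the game's rendering config. Path and key name below are quoted from community posts, so treat them as approximate and back the file up first:

```ini
; <Witcher 3 install>\bin\config\base\rendering.ini  (hedged example;
; verify the key exists in your install before editing)
HairWorksAALevel=2   ; reportedly defaults to 8; 0-2 is far cheaper on Radeons
```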


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know if you guys are just stubborn or what but you can lower the tessellation from 64X to 16X for a good performance hike and you can edit a file to lower the AA on Geralt's hair so you can run hairworks without the super drop. This has been floating around the net for days.


Not stubborn; even with Hairworks tweaked I was dropping below 45fps at the settings I liked at 1440p (single card, obviously), so I was waiting for a Crossfire fix to be implemented so I could have the performance and quality I wanted


----------



## ColeriaX

Breathing some new life into an old friend. X99 and stuffz
Edit: Sorry for potato phone quality. Not cool like some people with awesome DSLRs


----------



## xer0h0ur

Potatoes are evolving.


----------



## Dagamus NM

Not bad shots for a phone. Looking good overall. I see you have the blocks I want. Stupid PPCs, always running out of stuff. It seems like I have to order one or two parts at a time; otherwise they end up not having half of the order after I place it. I tried to get a couple of those Vesuvius blocks, no go. Looks like I will stick with EK. Might as well; they have been good to me.

Anyhow, looking good. I need my blocks ASAP. I just finished playing Borderlands for a couple of hours, and when I closed Steam my PC froze. I went to press the reset button and noticed just how hot the front of my PC was. Grotesque.

I am still perplexed that a dual-Hawaii-GPU card comes with a 120mm cooler. I would think that 480mm per card would be OK, not a total of two 120mm rads (240mm of cooling space) for four Hawaii GPUs, as that is just crazy town.

So yeah, blocks blocks blocks. I want them, I need them, they are all I hunger for. Ok, maybe not.


----------



## glenn37216

The 15.5 drivers are FUBAR on my 295x2 rig. Crossfire is enabled, ULPS is off, and yet only one GPU is showing usage during gaming.
http://oi57.tinypic.com/2yowojo.jpg

http://oi62.tinypic.com/2exqxc3.jpg

Anyone else having issues with these drivers? Going to do a complete reinstall and try again, I guess.


----------



## Alex132

Quote:


> Originally Posted by *glenn37216*
> 
> the 15.5's Drivers are fubar on my 295x2 rig . Crossfire is enabled ,ULPS is off and yet only one gpu is showing usage during gaming.
> http://oi57.tinypic.com/2yowojo.jpg
> 
> http://oi62.tinypic.com/2exqxc3.jpg
> 
> anyone else having issues with these drivers? going to do a complete re-install and try again I guess.


Just roll back to 15.4 if you aren't going to play PCars / TW3 in Crossfire.


----------



## Orivaa

Quote:


> Originally Posted by *glenn37216*
> 
> the 15.5's Drivers are fubar on my 295x2 rig . Crossfire is enabled ,ULPS is off and yet only one gpu is showing usage during gaming.
> http://oi57.tinypic.com/2yowojo.jpg
> 
> http://oi62.tinypic.com/2exqxc3.jpg
> 
> anyone else having issues with these drivers? going to do a complete re-install and try again I guess.


Are you running the game in fullscreen?


----------



## glenn37216

Quote:


> Originally Posted by *Orivaa*
> 
> Are you running the game in fullscreen?


Yup. I reinstalled (clean-booted into safe mode, uninstalled everything + registry entries).

I don't know why, but my 295x2 hates these official 15.5's. It's almost as if my second GPU is detected but isn't running at full speed: Crossfire is showing in GPU-Z, and the icon saying Crossfire is enabled shows up in games/benchmarks etc., but it's still showing 0% usage on the 2nd GPU.



Loves the 15.3's though.


The 15.4's give me about the same fps and benchmarks but are terrible with frame pacing in BF4. Overall, and I have no idea why, the 15.3's are by far my best drivers in every major title that's out right now.


----------



## glenn37216

Oops, double post... sorry. :/


----------



## electro2u

My 295x2 saga has come to an end. MSI is sending a check for $1350, since I bought the card for $1500 at launch. The card decided to die on the buyer I sold it to, who sent it back to me. It would run (with artifacts) in Windows safe mode, but attempting to install drivers caused a black screen.

I am very happy. This was the same amount Newegg offered to give me in credit when the card refused to work in Crossfire in FFXIV about a year ago.


----------



## gatygun

Quote:


> Originally Posted by *electro2u*
> 
> My 295x2 saga has come to an end. MSI is sending a check for 1350$ since I bought the card for 1500$ at launch. Card decided to die on the buyer I sold it to and sent it back to me. It would run (with artifacts) in Win SafeMode but attempting to install drivers caused black screen.
> 
> I am very happy. This was the same amount Newegg offered to give me in credit when the card refused to work in Crossfire in FFXIV about a year ago.


I had this once with a GTX 280 card; the thing died after 3 years of service. I found out that I still had 1 week of warranty, sent it in, and didn't hear from them for 2 months. Called them up and they gave me a full refund, as the card obviously wasn't in stock anymore. Got like 400 euros back while the card was worth nothing anymore. Bought a 580 with it.


----------



## electro2u

Quote:


> Originally Posted by *gatygun*
> 
> I had this once with a 280 gtx card, thing died after 3 years of service. found out that i still had 1 week of warranty. Sended it in didn't hear 2 months from them. Called them up and they gave me a full refund as the card obviously wasn't in stock anymore. got like 400 euro's back. while the card was worth nothing anymore. Bought a 580 for it.


Truth is I told the rep I was prepared to accept fair market value... which would have been, what, $700? I did not expect them to offer $1350... I am never going to spend that kind of money on a graphics card again; it was an early midlife-crisis thing. My justification was that what I really wanted was an Aston Martin.


----------



## Roaches

That's a pretty massive return. Heck great timing for the coming Fiji launch as well.

Anyhow, I just received my Devil 13 replacement today; it's certainly a new card, as the serial number is different and the cosmetic condition is cleaner than the one I sent. Will install tomorrow.


----------



## Alex132

Quote:


> Originally Posted by *electro2u*
> 
> My 295x2 saga has come to an end. MSI is sending a check for 1350$ since I bought the card for 1500$ at launch. Card decided to die on the buyer I sold it to and sent it back to me. It would run (with artifacts) in Win SafeMode but attempting to install drivers caused black screen.
> 
> I am very happy. This was the same amount Newegg offered to give me in credit when the card refused to work in Crossfire in FFXIV about a year ago.












Luckkkkyyyyyyyy


----------



## Alex132

Quote:


> Originally Posted by *glenn37216*
> 
> 
> 
> Spoiler: Warning: Spoiler!


20244 Graphics score, stock?

With 1110/1500 I get 23780 on 15.5. 15.5 also re-added the Performance tab in CCC for me









http://www.3dmark.com/fs/4953732

Stock 21598 graphics score:

http://www.3dmark.com/3dm/7148243

Gotta go faster!

What is the normal OC range for a 295X2 core/mem anyway? 1200-1250 core and like 1600 mem?


----------



## Sgt Bilko

Mine does 1150 on the core and 1625 on the memory, but 1500 mem is a little faster in benches for me.

Still haven't tried it with Trixx to see if the extra voltage will help it there or not


----------



## Alex132

Mine is 1500 max on the memory so far; anything beyond that and error correction kicks in, and my score goes lower in Firestrike. Well, it's not high, but WAY better than my max overclock of like 2MHz on memory for my GTX 690.









GDDR5 is weird to overclock: it won't crash, it'll just start yielding lower FPS... so some people think they have a higher stable clock than they actually do...

Anyway, on 1130 core so far. Things are getting toasty at 61°C


----------



## Alex132

http://www.3dmark.com/fs/4954418

Around 1200/1500 seems to be my max. At 1200 I got about 3 frames where it looked like a bit of artifacting; might try to push it a bit later, but it seems like a fairly nice OC to me


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> 
> http://www.3dmark.com/fs/4954418
> 
> Around 1200 / 1500 seems to be my max. At 1200 I got about 3 frames where it looks like a bit of artifacting, might try push it a bit later but it seems like a fairly nice OC to me


That's quite nice indeed. I'm busy playing the Witcher rather than benchmarking, but I'll give Firestrike a few runs soon and see what my max is


----------



## Feyris

No longer have the 295x2 now either, for now... going Titan X on open bench only, because I can dump it for a Fury without losses, but I won't be building for over a month because of the irep waiting game.


----------



## Alex132

I feel like ~1200 is really high for a 295X2; I just can't find any information on them at all. Reviewers that got the card when it came out landed around 1070-1160.

Makes me tempted to push for like 1250 or something









I mean 100% stable at 1200/1250 - I'm folding at that right now. (Maybe the memory wasn't 100% stable at 1500, hence why I saw some artifacts in 3DMark.)


----------



## DividebyZERO

Yeah, sounds good to me; if not, it's higher than I've seen posted in here that I remember


----------



## electro2u

Quote:


> Originally Posted by *Alex132*
> 
> I feel like ~1200 is really high for a 295X2, I just cant find any information on them at all. Any reviewers that got the card when it came out had around 1070-1160.
> 
> Makes me tempted to push for like 1250 or something
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mean like 100% stable at 1200/1250 - I'm folding at that right now. (maybe memory wasnt 100% stable at 1500, hence why I saw some artifacts in 3DM)


Yeah, that's fantastic. Mine would flip out if I went higher than 1100 or so.


----------



## Feyris

The 295x2 is not putting out a display with an mDP > HDMI adapter on either port 1 or 4. Is this... normal? Nothing else plugged in. (I'm assuming the adapter is passive, from a GPU bundle.) DVI always worked perfectly though


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> 295x2 is not throwing a display with mdp > hdmi adapter on either 1 or 4. this....normal? nothing else plugged in. (im assuming adapter is passive from a gpu bundle) DVI always worked perfect though


Adapters suck; never trust them. Especially never trust an adapter that converts between very different standards, such as DP -> HDMI.


----------



## Feyris

Now it's doing this / no signal, on DVI (Asus 144Hz monitor)



When I shipped it to them I made sure it was well packaged and wrapped... it was fully working when I shipped it, too.


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> Now its doing this/no signal. On DVI (asus 144hz monitor)
> 
> 
> 
> When i shipped to them i made sure it was well packaged and wrapped...it was fully working when i shipped it too.


FYI it survived a trip to Africa, so most likely he's messing up. Unless fatal static... eh, rare chance I guess. Try switching to the other BIOS?

Does the monitor work? Check it with other GPUs.
Does the cable work? Check it with other cables.

That's generally how it looks for both of those things dying. Even a GPU. So just reduce variables.


----------



## glenn37216

Quote:


> Originally Posted by *Alex132*
> 
> 20244 Graphics score, stock?
> 
> With 1110/1500 I get 23780 on 15.5. 15.5 also re-added the Performance tab in CCC for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/4953732
> 
> Stock 21598 graphics score:
> 
> http://www.3dmark.com/3dm/7148243
> 
> Gotta go faster!
> 
> What is the normal OC range for a 295X2 core/mem anyways? 1200? 1250? core and like 1600 mem?


Those are great clocks for your particular brand. My XFX Core doesn't like to overclock at all.









That Firestrike run was with the 15.5's with frame pacing on. Terrible drivers for me; my best drivers are the 15.3's.
Stock clocks - 1018/1250, 23333 GPU score, 56C average temps / ambient room temp 80F

http://www.3dmark.com/fs/4905592


----------



## SLK

I sent my original VisionTek 295x2 in for RMA because the fluid would start gurgling under constant high load, like it was boiling or something. The RMA replacement does exactly the same thing. The card is installed with the radiator above the card and the tubes at the bottom. Is it just normal for these Asecrap pumps to make noises during operation?


----------



## xarot

It's been a while since I had the reference 295X2. How does Witcher 3 run with the latest drivers and Crossfire enabled? I still have my Ares III along with my Titan X SLI and would like to test the Ares III again; especially, I'd like to compare how it runs W3 in Ultra mode. I just rebuilt my custom loop with Koolance QDCs, so popping the Ares III in will be a breeze.


----------



## glenn37216

Quote:


> Originally Posted by *SLK*
> 
> I sent my original Visiontek 295x2 in for RMA because the fluid would start gurgling at a constant high load like the fluid was boiling or something. The RMA replacement does exactly the same thing. The card is installed with the radiator above the card and tubes at the bottom. Is this just normal for these Asecrap pumps to start making noises during operation?


I've noticed this when my case is open and I'm booting up: I can hear a slight gurgle, then it dissipates. Under load I hear nothing from the pumps. This is on an XFX Core edition.


----------



## fat4l

Here's my results ladz









[email protected]
2*8GB Corsair Dominator Plat 2400CL10
[email protected]/1600MHz (set randomly)


----------



## Alex132

Quote:


> Originally Posted by *glenn37216*
> 
> That's great clocks for your particular brand. My XFX core doesn't like to overclock at all .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That Firestrike run was with the 15.5's with framepacing on . Terrible drivers for me . My best drivers are the 15.3's ..
> Stock Clocks - 1018/1250. 23333 gpu score. 56c ave temps / ambient room temp 80f
> 
> http://www.3dmark.com/fs/4905592


Well, I guess I pay for a nice-OC'ing 295X2 with awful driver problems too; I get a lot of BSODs when folding. Specifically when folding + watching a video, or watching a video + opening MSI AB. It's weird.
Quote:


> Originally Posted by *fat4l*
> 
> Here's my results ladz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [email protected]
> 2*8GB Corsair Dominator Plat 2400CL10
> [email protected]/1600MHz (set randomly
> 
> 
> 
> 
> 
> 
> 
> )


Nice









Stock or water? And is that memory stable, or is error correction kicking in?

edit- now I gotta outdo you








Got my GPUs stable last night at 1240 while folding, gotta try 1251


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> Stock or water? And is that memory stable or ECC kicking in?
> 
> edit- now I gotta outdo you
> 
> 
> 
> 
> 
> 
> 
> 
> Got my GPUs stable last night at 1240 while folding, gotta try 1251


water, 40-42C load temps









Well, I was probably stable; no artifacts in 3DMark 2013... I will be testing more


----------



## Alex132

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Stock or water? And is that memory stable or ECC kicking in?
> 
> edit- now I gotta outdo you
> 
> 
> 
> 
> 
> 
> 
> 
> Got my GPUs stable last night at 1240 while folding, gotta try 1251
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> water, 40-42C load temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well i was prolly stable. no artifacts in 3d 2013... I will be testing more

GDDR5 VRAM is effectively unstable once it starts to lose performance as you raise the clocks. I can do 1600MHz, sure, but it will be slower than 1500MHz because GDDR5's error detection kicks in and has to re-send the transfers that the unstable 1600MHz corrupts.

Basically, when you go up in memory clocks but performance goes down, you've reached your limit.
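A toy way to picture that rule of thumb (purely my own sketch with illustrative numbers; not anything from the driver or 3DMark): treat a memory-clock sweep as a list of (clock, score) runs, and call the "real" max the clock with the best score, since past that point error detect/retry silently eats the gains.

```python
# Toy illustration: past its stable point GDDR5 doesn't crash -- error
# detect/retry just eats bandwidth, so the benchmark score starts falling.
# The practical max memory clock is therefore the one with the best score,
# not the highest clock that still completes a run.

def best_memory_clock(results):
    """results: list of (mem_clock_mhz, benchmark_score) tuples."""
    return max(results, key=lambda r: r[1])[0]

# Hypothetical Firestrike graphics scores at several memory clocks:
runs = [(1250, 21598), (1400, 23100), (1500, 23780), (1600, 23050)]
print(best_memory_clock(runs))  # -> 1500; the score drops past this point
```

The point of `max(..., key=score)` rather than "highest clock that didn't crash" is exactly the EDC behavior described above: 1600MHz completes the run but scores worse than 1500MHz.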


----------



## gatygun

Quote:


> Originally Posted by *fat4l*
> 
> Here's my results ladz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [email protected]
> 2*8GB Corsair Dominator Plat 2400CL10
> [email protected]/1600MHz (set randomly
> 
> 
> 
> 
> 
> 
> 
> )


Would love to see you play Witcher 3 + the first Metro with that setup. Go make some YouTube videos while testing settings


----------



## fat4l

Well, I kinda need some help now...
While I was playing with clocks and volts, one of my cores stopped showing up in the system. ***...
I'm hoping it didn't die lol.

I tried reinstalling drivers, Afterburner and Trixx, but it's still the same. Unchecked ULPS too.

Anyone?


----------



## Alex132

Quote:


> Originally Posted by *fat4l*
> 
> Well, I kinda need some help now .....
> While I was playing with clocks and volts, one of my cores stopped showing in the system. ***...
> I'm hoping I didnt die lol.
> 
> I tried to reinstall drivers and afterburner and trixx but still the same..Unchecked ULPS too.
> 
> Anyone ?


Use HWiNFO64 to see if it sees the GPU. It even showed my completely dead 5870 - but it came up really... messed up..









What were your settings for 1250 core? +100mV / +50% - right?


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> Use HWiNFO64 to see if it sees the GPU. It even showed my completely dead 5870 - but it came up really... messed up..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What were your settings for 1250 core? +100mV / +50% - right?


It shows only 1 GPU.

I was pushing +200mV I think / +50% power limit, on water... ~42C max temps.

Any ideas?
The screen froze; I restarted the PC, and after the restart it was like this...


----------



## Alex132

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Use HWiNFO64 to see if it sees the GPU. It even showed my completely dead 5870 - but it came up really... messed up..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What were your settings for 1250 core? +100mV / +50% - right?
> 
> 
> 
> it shows only 1 GPU
> 
> I was pushing +200mV i think / +50% on water.....~42C max temps
> 
> Any ideas ?
> The screen freezed, I restarted the PC, after the restart it was......

Make sure ULPS is disabled via the registry, not MSI AB.

And I was only overclocking with +100mV / +50%. I didn't want to go completely mad; I know the 295X2's VRMs and power delivery in general aren't amazing.
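Since "disable ULPS via the registry" comes up repeatedly in this thread: the commonly shared method is to search the display-adapter class key for EnableUlps and zero it in every AMD subkey, then reboot. A hedged example only; the four-digit subkey number varies per machine, so export the key before touching it:

```reg
Windows Registry Editor Version 5.00

; Example only -- the 0000/0001/... subkey number differs per system;
; repeat for every subkey that actually contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```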


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> Make sure ULPS is disabled via registery, not MSI AB.
> 
> And I was only overclocking with +100mV / +50%. I didn't want to go completely mad, I know the 295X2 VRMs and power delivery in general isn't amazing.


Well, I was pissed so I turned the PC off. Now I turned it on and it's working LOL.

What the hell?


----------



## Feyris

Sticky flux suddenly appeared after shipment (new info). GJ USPS... GJ. Means bad, bad things, I imagine


----------



## Alex132

Quote:


> Originally Posted by *Feyris*
> 
> Sticky flux suddenly appeared after shipment ( new info ) gj usps....gj. means bad bad things i imagine


295X2? White residue on the GPU? That's not USPS... that's something else.

Giga-trash cards have it a lot; solder flux or something. I don't know much about it.


----------



## Feyris

Quote:


> Originally Posted by *Alex132*
> 
> 295X2? white GPU jizz? That's not USPS... that's something else.
> 
> Giga-trash cards have them a lot. Solder flux or smth. I don't know much about it.


Amber flux


----------



## Alex132

GTA V still runs awfully for me, bouncing between 25-100% usage on both GPUs all the time


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> GTA V still runs awfully for me, bouncing between 25-100% usage on both GPUs all the time


Which driver you on?


----------



## wermad

I installed my mb and quads in the reverse-ATX chamber of my new case. Glad to report no sag. I'm sure the Koolance bridges are helping here a lot; it seems like it does sag slightly in ATX, but I feel the bridge is making the difference here. I'm glad I didn't go w/ BP adjustable aqua links (SLI links). There's a lot of work to be done and many parts missing. For now, I'll try to plumb it temporarily while the build is a work in progress. I also wanted to move my components out of the X9 to ready it for sale.

I have pics but, as usual, the OCN mobile image uploader sucks.

Edit:


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> GTA V still runs awfully for me, bouncing between 25-100% usage on both GPUs all the time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which driver you on?

This happened on both 15.4 and 15.5.

Installed both via driver sweeper, etc.


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> I installed my mb and quads in the reverse atx chamber of my new case. Glad to report no sag
> 
> 
> 
> 
> 
> 
> 
> . Im sure the koolance bridges are helping here a lot. Seems like it does slightly drop in atx, but I feel the bridge is making the difference here. I'm glad I didn't go w/ BP adjustable aqua links (sli links). There's a lot work to be done and many parts missing. For now, I'll try to plumb it temporarily while the build is a work in progress. I also wanted to move my components out of the x9 to ready it for sale.
> 
> I have pics but as usual, ocn mobile image uploader sucks
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Edit:


i call dibs when you sell it in a week !~


----------



## fat4l

Will be selling mine after seeing the 980 Ti!


----------



## Orivaa

Wait for the 300 series and Fiji to come out. Even if the 980 Ti performs better, prices should still drop a bit.


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> i call dibs when you sell it in a week !~


Gonna keep it longer than that. I know guys that are worse offenders than me. I was planning on expanding my loop and the CL came up for sale; it was hard to say no, since it's rare to have a CL come up for sale locally, let alone a TX10-D. After thinking it over for some time, I took the plunge. I recommissioned the old family machine, as my kids need access to a PC for school; the school sites are not mobile-friendly. It also gave me the chance to address my cooling, as it wasn't working out great in the X9.

Cards are sitting solid, and thinking about it some more, the 900D mb tray was a little bit flexible. Maybe the sturdier, thicker-gauge aluminum in the CL holds things better (along w/ the bridge). We'll see how things go.


----------



## veaseomat

I decided today that I'm going to embrace my 295x2s and skip the 390X generation. It would cost too much to replace them; I would probably need three 390Xs to substantially improve on my current quadfire.

I also decided to get custom waterblocks for my 295x2s to help the temps out, since I want them to last. I ordered the Koolance waterblocks for 150 each, which isn't too bad considering most blocks for these cards are 200 ea. They will be in a loop with my motherboard VRM block; to keep them cool I'll have two 240mm rads and a single 120mm rad.

My CPU, an FX-9590 clocked at 5.72GHz, is under an LD Cooling phase-change unit.

Fingers crossed delivery and installation goes smoothly.

I'll post my before and after max overclocks for you guys when it's all done.


----------



## wermad

So I'm gonna need some extensions; the Lepa cables are a tad short and won't reach the cards. Before I go DIY, anyone have recommendations? I used BitFenix in the past (GTX Titans and 690s). I'm thinking of getting some CM or Corsair thick-gauge PCIe cables if they're long enough for the DIY option.


----------



## gatygun

Quote:


> Originally Posted by *veaseomat*
> 
> I decided today that I'm going to embrace my 295x2s and skip the 390x generation. It would cost too much to replace them, would probably need 3 390s to substantially improve from my current quadfire.
> 
> I also decided to get custom waterblocks for my 295x2s to help the temps out since I want them to last. I ordered the koolance waterblocks for 150 each, which isn't too bad considering most blocks for these cards are 200 ea. they will be in a loop with my motherboard vrm block, to keep them cool I'll have two 240mm rads and a single 120mm rad.
> 
> My CPU is under a ld cooling phase change. Fx9590 clocked at 5.72.
> 
> Fingers crossed delivery and installation goes smoothly.
> 
> I'll post my before and after max overclocks for you guys when it's all done.


How much fps do you get at 1080p, everything maxed, in The Witcher 3 with that setup? Would love to see a vid.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> So I'm gonna need some extensions. Lepa cables are a tad short and won't reach the cards. Before I go diy, any one have a recommendations? I used Bitfenix in the past (gtx titans and 690s)? I'm thinking of getting some CM or Corsair thick gauge pcie cables if they're long enough for the diy option.


Using Bitfenix extensions on my 295X2. They're lovely quality for a very good price. Also no need for cable combs or such nonsense.

edit- using them on my CPU/Mobo/GPU.

Here's an old pic of them in their glory:



(Haven't got a good 295x2 pic - been too busy with work)


----------



## wermad

Tnx Alex. I'll get some when I order my remaining fittings.

Anyone going w/ the Ti?


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> Tnx Alex
> 
> 
> 
> 
> 
> 
> 
> . I'll get some when I order my remaining fittings.
> 
> Anyone going w/ Ti?


If I wasn't so invested in my 295X2, with the block and all....I just might consider going back to green and getting a 980 Ti.
However, I really do like my big, hot, and heavy R9 and I'm excited to get it blocked up and under water. Soon™


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Anyone going w/ Ti?


Very disappointing tbh.

Well, the TX disappointed me.

You gotta remember I got the 295X2 for a complete steal at $600. The 295X2 is $1800 locally.

The Titan X is $1600. The 980 Ti is $900.

I cannot justify the 980 Ti (no backplate?) at $900, especially when my 295X2 is ~10-30% faster


----------



## Jpmboy

I'm running both SLI Titan X and a 295x2 that I've had since launch. Very impressive how well the 295x2 is holding up (game wise - benches like a brick tho). Really hoping AMD hits it outta the park with the 390x cards.


----------



## PCModderMike

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Anyone going w/ Ti?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Very disappointing tbh.
> 
> Well, the TX disappointed me.
> 
> You gotta remember I got the 295X2 for a complete steal at $600. The 295X2 is $1800 locally.
> 
> The Titan X is $1600. The 980 Ti is $900.
> 
> I cannot justify the 980 Ti with (no backplate?) at $900. Especially when my 295X2 is ~10-30% faster









dem local prices

But I agree, 295X2 is a complete steal for the performance you get.


----------



## wermad

Keeping my quads; need 'em for the DPs. Maybe go 5x1 WQHD down the road







.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Anyone going w/ Ti?


Just read the review on Tom's Hardware.

The card looks good, and the power draw - just wow. Still, 295X2 vs Ti, the 295X2 is still just better, as long as there are Crossfire drivers. Waiting to see what happens with the 390X and how much it costs, methinks.


----------



## SAFX

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm running both SLI Titan X and a 295x2 that I've had since launch. Very impressive how well the 295x2 is holding up (game wise - benches like a brick tho). Really hoping AMD hits it outta the park with the 390x cards.


Just checked availability for 390x, this article says they will be released in one hour; is that today?

http://wccftech.com/amd-radeon-r9-390x-arrives-1h-2015-feature-hydra-liquid-cooling/


----------



## wermad

Wasn't HBM nixed on Pirate Islands? Fiji is now the HBM card, apparently (?).


----------



## PCModderMike

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> I'm running both SLI Titan X and a 295x2 that I've had since launch. Very impressive how well the 295x2 is holding up (game wise - benches like a brick tho). Really hoping AMD hits it outta the park with the 390x cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just checked availability for 390x, this article says they will be released in one hour; is that today?
> 
> http://wccftech.com/amd-radeon-r9-390x-arrives-1h-2015-feature-hydra-liquid-cooling/

Quote:


> the R9 390X will become available in 1H of 2015. This spans a six month period from January to June so it's quite wide.


Being that it's June now though, I would love to see what AMD has to offer.


----------



## wermad

Wccftech is not the most reliable source, tbh.

My guess: Q3, almost 2 years from Hawaii's launch.


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> Wccftech is not the most reliable source tbh
> 
> 
> 
> 
> 
> 
> 
> .
> 
> My guess, q3, almost 2 years from Hawaii's launch.


Oh damn, I saw "1H", thinking one hour; I was like, where's the countdown? This is a big deal! lol


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> Wccftech is not the most reliable source tbh
> 
> 
> 
> 
> 
> 
> 
> .
> 
> My guess, q3, almost 2 years from Hawaii's launch.


Yeah, I can't say I've actually ever seen or heard of Wccftech... but figured I would give it a look anyway.

Your guess sounds like a good estimate to me.


----------



## xer0h0ur

Um, the new series launches this month, and at no point has AMD even said Q3. It's this quarter, Q2: the 300 series first, then the Fury (Fiji) line.


----------



## Jpmboy

Quote:


> Originally Posted by *SAFX*
> 
> Just checked availability for 390x, this article says they will be released in one hour; is that today?
> 
> http://wccftech.com/amd-radeon-r9-390x-arrives-1h-2015-feature-hydra-liquid-cooling/


I was gonna say they have 30 days left (1H), but AMD is always delayed... gotta stay positive. It had better be a home run (like the 7970 was).


----------



## PCModderMike

Quote:


> Originally Posted by *xer0h0ur*
> 
> Um the new series launches this month and at no point has even AMD said Q3. Its this quarter. Q2. 300 series first then the Fury (Fiji) line.


Thank you for explaining that so politely.


----------



## xer0h0ur

I am not here to stroke anyone's ego. You have a wife and/or girlfriend for that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Wccftech is not the most reliable source tbh
> 
> My guess, q3, almost 2 years from Hawaii's launch.


They are getting pretty good with their rumours now.

I dare say we'll see it launch this month


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Um the new series launches this month and at no point has even AMD said Q3. Its this quarter. Q2. 300 series first then the Fury (Fiji) line.


Is this fiscal or calendar? As it's the start of calendar Q3, no?

Edit: and do read my post, I speculated. It's also anyone's guess until AMD makes an official launch date announcement.


----------



## xer0h0ur

I enjoy arguing semantics about as much as punching myself in the balls.

Edit: Official statement from Lisa Su was 1H and rumor was end of Q2 which would still be this month. We will find out soon enough.


----------



## wermad

You're correct, still Q2. Mea culpa.

As for all the speculation from all of us: still just speculation. You can hang on to whichever of the AMD announcements you like, but the fact that NV launched the Ti worries me that AMD is still lagging. And now with no HBM for PI, having to wait for Fiji, I'm not sure they'll deliver. You don't have to be pressed hard to compete when you own 70% of the market. I would be extremely surprised (especially with all their financial woes) if they did meet this quarter. And I doubt it's the TX killer we thought it would be, and on time. July 1st, I'll be looking forward to that, sir.









----------



## Mega Man

my speculation is 12.5,

just 12.5 nothing else !~ and please feel free to tell me when you give me your case ( the new one )


----------



## wermad

Nevah

Edit: anyone have a "Hitler finds out" video for recent gpus? Those are hilarious!


----------



## DividebyZERO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They are getting pretty good with their rumours now.
> 
> I dare say we'll see it launch this month


This has to be the longest 2 months of my life. It feels like May was slower than ever, and it's only the 2nd of June :-(

EDIT: waiting on fiji and steam summer sale


----------



## F4ze0ne

June 24th is supposedly the launch date for reviews. I hope it's sooner though because I'm tired of these rumors and we need some more competition in this space.


----------



## PCModderMike

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am not here to stroke anyone's ego. You have a wife and/or girlfriend for that.


Quote:


> Originally Posted by *xer0h0ur*
> 
> I enjoy arguing semantics about as much as punching myself in the balls.
> 
> Edit: Official statement from Lisa Su was 1H and rumor was end of Q2 which would still be this month. We will find out soon enough.


OCN professionalism at its finest folks.


----------



## xer0h0ur

Professionalism: the skill, good judgment, and polite behavior that is expected from a person who is trained to do a job well

I. Am. Not. At. Work. I am also a straight shooter which rubs people the wrong way. I don't candy coat anything. If you don't like it report me like the rest of the babies that do so here.


----------



## Dagamus NM

@Wermad how do you have your LEPA cabling set up? I mean, which rails do you have for your GPUs, and what else do you have on those rails?

For my previous quad-GPU setups, it has been my understanding that instability could result from splitting any one GPU's power connections across two rails on a multi-rail PSU.

I have rails 3 and 4 feeding one x2 and rails 5 and 6 to the other.

Rail 4 is also going to two HDDs, two Intel 730 SSDs, and two 200mm case fans. Rail 6 runs my Aquaero 6 XT, which powers my Corsair H110 CPU AIO pump; its fans are on the motherboard.

I haven't loaded up my Aquaero yet, but when I do it will be two D5s and probably a dozen 140mm fans.

With how much power these cards suck up, I am concerned about my Aquaero drawing too much power away.

Any thoughts?

Nice pickup on the CL case. Mine should be in this week. One reverse and one standard, both are Merlin SM8's with pedestals, the 120mm top thing and casters. I expect this thing to be stupid heavy when done. My dimastech test bench is crazy heavy.


----------



## wermad

^^^ Mega Man helped me with the setup. Card one is using rails #3 & #4, card two is using rails #5 & #6. I removed one 8-pin cable from each dual-cable harness to clean it up a bit. Right now my new case doesn't allow me to run these cables as they're too short. I may swap them for longer Corsair or EVGA PCIe cables (from the beefy units) or get some Bitfenix extensions. I have my fans on rail #5, I believe.

I would calculate how much power you're gonna need for your fans and extra components (ie, pump, hdd, ssd, odd, led's, etc.).

The most I've seen was 1200W at the wall (Kill-A-Watt), with stock CPU and GPUs. All was under water: one D5 pump and 24 Corsair 120mm fans (~2.2W each).
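For anyone doing the rail math discussed above, here's a back-of-the-envelope sketch of the calculation. The rail limit and per-component wattages are made-up placeholders (the GPU per-connector draw especially is a guess), not the LEPA's real specs; check the amperage table on your PSU's label.

```python
# Rough per-rail power budget for a multi-rail PSU.
# All limits and draws below are illustrative assumptions.

RAIL_LIMIT_W = 30 * 12  # hypothetical 30 A rail at 12 V = 360 W

# Per-rail loads in watts (GPU per-connector draw is a guessed peak).
loads_w = {
    "rail 3": [280],                # one 8-pin feed of card one
    "rail 4": [280, 2 * 7, 2 * 4],  # card one feed + 2 HDDs + 2 SSDs
    "rail 5": [280, 24 * 2.2],      # card two feed + 24 fans @ ~2.2 W
    "rail 6": [280, 20],            # card two feed + pump/controller
}

for rail, parts in sorted(loads_w.items()):
    total = sum(parts)
    print(f"{rail}: {total:6.1f} W of {RAIL_LIMIT_W} W "
          f"({total / RAIL_LIMIT_W:.0%} loaded)")
```

If any rail comes out near its limit, move the small stuff (fans, drives) to a lighter rail rather than splitting one GPU connector pair across rails.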

edit: here's what I gotta deal with

edit: so the AMD confusion continues. OK, so the uber WCE 390X is now Fury X and Fury; these are probably the Ti and TX contenders. A 390X/390/380X... range will fall below these. The Fury line will be HBM and Fiji.

http://www.overclock.net/t/1558461/tt-amd-to-announce-the-radeon-r9-fury-x-and-fury-the-hbm-based-cards-not-just-the-r9-390x

http://www.tweaktown.com/news/45602/amd-radeon-r9-fury-watercooled-hbm-based-flagship-card/index.html


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Professionalism: the skill, good judgment, and polite behavior that is expected from a person who is trained to do a job well


Some people here respect that viewpoint and I am certainly one of them. But you definitely seem grumpier lately than I've known you to be over the past year, so I hope you aren't having a bad time right now.

For what it's worth I tried to get MSI to send me my dead card back so I could send you the LED and screws. They didn't want to come that far. I've thought about how to go about getting those things and I can't quite come up with anything plausible.


----------



## xer0h0ur

I don't have the patience to deal with people's fragile egos getting hurt when I point out they are wrong. That is not a problem that concerns me. If there is one class of people that have super fragile egos, it's definitely nerds. I work 10-hour days for my family's business 5 days a week. That is my time to exhibit professionalism. Not on a forum for something that is a hobby to me. I come here to help, get help or give information. Not to walk on eggshells explaining cutely why they are wrong so they don't get upset.

I appreciate the thought about the shroud LED. I have resigned myself to the fact I am screwed on that one. At least it still looks cool unlit. Not as visually satisfying but satisfying nonetheless. From the looks of it the Fury / Fury X will carry the exact same illuminated logo so I suppose once companies make full coverage waterblocks for those cards I may be able to get one of those.


----------



## PCModderMike

Quote:


> Originally Posted by *electro2u*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Professionalism: the skill, good judgment, and polite behavior that is expected from a person who is trained to do a job well
> 
> 
> 
> Some people here respect that viewpoint and I am certainly one of them. But you definitely seem grumpier lately than I've known you to be over the past year, so I hope you aren't having a bad time right now.
> 
> For what it's worth I tried to get MSI to send me my dead card back so I could send you the LED and screws. They didn't want to come that far. I've thought about how to go about getting those things and I can't quite come up with anything plausible.
Click to expand...

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't have the patience to deal with people's fragile egos getting hurt when I point out they are wrong. That is not a problem that concerns me. If there is one class of people that have super fragile egos its definitely nerds. I work 10 hour days for my family's business 5 days a week. That is my time to exhibit professionalism. Not on a forum for something that is a hobby to me. I come here to help, get help or give information. Not to walk on eggshells explaining cutely why they are wrong so they don't get upset.
> 
> I appreciate the thought about the shroud LED. I have resigned myself to the fact I am screwed on that one. At least it still looks cool unlit. Not as visually satisfying but satisfying nonetheless. From the looks of it the Fury / Fury X will carry the exact same illuminated logo so I suppose once companies make full coverage waterblocks for those cards I may be able to get one of those.


SMH. I honestly get where you're coming from. I work all week long in an office where I have to be on my best behavior....but that doesn't mean I turn around and vent on the internet and act like a douche bag.
There's a fine line between a "straight shooter" and someone who is just flat out rude, and I'm sorry to hurt YOUR fragile ego because I called you out.
The OCN professionalism initiative is not something I just made up to say you weren't acting professional. Most people around here do like to treat each other with mutual respect. If someone is incorrect about a piece of information during a discussion, of course correct them. And sure even with your best attempts, sometimes you'll offend someone just because they didn't use their butthurt cream for that day, but at least you remained calm, cool, and professional about the whole deal.

But I'm just giving my two cents, and I know it's likely to be falling on deaf ears. Sorry to be so off topic in here.


----------



## xer0h0ur

LOL, this thread has gone off topic more times than I can count. Generally speaking, I now regard this thread as a meeting area for 295X2 owners.

For the record, I never was upset by someone pointing anything out; it was quite the opposite. I made a statement and you were the one who took issue with it. My statement was "Um the new series launches this month and at no point has even AMD said Q3. Its this quarter. Q2. 300 series first then the Fury (Fiji) line." and you found it necessary to say "Thank you for explaining that so politely." as if I had broken some professionalism rule by starting my comment with "Um." I still see nothing wrong with the way I said it. But whatever.


----------



## xer0h0ur

For anyone looking to get better performance out of Witcher 3 I would recommend giving this a shot: http://www.nexusmods.com/witcher3/mods/104/?


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> For anyone looking to get better performance out of Witcher 3 I would recommend giving this a shot: http://www.nexusmods.com/witcher3/mods/104/?


I tried this and W3 just crashed. I tried 2 versions of it. I am on Win7, so dunno if that matters.


----------



## xer0h0ur

Yeah it doesn't seem to be working for everyone. Which card are you using it on?


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> ^^^mega man, helped me with the setup. Card one is using rail #3 & #4, card two is using rails #5 & #6. I removed one 8-pin cable from each of the dual-cable harness to clean it up a bit. Right now my new case doesn't allow me to run these cables as they're too short. I may swap them for longer Corsair or EVGA pcie cables (from the beefy units) or get some Bitfenix extensions. I have my fans on on rail #5 I believe.
> 
> I would calculate how much power you're gonna need for your fans and extra components (ie, pump, hdd, ssd, odd, led's, etc.).
> 
> The most I've seen was 1200w at the wall (kill-a-watt), w/ stock cpu and gpu's. all was under water, one d5 pump and 24 corsair 120mm fans (~2.2w each).
> 
> edit: here's what I gotta deal with


I cannot quite tell what I am looking at here. Is that case really that wide? What is the deal with that closest radiator? It appears to have a normal length and width but the height seems very thick unless that is some kind of optical illusion.


----------



## xer0h0ur

Quote:


> Originally Posted by *DividebyZERO*
> 
> i tried this and w3 just crashed, i tried 2 versions of it, i am on win7 so dunno if that matters.


You can also give the 15.200.*1038*.0 a crack: http://www.mediafire.com/download/3v6d8ggyn3732js/15.200.1038.0_dlls.rar

I read someone complaining about the game not even wanting to launch unless they used the 1023 dlls while on the 15.5 beta driver. I suppose you could try those too.

Me, I am going to try Asder's latest modified driver: http://forums.guru3d.com/showthread.php?t=399595

Hopefully it gets rid of my Dying Light BSOD crashing upon loading the game in crossfire/tri-fire.

Edit: Nevermind, BradleyW just let me know doing so would break Crossfire, as it's not supported yet through W10 drivers, hacked or otherwise. Thanks for the info BradleyW


----------



## Dagamus NM

Damn, I haven't even tried going back to playing Dying Light, as I am having issues with Borderlands 2. The game itself is super smooth, but when I start the game from the Sanctuary location I have a 50% chance of freezing, and about half the time it recovers after about ten seconds. Then when I close the game, the computer freezes about half of the time.

This started after adding the second x2. I had the 3930K at 4900MHz and have since dialed it back to its normal speed of 4200. ULPS is disabled, but only in AB. I will edit my registry keys.

I will move these to one of my 5960X builds unless I hold out for quad Fiji. I just don't want to adopt the first gen as each card will only have the 4gb hbm stacks. My other 5960X will get quad 980ti's just to have the best of both worlds.

Anyhow, I want to play dying light but if I have issues with borderlands then I am unsure if it will even be enjoyable.


----------



## Gambit74

Why does the 295x2 gurgle so much on startup? The H100i and H60 never made a sound.


----------



## Orivaa

Quote:


> Originally Posted by *Gambit74*
> 
> Why does the 295x2 gurgle so much on startup? The H100i and H60 never made a sound.


It's the on-card fan, not the radiator fan.


----------



## Dagamus NM

Quote:


> Originally Posted by *Gambit74*
> 
> Why does the 295x2 gurgle so much on startup? The H100i and H60 never made a sound.


Neither of mine do this. The fans run at full speed during startup and I imagine that the pumps do too which is somewhat annoying, but no gurgling here. Either way, I can hardly wait to get my Vesuvius or EK blocks in for these things.


----------



## Orivaa

Oh. Gurgle.
Well, I've no idea, then. Mine doesn't do that, so maybe yours is faulty?


----------



## F4ze0ne

Quote:


> Originally Posted by *Gambit74*
> 
> Why does the 295x2 gurgle so much on startup? The H100i and H60 never made a sound.


Is the radiator above the card?

I've only had my coolermaster AIO gurgle once when I first installed it. It settled in though eventually and doesn't do it anymore.


----------



## Alex132

My 295X2 gurgles shortly and kinda softly sometimes upon booting. I have the optimal positioning for the radiator too.
I really wouldn't be worried; a dying pump is a constant grinding sound anyway. So long as it's silent when operating, you're good, I'd say.


----------



## gatygun

Dude tests a 980 Ti (the water-cooled version) and gets the same sound. He explains it as air bubbles in your closed loop making noise; it should normally be eliminated after a few runs, once the bubbles get trapped in the radiator.


----------



## kayan

If it gurgles on startup I wouldn't be worried. If it starts doing that the entire time that it's on then maybe.

I had an original h80 that did that, along with an annoying slight grinding noise from the pump, all the time. Sent it in for repair and they sent me a brand new h80i back.

But like alex said, if it isn't grinding or gurgling all the time, I think you're fine.


----------



## Mega Man

Quote:


> Originally Posted by *Mega Man*
> 
> Slightly off topic
> 
> super excited, the "new" megaman if you will is avail for preorder, ( ill let you do some googling as to why if it interests you ) - i am super excited !~
> 
> see Mighty No. 9
> 
> http://www.mightyno9.com/


----------



## Feyris

Was checking my 295 boxes and found an HDMI-to-DVI adapter in the box that still has ATI Radeon logos on it... so weird, it's not in the other box either lol (I never put anything in there either).

I think it may be from the 7990 though, still weird.

I feel like dummy now


----------



## Medusa666

Hey guys,

I currently have a R9 295X2 in an Enthoo Pro Black, the radiator is mounted as rear exhaust, on top of the IO panel.



I'm buying a second 295X2, and would like to know what the optimal placement of the two radiators would be?

I got 1x120 spot in the rear, 3x120 in the top, and 2x120 in the front, even got 1 at the bottom.

Any ideas or suggestions?

Thank you!


----------



## Alex132

Quote:


> Originally Posted by *Medusa666*
> 
> Hey guys,
> 
> I currently have a R9 295X2 in an Enthoo Pro Black, the radiator is mounted as rear exhaust, on top of the IO panel.
> 
> 
> 
> I'm buying a second 295X2, and would like to know what the optimal placement of the two radiators would be?
> 
> I got 1x120 spot in the rear, 3x120 in the top, and 2x120 in the front, even got 1 at the bottom.
> 
> Any ideas or suggestions?
> 
> Thank you!


tbh, I wouldn't buy a 2nd 295X2. Rather stick with 1 and see what happens with the GPU market in the next 2 months. It's going to be interesting.

And if you can't wait, sell your 295X2 and buy 2 980 Tis.

Quad-crossfire doesn't play as nice as SLI / Crossfire anyway. And you'll need a good PSU to quad-fire 295s too. It's just not worth it at this point.


----------



## Medusa666

Quote:


> Originally Posted by *Alex132*
> 
> tbh, I wouldn't buy a 2nd 295X2. Rather stick with 1 and see what happens with the GPU market in the next 2 months. It's going to be interesting.
> 
> And if you can't wait, sell your 295X2 and buy 2 980 Tis.
> 
> Quad-crossfire doesn't play as nice as SLI / Crossfire anyway. And you'll need a good PSU to quad-fire 295s too. It's just not worth it at this point.


I see, that is an interesting viewpoint.

I totally adore this card though, right now I got it running in Crossfire with an R9 290X, so I'm at 3 GPUs total.

I figured, if I buy another 295X2, I'll be good for a few years down the road with 1440P and 4K.

My power supply is a 1600W LEPA, so that base is covered.

I believe I can get the 295X2 for 500-550$.


----------



## Alex132

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> tbh, I wouldn't buy a 2nd 295X2. Rather stick with 1 and see what happens with the GPU market in the next 2 months. It's going to be interesting.
> 
> And if you can't wait, sell your 295X2 and buy 2 980 Tis.
> 
> Quad-crossfire doesn't play as nice as SLI / Crossfire anyway. And you'll need a good PSU to quad-fire 295s too. It's just not worth it at this point.
> 
> 
> 
> I see, that is an interesting viewpoint.
> 
> I totally adore this card though, right now I got it running in Crossfire with an R9 290X, so I'm at 3 GPUs total.
> 
> I figured, if I buy another 295X2 I'l be good for a few years down the road with 1440P and 4K.
> 
> My power supply is a 1600W LEPA, so that base is covered.
> 
> I believe I can get the 295X2 for 500-550$.
Click to expand...

There's no point, Tri-fire is the max 'optimal' scaling point anyway.
This is probably the worst time to buy a new graphics card, we are very close to AMD revealing their hand and seeing how Nvidia reacts. And in the mean time you certainly do not have a slow graphics solution
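As a rough illustration of the diminishing-returns point about tri-fire being the practical scaling ceiling: the per-GPU contribution fractions below are invented for illustration only; real CrossFire scaling varies wildly per game and driver profile.

```python
# Toy multi-GPU scaling model: each added GPU contributes a smaller
# fraction of one card's performance. Fractions are made up.
base_fps = 60.0
contribution = [1.0, 0.85, 0.60, 0.30]  # GPUs 1 through 4

total = 0.0
for n, frac in enumerate(contribution, start=1):
    total += base_fps * frac
    efficiency = total / (n * base_fps)
    print(f"{n} GPU(s): {total:5.1f} FPS, {efficiency:.0%} of ideal scaling")
```

Under these assumed numbers, the fourth GPU adds far less than the second or third, which is the usual argument against quad-fire on value grounds.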


----------



## Medusa666

Quote:


> Originally Posted by *Alex132*
> 
> There's no point, Tri-fire is the max 'optimal' scaling point anyway.
> This is probably the worst time to buy a new graphics card, we are very close to AMD revealing their hand and seeing how Nvidia reacts. And in the mean time you certainly do not have a slow graphics solution


Yeah, maybe you are right, and I see that, I really do, but you know there is a feeling when you just gotta have something haha, and that's it with these cards for me.

Haha, you still haven't given me your suggestion for the radiator placement ; )

I might take your advice and hold on for awhile, and see what AMD reveals, maybe end up selling the 295X2 and the 290X and buying Fiji.


----------



## Alex132

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> There's no point, Tri-fire is the max 'optimal' scaling point anyway.
> This is probably the worst time to buy a new graphics card, we are very close to AMD revealing their hand and seeing how Nvidia reacts. And in the mean time you certainly do not have a slow graphics solution
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, maybe you are right, and I see that, I really do, but you know there is something, a feeling when you just gotta have something haha, thats it with these cards for me.
> 
> Haha you still haven't given me your suggestion to the radiator placement ; )
> 
> I might take your advice and hold on for awhile, and see what AMD reveals, maybe end up selling the 295X2 and the 290X and buying Fiji.
Click to expand...

If you really wanted to spend that money I would upgrade your CPU/motherboard to 5820k + X99 + DDR4.


----------



## kayan

Quote:


> Originally Posted by *Mega Man*
> 
> Originally Posted by Mega Man View Post
> 
> Slightly off topic
> 
> super excited, the "new" megaman if you will is avail for preorder, ( ill let you do some googling as to why if it interests you ) - i am super excited !~
> 
> see Mighty No. 9
> 
> http://www.mightyno9.com/


I am really stoked about this as well! Mega Man 3 was and is my fave NES game! I used to have a t-shirt with MM holding his mega buster and saying, "Say 'ello to my little friend"....at least I had it til a love interest of mine decided she'd keep it and never give it back. Ah well, anyway, yeah #9 is so exciting!

Edit due to the quote not quoting.

Also @ wermad or alex, or whoever it was who was helping me with the overclock issue about a week ago. I had sorta decided that I may try to sell my 295 before the new AMD cards come out, but earlier this week I had started to experience some black horizontal bars ranging from just 1 to so many that it blocked the view on a couple of browser pages. That's the first time anything with artifacting has occurred outside of benching. Anyway, it's also just randomly shut down windows when being put under load (and by load I mean gaming) this week. So I contacted XFX support this AM and am awaiting their reply.


----------



## Mega Man

Quote:


> Originally Posted by *Medusa666*
> 
> Hey guys,
> 
> I currently have a R9 295X2 in an Enthoo Pro Black, the radiator is mounted as rear exhaust, on top of the IO panel.
> 
> 
> 
> I'm buying a second 295X2, and would like to know what the optimal placement of the two radiators would be?
> 
> I got 1x120 spot in the rear, 3x120 in the top, and 2x120 in the front, even got 1 at the bottom.
> 
> Any ideas or suggestions?
> 
> Thank you!


waterblock them, !~ optimal placement for asetroll is in the trash

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> tbh, I wouldn't buy a 2nd 295X2. Rather stick with 1 and see what happens with the GPU market in the next 2 months. It's going to be interesting.
> 
> And if you can't wait, sell your 295X2 and buy 2 980 Tis.
> 
> Quad-crossfire doesn't play as nice as SLI / Crossfire anyway. And you'll need a good PSU to quad-fire 295s too. It's just not worth it at this point.
> 
> 
> 
> I see, that is an interesting viewpoint.
> 
> I totally adore this card though, right now I got it running in Crossfire with an R9 290X, so I'm at 3 GPUs total.
> 
> I figured, if I buy another 295X2 I'l be good for a few years down the road with 1440P and 4K.
> 
> My power supply is a 1600W LEPA, so that base is covered.
> 
> I believe I can get the 295X2 for 500-550$.
Click to expand...

I have seen them new for just above that, so yeah, you should.

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> tbh, I wouldn't buy a 2nd 295X2. Rather stick with 1 and see what happens with the GPU market in the next 2 months. It's going to be interesting.
> 
> And if you can't wait, sell your 295X2 and buy 2 980 Tis.
> 
> Quad-crossfire doesn't play as nice as SLI / Crossfire anyway. And you'll need a good PSU to quad-fire 295s too. It's just not worth it at this point.
> 
> 
> 
> I see, that is an interesting viewpoint.
> 
> I totally adore this card though, right now I got it running in Crossfire with an R9 290X, so I'm at 3 GPUs total.
> 
> I figured, if I buy another 295X2 I'l be good for a few years down the road with 1440P and 4K.
> 
> My power supply is a 1600W LEPA, so that base is covered.
> 
> I believe I can get the 295X2 for 500-550$.
> 
> Click to expand...
> 
> There's no point, Tri-fire is the max 'optimal' scaling point anyway.
> This is probably the worst time to buy a new graphics card, we are very close to AMD revealing their hand and seeing how Nvidia reacts. And in the mean time you certainly do not have a slow graphics solution
Click to expand...

Sick of hearing this. There is enough of a difference, and it is fun if for no other reason than just to know "I have this and 99% of people don't".


----------



## boredmug

Quote:


> Originally Posted by *Mega Man*
> 
> waterblock them, !~ optimal placement for asetroll is in the trash
> 
> 
> 
> 
> 
> 
> 
> 
> i have seen them for new for just above that so yea you should
> sick of hearing this, there is enough of a difference, and it is fun if for no other reason even if it is other then just to know " i have this and 99% of people dont "


If anything it'll make an expensive heater during the winter. :-D


----------



## Mega Man

Speak for yourself, it makes an expensive heater in the summer!

I put in a 3.5-ton AC (oversized it by .5 tons as I have heavy electronics etc.) and an additional 2x 14000 BTU (12000 BTU = 1 ton) for my PC room !~ as the central air can't keep up (rest of the house is cold !~ but not the PC room)


----------



## doctakedooty

I've been trying to sell my EK block for the 295X2 for almost a week and no one seems interested lol.


----------



## xer0h0ur

Not exactly everyone is willing to go the waterblock route with this card. Not surprised really.


----------



## doctakedooty

Yea, I know, I am doing everything but literally giving it away. I think this card needed a block, honestly, with how hot even the air off the stock radiator got.


----------



## xer0h0ur

Man you don't have to convince me, I already have them EK blocked lololol. And you're right, that Asetek AIO unit doesn't do it justice.


----------



## Mega Man

dittoed lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *doctakedooty*
> 
> I been trying to sell my ek block for the 295x2 for almost a week and no one seems interested lol.


I'd be interested but a few reasons why i can't:

1. Wife would murder me.....

2. Would need more rads

3. Would most likely want a new pump

4. See reason Number 1


----------



## Alex132

Finally took some pics of my 295X2;




Bonus, the radiator after just 3 weeks of use


Hopefully the next time I clean it I'll be able to put EK Vardars on the radiator


----------



## jackalopeater

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd be interested but a few reasons why i can't:
> 
> 1. Wife would murder me.....
> 
> 2. Would need more rads
> 
> 3. Would most likely want a new pump
> 
> 4. See reason Number 1


Reason 1 also translates for me as well, lol....must resist to retain living in house status, we actually don't have a doghouse for me to stay in, so I'll be relegated to the corner the dog poops in.


----------



## Medusa666

Quote:


> Originally Posted by *Alex132*
> 
> If you really wanted to spend that money I would upgrade your CPU/motherboard to 5820k + X99 + DDR4.


Already got the X99 platform with above written. : )

Quote:


> Originally Posted by *Mega Man*
> 
> sick of hearing this, there is enough of a difference, and it is fun if for no other reason even if it is other then just to know " i have this and 99% of people dont "


Exactly, I fear that is the main reason, logic being set aside.

Looking at this, it seems that it is not especially beneficial in most cases, but damn it is cool to have : )

http://www.techpowerup.com/forums/threads/crossfire-vs-sli-780ti-780-290x-290-x-dual-triple-quad.195818/


----------



## Alex132

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> If you really wanted to spend that money I would upgrade your CPU/motherboard to 5820k + X99 + DDR4.
> 
> 
> 
> Already got the X99 platform with above written. : )
Click to expand...

Then update your sig rig!


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> sick of hearing this, there is enough of a difference, and it is fun if for no other reason even if it is other then just to know " i have this and 99% of people dont "


It's not as much value for money as 2 GPUs, that's just a simple fact. And buying another 295X2 right now is not logical, especially seeing as how 1 is not slow by any means.


----------



## Intelligents

Quote:


> Originally Posted by *doctakedooty*
> 
> Yea I know I am doing everything but literally giving it away. I think this card needed a block honestly with as hot as even the stock radiator air got.


PM'd ya a few questions about it


----------



## Alex132

Does anyone else use the stock cooling + aftermarket fans plugged into their motherboard?

I am using 2x PWM Cougar fans plugged into one motherboard 4-pin header with a fan-cable splitter. In SpeedFan I have set up custom fan profiles keyed to the GPU temperatures.

However, it seems like SpeedFan doesn't give a crap and does what it wants anyway.

As soon as either GPU hits 50°C, the fans ramp up from the 42% I set to 100%.



Fan curves:



edit -

Tried using my CPU temps to control the fans; the exact same thing happened. As soon as one core hit 50°C the fans went from 42% -> 100%, which means it's either the fans or the motherboard.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Tried it out with using my CPU temps to control the fans, the exact same thing happened. Soon as 1 core hit 50'c the fans went from 42% -> 100%. Which means it's either the fans or the motherboard.


Figured it out.
The author of SpeedFan configured a default max temp, something like 50°C, for all temps; when exceeded, the fan goes to 100%.

Go to the *Temperatures* tab and, for each GPU temp, change the *MAX* temp (along the bottom of the screen) to something very high (100°C, whatever).
Now the temp/speed curve works as advertised. To be honest, SpeedFan does an awesome job; it's very responsive to the curve. I use CAM from NZXT for my Kraken CPU cooler, and it can't hold a candle to SpeedFan with respect to fan control.
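The behavior described above (follow the curve until a per-sensor MAX is exceeded, then force 100%) can be modeled as a plain curve-with-cutoff function. This is only an illustration of the logic, not SpeedFan's actual code; the curve points are made up.

```python
def fan_speed(temp_c, curve, max_temp_c=100.0):
    """Map a temperature to a fan duty cycle (%).

    curve: list of (temp, percent) points, sorted by temp.
    Above max_temp_c the fan is forced to 100%, mirroring
    SpeedFan's per-sensor MAX override.
    """
    if temp_c >= max_temp_c:
        return 100.0
    # clamp to the curve endpoints
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:
        return float(curve[-1][1])
    # linear interpolation between the two surrounding points
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

curve = [(30, 42), (50, 42), (70, 80), (80, 100)]  # example points
print(fan_speed(45, curve))   # stays at the 42% floor
print(fan_speed(60, curve))   # interpolated between 42% and 80%
print(fan_speed(55, curve, max_temp_c=50))  # forced to 100% by a low MAX
```

The third call shows exactly the symptom in the posts above: with MAX left at ~50°C, any temperature past it jumps straight to 100% regardless of the curve.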


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Tried it out using my CPU temps to control the fans, and the exact same thing happened. As soon as 1 core hit 50°C the fans went from 42% -> 100%. Which means it's either the fans or the motherboard.
> 
> 
> 
> Figured it out:
> the author of SpeedFan configured a default max temp, something like 50°C, for all sensors; when it's exceeded, the fan goes to 100%.
> 
> Go to the *Temperatures* tab and, for each GPU temp, change the *MAX* temp (along the bottom of the screen) to something very high (100°C, whatever).
> Now the temp/speed curve works as advertised, and to be honest, SpeedFan does an awesome job; it's very sensitive to the curve. I use CAM from NZXT for my Kraken CPU cooler, and it can't hold a candle to SpeedFan with respect to fan control.

Thank you so much!








Can't believe that was it!


----------



## gatygun

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd be interested but a few reasons why i can't:
> 
> 1. Wife would murder me.....
> 
> 2. Would need more rads
> 
> 3. Would most likely want a new pump
> 
> 4. See reason Number 1


Just buy her flowers and buy the parts with them. If she asks how much those parts cost, just say they were free. And if she mentions why you withdrew 300 euros, just say that the flowers were very specially picked just for her and cost a lot, but she's worth every little euro cent.

Works every time, believe me.


Spoiler: Warning: Spoiler!



R.I.P.


----------



## Mega Man

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> sick of hearing this, there is enough of a difference, and it is fun if for no other reason than just to know "i have this and 99% of people dont"
> 
> 
> 
> It's not as much value for money as 2 GPUs, that's just a simple fact. And buying another 295X2 right now is not logical, especially seeing as how 1 is not slow by any means.

1 not a vulcan

2 i dont care about logic, if i WANT it and i make the money i will spend it like i want

3 when you run high resolutions all 4 are HEAVILY used

amd has always been better at higher res, ie 4k/eyefinity, and more so if you are willing to tweak settings. i dont need to be told by someone without quadfire what it is like; with 3 full rigs running quadfire, i see the difference


----------



## cmoney408

i have no idea what you guys are talking about, but i did read in a 4-pin splitter review that you need to cut one of the yellow wires. Both fans can't/shouldn't send RPM data back to the mobo, so cutting one yellow wire lets the header control both fans while only one reports its speed back. i was looking into running 2 Noctuas off 1 header as well.


----------



## xer0h0ur

Alex campaigns pretty hard against quadfire. It's not for everyone, but just because you don't want it doesn't mean everyone else shouldn't.


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Alex campaigns pretty hard against quadfire. Its not for everyone but just because you don't want it doesn't mean everyone else shouldn't.


qft


----------



## veaseomat

I run crossfired 295x2's. I even ran 1 295x2 with 2 Sapphire Tri-X OC R9 290s (non-X) before I got the 2nd 295. I notice a difference. I play on a 1440p 144hz FreeSync monitor (waiting on multi-GPU support) and an LG 3440x1440 60hz IPS curved ultrawide. I tried a Samsung UD590 4k awhile back and decided 4k wasn't for me yet. Do what you want; I didn't ask around whether I *SHOULD* get another one because I knew everyone would say that's dumb, price/performance, wait for new GPUs... etc. I love mine and I even have Koolance waterblocks on the way for them now. do you boo boo.


----------



## SAFX

On startup, my 295x2 fan runs at 100% and sounds like a jet engine, and it doesn't calm down until the Windows logo appears.
Same problem in UEFI: fan screaming, very annoying.

Is there a fix for this?


----------



## Elmy

Quote:


> Originally Posted by *SAFX*
> 
> On startup, my 295x2 fan runs at 100%, sounds like a jet engine, and doesn't calm down until windows logo.
> Same problem when in UEFI, fan screaming, very annoying.
> 
> Is there a fix for this?


Earplugs....


----------



## SAFX

Quote:


> Originally Posted by *Elmy*
> 
> Earplugs....


Is that in the bios? I checked Advanced, no earplugs setting


----------



## Elmy

Quote:


> Originally Posted by *SAFX*
> 
> Is that in the bios? I checked Advanced, no earplugs setting


You could always do what I did to make them quieter.


----------



## SAFX

wow, that's nice, love the chrome, but seriously, anything I can do about the fans on startup?


----------



## veaseomat

Quote:


> Originally Posted by *Elmy*
> 
> You could always do what I did to make them quieter.


Hey dude I've seen your rig around a bunch recently, great job +1 sub. In that video you said the monitors were running at 120hz, are you actually getting near that in fps on bf4?


----------



## Elmy

Quote:


> Originally Posted by *veaseomat*
> 
> Hey dude I've seen your rig around a bunch recently, great job +1 sub. In that video you said the monitors were running at 120hz, are you actually getting near that in fps on bf4?


Yes I am getting 120 FPS give or take 20 FPS depending on the map and location on Ultra.


----------



## SAFX

What are you guys scoring on Fire Strike for single 295x2?

Just upgraded to the 15.5 beta (from 15.4); I managed only a 0.5% increase (CPU OC 4250, RAM 3000). Max FPS between drivers was relatively unchanged, 129/130.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> What are you guys scoring on Fire Strike for single 295x2?
> 
> Just upgraded to 15.5 beta (from 15.4), I managed only 0.5% increase (cpu oc 4250, ram 3000), max FPS between drivers was relatively unchanged 129/130.


FYI, always look at the graphics score, not the overall score.

My firestrike results:

stock: http://www.3dmark.com/fs/4953822 Graphics Score: 21598
1200/1500: http://www.3dmark.com/fs/4954418 Graphics Score: 25096


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> FYI always look at graphical score, not overall.
> 
> My firestrike results:
> 
> stock: http://www.3dmark.com/fs/4953822 Graphics Score: 21598
> 1200/1500: http://www.3dmark.com/fs/4954418 Graphics Score: 25096


Oh, graphics score, got it! Well, in that case, no difference between 15.4/15.5









Damn, how did you manage 1200/1500 OC? I'm flashing my bios tonight for 1030/1300, hopefully that goes well


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> FYI always look at graphical score, not overall.
> 
> My firestrike results:
> 
> stock: http://www.3dmark.com/fs/4953822 Graphics Score: 21598
> 1200/1500: http://www.3dmark.com/fs/4954418 Graphics Score: 25096
> 
> 
> 
> Oh, graphics score, got it! Well, in that case, no difference between 15.4/15.5
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn, how did you manage 1200/1500 OC? I'm flashing my bios tonight for 1030/1300, hopefully that goes well

1270 / 1500 is my max OC, stable for most benchmarks only









+100mV / +50% stock BIOS. Stock cooling (bar new fans).

I can get 1625 on memory, but it performs worse than 1500. 1505 also performs worse than 1500, so 1500MHz is my max memory OC


----------



## SAFX

Quote:


> Originally Posted by *295x2*
> 
> So i am already back, and my 295x2 Sapphire is now an OC edition.
> While i am running some tests, here is how you do it:
> 
> 1: you read the info here:
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> 
> 2) you prepare your USB key the same way, you save : 295X2_OC.txt as OC.rom and 295X2_OC_Slave.txt as OC2.rom
> You put them on your USB key, you have also atiflash.exe
> 
> 3) You boot on the key and get the command line
> 
> 4) atiflash -f -p 0 OC.rom (this flashes the master card)
> atiflash -f -p 1 OC2.rom (this flashes the slave card)
> 
> Reboot and enjoy


Preparing to flash now. The instructions say to "Check if card is Unlockable"; I'm assuming that applies only to flashing a 290 to a 290X, and not to the 295x2?


----------



## Alex132

No idea. I don't want to flash my cards, no point really


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> No idea. I don't want to flash my cards, no point really


Well, _you_ certainly don't given your current oc,







you got the cake, I just want some crumbs


----------



## SAFX

Changed my mind, no BIOS flashing for now; OCing with MSI Afterburner instead.

What is the safe OC range for this card with stock cooling? I'm concerned about the VRMs since I can't monitor their temps.
My current OC settings, testing with Fire Strike: temps never break 61°C, very smooth. Not sure if I should go higher.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Changed my mind, no bios flashing for now, ocing with MSIAF instead,
> 
> What is the safe OC range for this card with stock cooling? I'm concerns about VRM since I can't monitor its temp.
> My current OC settings, testing with Fire Strike, temps never break 61c, very smooth, not sure if I should go higher.


Good VRMs can go up to 125°C, no problemo. These VRMs don't really get that hot, around 70°C from what I have seen.

They passed my _touching the VRMs for 5 seconds, totally scientific patented test_® while under OC load, so I was fine with that. My 690s VRMs got hotter.

That being said, I don't use my OC permanently. I don't have a reason to









Also, just set the volts to +100mV and power draw to +50%. Those are still safe limits for the card.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Good VRMs can go up to 125'c no problemo. These VRMs don't get that hot really, around 70'c from what I have seen.
> 
> They passed my _touching the VRMs for 5 seconds, totally scientific patented test_® while under OC load, so I was fine with that. My 690s VRMs got hotter.
> 
> That being said, I don't use my OC permanently. I don't have a reason to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, just set the volts to +100mV and power draw to +50%. Those are still safe limits for the card.


I was under the impression voltage control was locked on these cards? It's enabled in MSI Afterburner, though I've never touched it; does that mean I can change it?

lol, patented? seriously, is that an actual test you perform?


----------



## xer0h0ur

Infrared thermometer ftw


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Infrared thermometer ftw


Not a bad idea, like this?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> I was under the impression voltage control was locked on these cards? It's enabled in MSI, though I've never touched it, does that mean I can change it?


It's locked as in within the set limits. +100mV is that set limit. You can thank the exploding GTX 590 for getting this whole thing rolling









Quote:


> Originally Posted by *SAFX*
> 
> lol, patented? seriously, is that an actual test you perform?


I was making a joke









And yes, it is. If it's near 90°C it's instantly too hot to touch. Being able to hold it for 5s comfortably means nearer to 50-65°C.

Quote:


> Originally Posted by *xer0h0ur*
> 
> Infrared thermometer ftw


Meh, expensive and hard to get here. Also kinda not worth it for just that.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Meh, expensive and hard to get here. Also kinda not worth it for just that.


Not that expensive, I just purchased this one, cheap, but who cares, I plan on using it at work to avoid the hot heads


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Meh, expensive and hard to get here. Also kinda not worth it for just that.
> 
> 
> 
> Not that expensive, I just purchased this one, cheap, but who cares, I plan on using it at work to avoid the hot heads

Not in the USA, unfortunately


----------



## Feyris

mafia connections


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> No idea. I don't want to flash my cards, no point really
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, _you_ certainly don't given your current oc,
> 
> 
> 
> 
> 
> 
> 
> you got the cake, I just want some crumbs

Mine will do 1175/1500 with +100mV and +50% Powerlimit in Afterburner.

needs +145mV to do 1200/1500, haven't gone above that as yet


----------



## Klocek001

hey y'all, I wanna ask you something:
Is my Super Flower Leadex Gold 850W gonna handle the 295x2? And how loud is it compared to, let's say, a 290x with a good aftermarket air cooler, like Sapphire's or PowerColor's 3-fan coolers?
thx in advance


----------



## al3x360

Personally I did a hardware modification on it: I put a Corsair SP120 Silent Edition on it and there is no noise coming from the GPU; on the temps, you are at 65°C max on full load.


----------



## Mega Man

Quote:


> Originally Posted by *Klocek001*
> 
> hey y'all I wanna ask you something,
> Is my Super Flower Leadex Gold 850W gonna handle the 295x2 ? And how loud is it compared to, let's say, a 290x with good aftermarket air cooling, like sapphire or powercolor 3 fan coolers?
> thx in advance


yes it will work fine
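For a rough sense of why an 850W unit is enough: the wattages below are ballpark assumptions (the 295X2's board power peaking around 500W, an overclocked quad-core maybe 150W, the rest of the system ~100W), not measured figures; just a back-of-the-envelope sketch:

```python
# Ballpark power-budget check; every wattage here is an assumption, not a
# measurement, and real transient draw can spike above these averages.

def psu_ok(psu_watts, gpu=500, cpu=150, rest=100, headroom=0.10):
    """Return (fits, total_load): does the load fit with the given headroom?"""
    load = gpu + cpu + rest
    return load <= psu_watts * (1 - headroom), load

fits, load = psu_ok(850)
print(fits, load)  # True 750: an 850W unit carries ~750W with ~10% to spare
```

A quality 850W unit like the Leadex Gold sits comfortably above that estimated load, which matches the "it will work fine" answer above.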


----------



## SAFX

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine will do 1175/1500 with +100mV and +50% Powerlimit in Afterburner.
> 
> needs +145mV to do 1200/1500, haven't gone above that as yet


1175 with stock cooling?


----------



## SAFX

What do you guys use for testing long term temps? Fire Strike on a 30m loop?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Mine will do 1175/1500 with +100mV and +50% Powerlimit in Afterburner.
> 
> needs +145mV to do 1200/1500, haven't gone above that as yet
> 
> 
> 
> 1175 with stock cooling?

Yes, still on stock cooling, just with 2 Noctuas on the rad instead of the original fan.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> What do you guys use for testing long term temps? Fire Strike on a 30m loop?


Lol firestrike is not good with temps.

For temps, I just use folding overnight. Otherwise GTA V is actually very good for raising temps.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SAFX*
> 
> What do you guys use for testing long term temps? Fire Strike on a 30m loop?
> 
> 
> 
> Lol firestrike is not good with temps.
> 
> For temps, I just use folding overnight. Otherwise GTA V is actually very good for raising temps.

Folding or Unigine Heaven or Valley for me


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Lol firestrike is not good with temps.
> 
> For temps, I just use folding overnight. Otherwise GTA V is actually very good for raising temps.


The GTA V benchmark runs and then exits after 2 minutes; not exactly long term, right?


----------



## Dagamus NM

So here is something I found interesting. I was messing with my setup after Dying Light kept crashing, and I decided to roll back from the 15.5 beta to 15.4. Somehow in this process I noticed my maximum screen resolution had two more notches above 1920x1080. I figured this was an error but decided to give it a go. I then matched the resolution in Dying Light to 3200x1800 and the game somehow became more stable. Artifacts are gone, completely gone. The game still crashes, but now it takes 20-30 minutes instead of two.

Anyhow, I was quite surprised to find the higher resolution on my Sony kdl55w950b as nothing in the literature claimed it. The quad 780ti's I had in there previously never listed this resolution as an option. Looks like Sony just used the same stuff as the x950b but I never knew until running the new catalyst drivers and the pair of 295x2's.

Haha, awesome.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Lol firestrike is not good with temps.
> 
> For temps, I just use folding overnight. Otherwise GTA V is actually very good for raising temps.
> 
> 
> 
> GTA V benchmark runs then exits after 2m, not exactly long term, right?

Play the game


----------



## SAFX

Had an idea last night.

I'm looking at my X41 Kraken rad compared to 295x2's rad, got me thinking, why not try the Kraken rad on the 295x2?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dagamus NM*
> 
> So here is something I found interesting. I was messing with my setup after dying light kept crashing and I then decided to roll back from the 15.5 beta to the 15.4. Somehow in this process I noticed my maximum screen resolution had two more notches above 1920x1080. I figured this was an error but decided to give it a go. I then matched the resolution in dying light up to 3200x1800 and the game somehow became more stable. Artifacts are gone, completely gone. The game still crashes but now it takes 20-30 minutes instead of two.
> 
> Anyhow, I was quite surprised to find the higher resolution on my Sony kdl55w950b as nothing in the literature claimed it. The quad 780ti's I had in there previously never listed this resolution as an option. Looks like Sony just used the same stuff as the x950b but I never knew until running the new catalyst drivers and the pair of 295x2's.
> 
> Haha, awesome.


That's called VSR. It's an option in CCC; it's basically internal supersampling. GCN 1.1 can do up to 3200x1800, GCN 1.2 can do up to 4k.

Look in CCC under "My Digital Flat Panels", then Properties, and you'll see a little check box called "Enable Virtual Super Resolution"

Works great with older games I've found.
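To put numbers on why VSR taxes the cards: the GPU renders every frame at the virtual resolution and downscales it to the panel. A quick sketch of the pixel-count cost of the caps mentioned above, relative to a native 1080p screen:

```python
# Pixel-count cost of the VSR modes mentioned above vs. a native 1080p panel.
def vsr_cost(w, h, base=(1920, 1080)):
    """Ratio of rendered pixels at (w, h) to the base panel resolution."""
    return (w * h) / (base[0] * base[1])

print(round(vsr_cost(3200, 1800), 2))  # 2.78x the pixels (GCN 1.1 cap)
print(round(vsr_cost(3840, 2160), 2))  # 4.0x the pixels (GCN 1.2 4k cap)
```

That 2.78x-4x render load is why VSR shines on older games, where the cards have performance to burn.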


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> Not a bad idea, like this?


Yup, I just have a cheapo $20ish one. I don't even know the difference between the more expensive ones and the cheaper ones anyways. I don't take the reading to be absolutely perfect, I take it more as a general temp range to have an idea as I can't attest to the cheap units' accuracy.


----------



## Dagamus NM

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's called VSR. It's an option in CCC; it's basically internal supersampling. GCN 1.1 can do up to 3200x1800, GCN 1.2 can do up to 4k.
> 
> Look in CCC under "My Digital Flat Panels", then Properties, and you'll see a little check box called "Enable Virtual Super Resolution"
> 
> Works great with older games I've found.


Ok, thank you for pointing that out!

It works really well on my TV. I have a couple of Asus 4k monitors coming Tuesday. I am curious to see how they compare to the VSR on my Sony TV. I guess the 295x2 is the first GCN 1.1 card that I have run; my previous AMD cards must have been GCN 1.0 Tahiti cards.


----------



## SAFX

OC'ed at 1110/1250, +50mv/+50PL,

Just ran another test using Valley... does this accurately reflect that throttling did _not_ occur?


----------



## Mega Man

Correct. Clocks were up but usage was not; that indicates the game did not need 100 percent usage (assuming your CPU was not pegged at full usage)


----------



## SAFX

Quote:


> Originally Posted by *Mega Man*
> 
> Correct. Clocks were up but usage was not; that indicates the game did not need 100 percent usage (assuming your CPU was not pegged at full usage)


CPUs were definitely not pegged, hovering around 50%

How can I determine if I'm getting bottlenecked by RAM, disk usage, or other critical areas?


----------



## Mega Man

You're not.


----------



## wermad

Hey guys, been waiting for pieces to be ordered and come in for my new case. I've done a bit of work, but it's hard to progress substantially with missing (and sometimes small) crucial pieces.

Here's a couple of pics:




Ended buying some inexpensive pcie extensions

Quote:


> Originally Posted by *Dagamus NM*
> 
> I cannot quite tell what I am looking at here. Is that case really that wide? What is the deal with that closest radiator? It appears to have a normal length and width but the height seems very thick unless that is some kind of optical illusion.


It's a big bastard, Caselabs TX10-D + pedestal:

http://www.caselabs-store.com/magnum-tx10/



Edit: @megaman, the answer is still no!


----------



## SAFX

Alternate AIO water cooling for this card?

I'm looking at my X41 Kraken rad compared to 295x2's rad, got me thinking, why not use the Kraken rad on the 295x2?


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> Hey guys, been waiting for pieces to be ordered and come in for my new case. I've done a bit if work but it's hard to progress substantially with missing (and sometimes small) crucial pieces.
> 
> Here's a couple of pics:
> 
> 
> 
> 
> Ended buying some inexpensive pcie extensions
> It's a big bastard, Caselabs TX10-D + pedestal:
> 
> http://www.caselabs-store.com/magnum-tx10/


WOW! That case is huge! Does it come with a lift? Lol, why are you going so big?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> WOW! That case is huge? Does it come with a lift? Lol, why you going so big?


He wants to workout while going to LANs.


----------



## Orivaa

Quote:


> Originally Posted by *wermad*


When I scrolled past this picture, I thought you had put a Coca Cola in there.


----------



## wermad

The opportunity came up to buy a TX10 and I took it. It's rare to have these monsters come up. I was already planning to expand my cooling (adding a second X9 or going w/ a custom scratch build), but this takes care of the "expansion" needs. It's a dual system, which allows me to throw the old family desktop hardware in there. Now my kids can use the computer w/out having to use my own rig. I got it as-is from a local member for a great price and I couldn't say no. It's got a custom finish.


----------



## Mega Man

By my count 5 more days ( give or take a few months) and it will be mine


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> The opportunity came up to buy a TX10 and i took it. Its rare to have these monsters come up. I was already planning to expand my cooling (adding a second X9 or going w/ a custom scratch build), but this takes care of the "expansion" needs. Its a dual system which allows me to throw in the old family desktop hardware in there. Now my kids can use the computer w/out having to use my own rig. I got it as is from a local member for a great price and i couldn't say no. Its got a custom finish.


Is it _too_ big? I mean, if it were painted white I'd confuse it for a refrigerator


----------



## wermad

Check out seross' tx10 + 3x pedestals + extended top, in white .


----------



## beasty54

Hi guys

I've recently decided to get into PC gaming again and have just purchased an R9 295x2 and 3 24" 1080p monitors. I want to run the 3 monitors in an Eyefinity configuration and then have my 4th monitor running separately. The problem I've run into is that I can't get Windows (8.1) to activate the fourth display; is it because 3 are using the DisplayPorts and 1 has to use DVI? I can get ALL the monitors working no matter what they are connected to, but only 3 at a time?
Thanks for your time guys, any help would be appreciated.


----------



## NBrock

Hey everyone,

Any of you guys notice a big difference swapping out thermal compound on these? I have a 295x2 that should be here tomorrow or the next day and want to know if it would be worth it before I cram it into my 250D.

Thanks!


----------



## SAFX

Quote:


> Originally Posted by *NBrock*
> 
> Hey everyone,
> 
> Any of you guys notice a big difference swapping out thermal compound on these? I have a 295x2 that should be here tomorrow or the next day and want to know if it would be worth it before I cram it into my 250D.
> 
> Thanks!


Great question, discussed in the past here, but would like to hear any updates/improvements from members too!


----------



## xer0h0ur

Quote:


> Originally Posted by *beasty54*
> 
> Hi guys
> 
> I've recently decided to get into PC Gaming again and have just purchased an R9 295x2 and 3 24" 1080p monitors. I want to run the 3 monitors in an Eyefinity configuration and then have my 4th monitor running separately. The problem i've run into is that i cant get Windows (8.1) to activate the fourth display, is it because 3 are using the display ports and 1 has to use DVI? I can get ALL the monitors working no matter what they are connected to, but only 3 at a time?
> Thanks for your time guys, any help would be appreciated.


You need active displayport adapters. Once you get to 3 monitors or more they are required.


----------



## beasty54

Quote:


> Originally Posted by *xer0h0ur*
> 
> You need active displayport adapters. Once you get to 3 monitors or more they are required.


Ok thanks, after a bit of reading I thought that might be the case. Do I need 1 for each monitor or do I just need one on the fourth?
Is it an issue because I'm using 4 monitors or is it an issue because one is using dvi?
I just find it strange that the r9 295x2 has 4 mini DP and one dvi but I can only use 3 monitors without buying extra hardware.


----------



## xer0h0ur

I couldn't possibly give you an explanation of why it's like this; I've forgotten it off the top of my head. I just remember that once you connect more than 2 monitors, you require active DisplayPort adapters. Since you're just using 1080p monitors, you can also get yourself an MST hub instead and connect directly to the hub, which provides an active signal. You can run up to 4 1080p monitors on an MST hub off a single DisplayPort.

I recently hooked up a video wall for a day trader, and this crazy SOB wanted 16 32" 1080p monitors, so I got 4 Club3D 4-port MST hubs and hooked them all up to a 295X2. I forgot to take pictures when it was up and running, but it's the largest video wall I have ever set up.


----------



## beasty54

Quote:


> Originally Posted by *xer0h0ur*
> 
> I just remember that once you connect more than 2 monitors then you require the usage of active displayport adapters..


I can connect 3 monitors just fine; I suppose I just don't understand why there are 4 DisplayPorts if I can't use them all without purchasing extra hardware


----------



## Mega Man

You can see below
Quote:


> Originally Posted by *beasty54*
> 
> Hi guys
> 
> I've recently decided to get into PC Gaming again and have just purchased an R9 295x2 and 3 24" 1080p monitors. I want to run the 3 monitors in an Eyefinity configuration and then have my 4th monitor running separately. The problem i've run into is that i cant get Windows (8.1) to activate the fourth display, is it because 3 are using the display ports and 1 has to use DVI? I can get ALL the monitors working no matter what they are connected to, but only 3 at a time?
> Thanks for your time guys, any help would be appreciated.


You are using dp on the card? What about the monitors?
Quote:


> Originally Posted by *NBrock*
> 
> Hey everyone,
> 
> Any of you guys notice a big difference swapping out thermal compound on these? I have a 295x2 that should be here tomorrow or the next day and want to know if it would be worth it before I cram it into my 250D.
> 
> Thanks!


Yes but I also switch to water block at the same time D:
Quote:


> Originally Posted by *beasty54*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> You need active displayport adapters. Once you get to 3 monitors or more they are required.
> 
> 
> 
> Ok thanks, after a bit of reading I thought that might be the case. Do I need 1 for each monitor or do I just need one on the fourth?
> Is it an issue because I'm using 4 monitors or is it an issue because one is using dvi?
> I just find it strange that the r9 295x2 has 4 mini DP and one dvi but I can only use 3 monitors without buying extra hardware.

The 79xx cards could do 2; the 2xx cards can do 3 without active adapters

Quote:


> Originally Posted by *xer0h0ur*
> 
> I couldn't possibly give you an explanation why its like this as off the top of my head I forgot it. I just remember that once you connect more than 2 monitors then you require the usage of active displayport adapters. Since you're just using 1080p monitors you can also get yourself an MST hub instead and connect directly to the hub which is providing an active signal. You can use up to 4 1080p monitors on an MST hub off a single displayport.
> 
> I recently hooked up a video wall for some day trader and this crazy SOB wanted 16 32" 1080p monitors so I got 4 club3d 4-port MST hubs and hooked them all up to a 295X2. I forgot to take pictures when it was up and running but its the largest video wall I have ever set up.


MST hubs do not need active adapters; MST hubs can use all passive adapters, one bonus of a DP hub.

This is a little outdated but still valid:

http://www.overclock.net/t/721931/active-vs-passive-displayport-adapters-the-truth

Again, this only applies to converting DP to a different type of connector, and current cards can do 3 passive outputs

(1 DVI and 2 DP, or 3 DP; those DP can use passive adapters)

If you use DP to DP then you can use as many DP outputs as the card has, as the signal does not have to change

Hope this helps
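Boiling the rule above down to a tiny sketch (a simplified illustration of the 2xx-series behavior being described, not a driver-level rule): native DP-to-DP links never need an adapter, up to 3 converted outputs can run passive, and every converted output beyond that needs an active adapter or an MST hub.

```python
# Simplified model of the adapter rule described above for 2xx-series cards.
# Assumption: native DP-to-DP monitors never need adapters, so only count
# monitors hooked up via DP->HDMI/DVI adapters or the card's DVI port.

def active_adapters_needed(converted_outputs, passive_limit=3):
    """converted_outputs: monitors on non-DP connections; the first
    `passive_limit` of them can use passive adapters."""
    return max(0, converted_outputs - passive_limit)

print(active_adapters_needed(3))  # 0: a 3-screen Eyefinity setup works as-is
print(active_adapters_needed(4))  # 1: the 4th converted display needs an active adapter
```

Which lines up with beasty54's symptom: 3 monitors work in any combination, and only the 4th refuses to activate.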


----------



## beasty54

Quote:


> Originally Posted by *Mega Man*
> 
> You are using dp on the card? What about the monitors?


I'm still reading up on all this, but I get why I'm having an issue now. I'm using mini DP to HDMI for the 3 monitors that will use Eyefinity, and then DVI to HDMI for the 4th. So as it stands I can use 3 monitors in any combination; do I now need to get an active mini DP to DVI adapter for the 4th monitor, or will I need active adapters on more than just that one?


----------



## xer0h0ur

Quote:


> Originally Posted by *Mega Man*
> 
> You can see below
> You are using dp on the card? What about the monitors?
> Yes but I also switch to water block at the same time D:
> The 79xx cards could do 2 the 2xx can do 3 without active adapters
> MST hubs do not need active adapters the MST hubs can use all passive adapters one bonus of a dp hub
> 
> This is a little outdated bit still valid
> 
> http://www.overclock.net/t/721931/active-vs-passive-displayport-adapters-the-truth
> 
> Again this is only for dp to a different type of connector and current cards can do 3 passive outputs
> 
> (1dvi and 2 dp, 3dp those dp can use passive adapters)
> 
> If you use dp to dp then you can use as many dp as the card has as the signal does not have to change
> 
> Hope this helps


So just to be brutally specific, the only reason his 295X2 is giving him grief on the 4th monitor is because he has one of them hooked up through the dual-link DVI? He would still require one active DisplayPort adapter then, right? Other than using an MST hub, obviously.


----------



## Mega Man

Depends on the resolution/refresh rate. If you need a DVI single link they are cheap. If you need a dual link DVI they are very expensive (around $100). DP to HDMI or mini DP to HDMI are cheapish, but don't go too cheap (just an FYI).

Any monitor over 3 needs an active adapter; any 3 monitors can be passive.


----------



## beasty54

Quote:


> Originally Posted by *Mega Man*
> 
> Depends on the resolution/refresh rate. If you need a dvi single link they are cheap. If you need a dual link dvi they are very expensive (around 100) dp to hdmi or minidp to hdmi are cheapish. But don't go to cheap ( just an FYI )
> 
> Just any monitor over 3 needs an active adapter any 3 monitors can be passive


I'm only running at 1920x1080 @ 60hz, so single link should be fine


----------



## D3AD PIX3L

My 295x2 arrived in the mail yesterday. It is running like a champ. So far I have played Witcher 3, GTA V and Battlefield 4 using the latest beta drivers and everything runs awesome on my eyefinity setup. The only game where I had to turn down some of the settings was Witcher 3.

Sorry for the poor pic quality, cellphone pics.









I still need to install the middle fan at the bottom of my FT02. I had removed it thinking it wouldn't fit with the new card, but it slides right under.


----------



## SAFX

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> My 295x2 arrived in the mail yesterday. It is running like a champ. So far I have played Witcher 3, GTA V and Battlefield 4 using the latest beta drivers and everything runs awesome on my eyefinity setup. The only game where I had to turn down some of the settings was Witcher 3.
> 
> Sorry for the poor pic quality, cellphone pics.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still need to install the middle fan at the bottom of my FT02. I had removed it thinking it wouldn't fit with the new card, but it slides right under.


...wish I could relive the day when mine arrived by mail,...guess I'll just have to buy another one for crossfire


----------



## wermad

Why is the XFX box so huge?!

Almost ROG motherboard size!

Missing my quads







Tube came in, but I'm missing a bunch of little things to tie everything together. Hoping by this weekend I'll be 90% there, enough to temporarily fire her back up.









----------



## TooManyAlpacas

Yeah, I have the XFX version and the box is just massive. I have no idea why.


----------



## Dagamus NM

So this is a lesson of why you should have multiple quads right?









The box is so big because these things used to have a huge price tag; it made them seem more legit. The XFX units are the best deal on a high-end GPU in recent history.


----------



## Mega Man

Quote:


> Originally Posted by *Dagamus NM*
> 
> So this is a lesson of why you should have multiple quads right?


QFT, and the reason I do.


----------



## gatygun

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> My 295x2 arrived in the mail yesterday. It is running like a champ. So far I have played Witcher 3, GTA V and Battlefield 4 using the latest beta drivers and everything runs awesome on my eyefinity setup. The only game where I had to turn down some of the settings was Witcher 3.
> 
> Sorry for the poor pic quality, cellphone pics.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still need to install the middle fan at the bottom of my FT02. I had removed it thinking it wouldn't fit with the new card, but it slides right under.


Be sure to check out the Witcher 3 thread, and make a CCC profile that reduces tessellation for HairWorks, or disable HairWorks entirely; it will give you tons of FPS.


----------



## BootPirate

That's awesome. Welcome to the club!


----------



## D3AD PIX3L

Would adding an additional pull fan on top of the radiator be overkill? With push/pull it would of course move the push fan closer to my CPU cooler. Is there anything wrong with having fans pushing air in the same direction that close together?

Sent from my Nexus 5 using Tapatalk


----------



## Intelligents

Since these are so cheap now, I'm tossing around the idea of picking up another one for quadfire. How are the performance gains? I've read a lot of year-old benchmarks where people complained about stuttering, etc. I can only assume driver updates have fixed that by now.


----------



## MIGhunter

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> My 295x2 arrived in the mail yesterday. It is running like a champ. So far I have played Witcher 3, GTA V and Battlefield 4 using the latest beta drivers and everything runs awesome on my eyefinity setup. The only game where I had to turn down some of the settings was Witcher 3.
> 
> Sorry for the poor pic quality, cellphone pics.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still need to install the middle fan at the bottom of my FT02. I had removed it thinking it wouldn't fit with the new card, but it slides right under.


How are your temps? I see you are blowing your CPU into the radiator of the GPU.


----------



## Medusa666

Guys, I never got a good reply on the optimal placement for the radiators in an Enthoo Pro chassis while running two of these bad boys in quadfire; all I got was advice against my decision to buy a second card, haha.









Anyone who can help me out, I'm an amateur here so any advice is helpful.


----------



## Intelligents

Quote:


> Originally Posted by *Medusa666*
> 
> Guys I never got a good reply on what would be the optimal placement for the radiators in an Enthoo Pro chassi while running two of these bad boys in Quadfire, all I got was advice against my decision to buy a second card haha.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone who can help me out, I'm an amateur here so any advice is helpful.


I can't help you with that case, but I support your quadfire decision







. I almost pulled the trigger on a second one for quadfire this morning until I realized one of these is eating 1440p for lunch.


----------



## BootPirate

Quote:


> Originally Posted by *Medusa666*
> 
> Guys I never got a good reply on what would be the optimal placement for the radiators in an Enthoo Pro chassi while running two of these bad boys in Quadfire, all I got was advice against my decision to buy a second card haha.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone who can help me out, I'm an amateur here so any advice is helpful.


Are you going to be liquid cooling your CPU? It looks like there are mounting points on the top of the case, you can put the radiators up there. Blowing out of course.


----------



## Medusa666

Quote:


> Originally Posted by *BootPirate*
> 
> Are you going to be liquid cooling your CPU? It looks like there are mounting points on the top of the case, you can put the radiators up there. Blowing out of course.


Yeah, I have a 120mm radiator on my CPU atm, an Enermax Liqmax 120.

So there will be a total of 3x 120mm rads in the case. The placement I'm considering is the CPU rad on top (where it currently is) and the two AMD ones in the front of the case, blowing out.
Thing is, is it bad for flow if the radiator sits below the pump?


----------



## D3AD PIX3L

Just ran Fire Strike Ultra and hit a high of 60°C. Going to jump into a game for a bit and see if it gets higher.
Quote:


> Originally Posted by *MIGhunter*
> 
> How are your temps? I see you are blowing your CPU into the radiator of the GPU.


----------



## LegacyLG

What is the case to get for putting 2x R9 295X2s in?


----------



## MIGhunter

Quote:


> Originally Posted by *LegacyLG*
> 
> what is the case to get to put 2x R9 295x2 in


Honestly, if I were going to run two of these, I'd just bite the bullet and put waterblocks on them, plumbed through one radiator. It would be more efficient and probably not too much more money.


----------



## LegacyLG

I don't know much about waterblocks, so please explain, as this will be my 1st build (but I'm very good at working things out; as long as I have the details, it will be fine).


----------



## LegacyLG

I am thinking maybe 390X quadfire with waterblocks, if I can work out how to do it and if it's better than R9 295X2 crossfire (quadfire).


----------



## D3AD PIX3L

After playing a bunch of games and running 3DMark, Heaven and Valley last night for a few hours, temps went up to 71°C.

Also, I usually game with a headset on and just noticed that when my GPU is under load, my brand-new PSU from an RMA (Corsair AX860i) makes a constant buzzing sound. In some games it is more intense than others, but it is an electrical buzz, not a fan issue. The lame thing is that the PSU I RMA'd (an AX850) was returned because it was making a high-pitched noise under load. From what I've read on other forums and heard on YouTube, it seems to be a common issue with Corsair PSUs, and in some cases it goes away. Wondering if I should give it a week or so and, if the noise continues, request another RMA.


----------



## Alex132

Had it on my HX850 (2009). It never went away; I ended up selling it and got a Cooler Master V1000. Frickin' love this unit, silent and very, very cool. Running at 230V means the EU Platinum rating is basically the same as Titanium for the USA.
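As a side note on that EU/US rating comparison: the published 80 Plus thresholds (approximate values recalled from the public tables, not from this thread; worth double-checking against the current spec) do line up that way:

```python
# Approximate 80 Plus efficiency thresholds (percent) at 20/50/100% load.
# These figures are assumptions recalled from the public 80 Plus tables,
# not taken from this thread; verify before relying on them.
thresholds = {
    ("115V", "Gold"):        (87, 90, 87),
    ("115V", "Platinum"):    (90, 92, 89),
    ("115V", "Titanium"):    (92, 94, 90),
    ("230V EU", "Gold"):     (90, 92, 89),
    ("230V EU", "Platinum"): (92, 94, 90),
}

# EU Platinum matches US Titanium at the 20/50/100% load points:
assert thresholds[("230V EU", "Platinum")] == thresholds[("115V", "Titanium")]
```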


----------



## rakesh27

Quote:


> Originally Posted by *LegacyLG*
> 
> what is the case to get to put 2x R9 295x2 in


If you're in the UK, get the Corsair 900D, one of the best cases around. Or, if you can afford to import one from the States (the one I'm going to suggest only comes from there), get a CaseLabs; that's the ultimate in cases.


----------



## Orivaa

I wouldn't get the 900D. Crap build quality.


----------



## Intelligents

You can honestly fit 2 R9 295x2s in a lot of cases. You don't need a full tower. I could fit 2 in my Define R5 for sure.


----------



## Orivaa

Something Fractal would be a good choice.


----------



## F4ze0ne

I have the Arc Midi R1 and can fit 2 of them, but it's a little tight near the HDD cage.

The Arc Midi R2 has a removable HDD cage, which solves the R1 spacing issue.


----------



## BootPirate

I use an Antec Lanboy and I can fit a second one. I'm actually thinking of doing that.
Antec doesn't make these anymore, though.


----------



## D3AD PIX3L

Quote:


> Originally Posted by *Alex132*
> 
> Had it on my HX850 (2009). Didn't ever go away, ended up selling it and got a Cooler Master V1000. Frickin' love this unit, silent and very, very cool. 230v means that the platinum EU rating basically is the same as Titanium for USA.


Tempting to upgrade to another PSU... You sold your noise maker to someone else? Poor soul.







I am tempted to request another RMA, but worry it will still have the noise or even worse a louder noise. I might just keep this one in hopes the noise goes away.


----------



## kayan

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> Tempting to upgrade to another PSU... You sold your noise maker to someone else? Poor soul.
> 
> 
> 
> 
> 
> 
> 
> I am tempted to request another RMA, but worry it will still have the noise or even worse a louder noise. I might just keep this one in hopes the noise goes away.


Something you could do is request another RMA and then sell the replacement unit without opening it. Just get another PSU in the meantime. I can vouch for the CM V1000; mine is super quiet and has been reliable for a year and a half so far.


----------



## Alex132

CM V1000 = Seasonic Platinum 1000W unit. Exact same internals, except the CM unit has a better fan in it, and it's cheaper!

I suspect the CM unit is rated 80 Plus Gold instead of 80 Plus Platinum so it doesn't compete with their 1200W Platinum PSU.

If you can get an EVGA G2/P2 1000W unit for cheaper/the same, I'd go for that first, then the CM V1000.


----------



## joeh4384

Quote:


> Originally Posted by *Intelligents*
> 
> You can honestly fit 2 R9 295x2s in a lot of cases. You don't need a full tower. I could fit 2 in my Define R5 for sure.


I wouldn't, unless you can add a side fan or extra spacing. Mine VRM-throttled when I ran my 290X below it, and that was just plain crossfire with the 290X doing nothing.


----------



## D3AD PIX3L

Quote:


> Originally Posted by *Alex132*
> 
> CM V1000 = SeaSonic Platinum 1000w unit. Exact same internals except the CM unit has a better fan in it and cheaper!
> 
> I suspect the CM unit is rated as Gold 80+ instead of Platinum 80+ to not compete with their 1200w Platinum PSU.
> 
> If you can get an EVGA G2 / P2 1000w unit for cheaper/same I'd go for that first, and then the CM V1000.


Thanks for all your help. I requested another RMA; going to see how it goes. It sounds like if there is still an issue they will refund me, and then I can put that money toward another PSU. I am very impressed with Corsair's customer service.

Sent from my Nexus 5 using Tapatalk


----------



## BootPirate

So I've been looking at some benchmarks and my results don't add up. One example I can remember off the top of my head is Battlefield 4: it's running at 0.5-1 fps on ultra at a resolution of 4800x1200.
There were a few other games that don't seem to be running as smoothly as they should. I can get the numbers when I finish work.

Games like:
Far Cry 3
Star Citizen
Arma 3 (I haven't seen any benchmarks, but it seems like it should be running better than it is for a purebred PC game.)
There were a few more I can't remember right now.

I have the latest drivers, and I made sure crossfire is on via the registry. I've noticed the RAM on the card gets damn hot.
I took a picture of the temps and clock speeds while running some benchmarks and games. They looked okay; I'll post them after I finish work.


----------



## AlphaBravo

I've been an R9 295X2 owner for a couple of months, and I am becoming worried about the sounds my cooler/radiator is making. A couple of weeks ago, after an hour-long Battlefield 3 gaming session, I could hear liquid circulating through the pumps and radiator. The noise was gone the next day, but I am starting to notice it occurring more frequently now. I thought it was only happening when the liquid had been hot (like after a long gaming session), but I just turned on my computer right now (it had been off all night), and I can hear the liquid circulating through the system. It has been several minutes, and it has not yet gone away. I have several AIO CPU coolers in my other systems and have never heard liquid sounds like this. Should I be concerned?


----------



## kayan

Quote:


> Originally Posted by *AlphaBravo*
> 
> I been a R9 295X2 owner for a couple of months, and I am becoming worried about the sounds that my cooler/radiator is making. A couple of weeks ago, after an hour long Battlefield 3 gaming session, I could hear liquid circulating though the pumps and radiator. The noise was gone the next day. But I am starting to notice the noise occurring more frequently now. I thought it was only happening when the liquid had been hot (like after a long gaming session), but I just turned on my computer right now (which had been off all night), and I can hear the liquid circulating though the system. It has been several minutes, and it has not yet gone away. I have several AIO cpu coolers in my other systems, and have never heard liquid sounds like this. Should I be concerned?


Strange question in response, but do you move your case around much? Also, is your rad above the GPU? Is the sound coming from your pump or the tubing? If it's the pump, then maybe; if it's the tubing, it's probably just an air bubble. You could try unmounting your rad, moving it around a little, and then remounting it.

I have a custom loop, and have used AIO's before, and the noise is usually the worst on system boot. It should calm down after a bit.

For peace of mind check the tubing ports into the rad and see if there's any condensation at the joints.


----------



## SAFX

Quote:


> Originally Posted by *BootPirate*
> 
> I have the latest drivers, I made sure crossfire is on in reg-edit. I've noticed the ram gets damn hot (on the card)


Confirm crossfire is _really_ working and set " Show AMD Crossfire status Icon" in CCC (you can do it from sys tray icon too)
Is ULPS disabled?


----------



## SAFX

I'm thinking about going _all in_ with a custom water loop; any advice on reliable waterblocks? EK has a nice selection, but XSPC looks damn sweet, and the LEDs are nice too.

Just curious, thanks


----------



## kayan

I've used XSPC before for CPUs (my old AMD and my current X99) and they are solid. I had XSPC blocks on my 2x 290Xs before I sold them to get the 295X2; they were solid as well, and they look good. Just be careful, the cards are heavy once the blocks are installed. And yes, the LEDs look good.


----------



## SAFX

Quote:


> Originally Posted by *kayan*
> 
> I've used XSPC before, for CPU's (my old AMD and my current x99) and they are solid. I had 2x 290x's XSPC blocks before I sold it to get the 295x2. They were solid as well. They look good. Just be careful, they are heavy once installed. And yes, the LED's look good.


Thanks, kayan. I read about weight issues with XSPC... are there no reinforced L-brackets that can fix that?


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> Thanks, kayan, I read about weight issues with XSPC, ...no reinforced L-brackets that can fix that?


I didn't have any issues with the 2x 290 blocks + backplates (in a Corsair 760T), but you can buy video card jacks, lol. I think PowerColor makes one. I have a horizontal motherboard layout now, so I have no use for one, but I've seen pics on forums of blocked cards tearing the whole PCI-E slot off before. Better safe than sorry.


----------



## BootPirate

Quote:


> Originally Posted by *SAFX*
> 
> Confirm crossfire is _really_ working and set " Show AMD Crossfire status Icon" in CCC (you can do it from sys tray icon too)
> Is ULPS disabled?


Confirmed.

I have ULPS disabled.
Quote:


> Originally Posted by *SAFX*
> 
> I'm thinking about going _all in_ with a custom water loop, any advice on reliable waterblocks? EK has nice selection, but XSPC looks damn sweet, leds are nice too.
> 
> Just curious, thanks


A friend of mine used the kryographics Vesuvius while he still had the card. He seemed to like it very much.


----------



## SAFX

Quote:


> Originally Posted by *kayan*
> 
> I think powercolor makes one.


Good to know, thanks


----------



## SAFX

Quote:


> Originally Posted by *BootPirate*
> 
> A friend of mine used the kryographics Vesuvius while he still had the card. He seemed to like it very much.


Thanks!









Post your 295x2 benchmarks and temps.

btw, here's something interesting I found for GTA V performance tweaks; I have not tried it yet, but it may be useful for your troubles.

Sorry, never mind that link; I thought it was for crossfire not working, but it's actually for crossfire stutter.


----------



## BootPirate

No problem.

Thanks for that link. I've done a few of those things. I'm going to change a small number of things at a time and see which one does the trick.
I'm going to test each game and write down the GPU usage and temps for each game.

I'm still wondering why it isn't running the way it should be. I've seen 4K tests that were better than my results. :\

BTW, it wasn't BF3, it was BF4 with the 0.5-1 fps.


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> Good to know, thanks


http://www.powercolor.com/us/product_Accessories.asp here it is.


----------



## SAFX

Thanks again, kayan!


----------



## ViRuS2k

Has anyone else noticed that our 295X2s don't have a UEFI-compatible BIOS? Check the latest CPU-Z and it shows we have no UEFI BIOS. :/

It's going to be a pain when Windows 10 comes, because I wanted to use UEFI mode.

Also, how do you guys get more voltage than the +100 that Afterburner allows? I'm on custom watercooling, so it would help to have more voltage so I can clock higher. Is there a method to unlock higher voltages with Afterburner? :/

Also, are there any BIOSes out there with defaults higher than 1030/1300? :/

And I don't think there are any R9 BIOS tweaking tools out there to tweak our own BIOSes yet either, and probably never will be, lol.


----------



## SAFX

Quote:


> Originally Posted by *ViRuS2k*
> 
> also how do you guys get more voltage than 100+ that afterburner sets im on watercooling custom so would help to have more volage so i can clock higher is there a method to unlock higher voltages with afterburner :/
> 
> also is there any bios`s out there thats higher than 1030/1300 at default :/
> 
> and i dont think there is any R9 bios tweaking tools out there to tweak our own bios`s yet either and probably never will lol


Not sure on voltage, but you can overclock the card with MSI Afterburner.
Don't waste your time flashing the BIOS; it's tedious and not worth the OC gains. Just use Afterburner.


----------



## xer0h0ur

Wouldn't that be a motherboard related issue not Windows 10 related? With respect to UEFI or non-UEFI vBIOS that is.


----------



## BootPirate

If you go into the Afterburner settings you will find an option to exceed AMD OC limits. It's what AMD considers to be safe, and anything above those limits could potentially damage your card.


----------



## AlphaBravo

Quote:


> Originally Posted by *kayan*
> 
> Strange response question, but do you move your case around much? Also is your rad above the GPU? Is the sound coming from your pump or the tubing? If it's the pump, then maybe, if it's tubing probably just an air bubble. You could try unmounting your rad moving it around a little and then remounting it.
> 
> I have a custom loop, and have used AIO's before, and the noise is usually the worst on system boot. It should calm down after a bit.
> 
> For peace of mind check the tubing ports into the rad and see if there's any condensation at the joints.


I do not move my case around at all. The rad is mounted above the GPU. Originally, I thought the sounds were coming from the pumps inside the card, but recently they appeared to be coming from the radiator area. It was difficult to tell. I have had the computer on today, and no liquid sounds. Very strange that it comes and goes without me doing anything. No condensation on the tubing ports into the radiator.


----------



## Without Wax

Does anyone know if you need to have afterburner launching with windows and running all the time for the undervolting to work? Or does afterburner simply do a regedit of a sort and set the voltage and then no need for the program to run after that?


----------



## TooManyAlpacas

Quote:


> Originally Posted by *AlphaBravo*
> 
> I do not move my case around at all. The rad is mounted above the GPU. Originally, I though the sounds were coming from the pumps inside of the card, but recently, the sounds appeared to be coming from the radiator area. It was difficult to tell. I have had the computer on today, and no liquid sounds. Very strange that it is coming and going without me doing anything. No condensation on the tubing ports into the radiator.


My R9 295X2 does this at times also, and I do not move my case either. It usually stops a few minutes after Windows has booted. I have never really thought it was that big of a deal as long as the GPU stays cool and water does not leak.


----------



## SAFX

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> My R9 295x2 does this at times also and I do not move my case either. It usually stopes a few minutes after windows has booted. I have never really thought it was that big of a deal as long as the GPU stays cool and water does not leak.


My card is making a low-pitched rattling sound; it's not loud, but definitely noticeable and very annoying, so I wrapped it in duct tape, problem solved. It's funny because now it looks like a kilo of Colombian coke









(j/k, it's not gift wrapped, but the noise issue is real







)


----------



## TooManyAlpacas

Quote:


> Originally Posted by *SAFX*
> 
> My card is making a low pitch rattling sound; it's not loud, but definitely noticeable and very annoying, so I wrapped it in duct tape, problem solved. It's funny because now it looks like a kilo of Columbian coke
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (j/k, it's not gift wrapped, but the noise issue is real
> 
> 
> 
> 
> 
> 
> 
> )


Mine does not really have a rattling noise issue; it sounds like water trickling down rocks. It has only done it about 2 or 3 times since I have had the card (about 5 months).


----------



## SAFX

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> Mine does not really have a rattling noise issue it sounds like water trickling down rocks. It has only done it about 2 or 3 times since I have had the card (about 5 months).


It's a machine, it's normal, enjoy the top-of-the line card, don't obsess over details unless necessary









Seriously, for the past month, ever since I completed my build, I've been tweaking, benching, modding, and testing different cooling setups; I purchased nearly every fan from Micro Center, ordered more from Europe, bought a laser-guided infrared thermometer for testing VRM temps, and documented everything (I should do a write-up on my findings, very interesting stuff)... then I stopped and asked, "What the hell just happened? Where did May go?"







....I've taken a chill pill, now I'm just enjoying my rig, you should do the same, otherwise, what's the point, right?









Have a nice evening


----------



## BootPirate

Oh man. I didn't go ballistic like you, but I know exactly what you're talking about. Though this underperformance thing is bothering me a little. Like a thorn in my side. Or a really annoying tag on a new shirt.


----------



## veaseomat

My Koolance waterblocks arrived today! I can't wait to rebuild my rig after work tonight! I'm moving it all into a new case as well. I'll probably do a build video with my GoPro.


----------



## SAFX

Quote:


> Originally Posted by *veaseomat*
> 
> 
> 
> My koolance waterblocks arrived today! I cant wait to rebuild my rig after work tonight! I'm moving it all into a new case as well. I'll probably do a build video with my gopro.


Nice!! what made you go with Koolance? ...and what case?


----------



## Alex132

Anyone have an idea why my clocks will fluctuate like this when I overclock? (this is while folding btw)



VRMs aren't that hot, I can touch them for about 3-4 seconds.


----------



## BradleyW

Quote:


> Originally Posted by *Alex132*
> 
> Anyone have an idea why my clocks will fluctuate like this when I overclock? (this is while folding btw)
> 
> 
> 
> VRMs aren't that hot, I can touch them for about 3-4 seconds.


Disable "Power Play".


----------



## Alex132

Does PowerPlay actually affect heat/power consumption?
https://en.wikipedia.org/wiki/AMD_PowerPlay


----------



## GreenGoblinGHz

My Asus AMD 295X2. Koolance block. Modified XSPC backplate.
High load temps 48°C - 52°C (a 15-minute FurMark burn got it to 52°C).

The stock AIO kept pushing near 75°C, so this block made a huge difference.
The backplate is passive, but I do have quite good quality thermal pads on it.


----------



## BradleyW

Quote:


> Originally Posted by *Alex132*
> 
> Does PowerPlay actually affect heat/power consumption?
> https://en.wikipedia.org/wiki/AMD_PowerPlay


Without PowerPlay, you'll never downclock, and that will consume more power.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Does PowerPlay actually affect heat/power consumption?
> https://en.wikipedia.org/wiki/AMD_PowerPlay
> 
> 
> 
> Without power play, you'll never downclock. That will consume more power.
Click to expand...

^ that


----------



## veaseomat

I went with Koolance because they were only $150 each on Amazon, lol; $300 for both blocks versus $400... hmmm. I'm also using the Rosewill Rise case. I chose it again for price, for its simple looks that match my phase-change cooler a little better, and because it's a lot smaller than the Rosewill Thor V2 case I moved everything out of.

I did a build video and have tons of footage on my GoPro. I will be editing the film over the next few days and will post a link when I've finished.

It's up and running now with my LD phase-change cooler attached; I'm using it right now.









**edit** Oh, and all GPUs are reading 36°C idle right now, with ambients of 74°F.
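(For reference, that 74°F ambient is about 23°C; a trivial conversion sketch:)

```python
def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(round(f_to_c(74), 1))  # the ambient quoted above, ~23.3°C
```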


----------



## boredmug

Quote:


> Originally Posted by *veaseomat*
> 
> 
> 
> 
> 
> I went with koolance because they were only 150$ on amazon lol. 300$ for both blocks or 400... hmmm. Also using the rosewill rise case, I chose this case again for price, it's simple looks to match my phase change cooler a little better, and it's alot smaller than my rosewill thor v2 case i moved it out of.
> 
> I did a build video and have tons of footage on my gopro, I will be editing the film over the next few days, will post a link when I've finished.
> 
> It's up and running now with my LD phase change cooler attached, I'm using it right now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> **edit** oh and all gpu's are reading 36 celsius idle right now
> 
> 
> 
> 
> 
> 
> 
> ambiends of 74 fahrenheit.


Just curious what your load temps are like?


----------



## SAFX

Why is it so difficult to find XSPC full water block for this card? I found one, that's it, should I grab it?

UPDATE
Found some more!


----------



## xer0h0ur

I stopped using PowerPlay, so now my "idle" temps are higher, even though in reality it's never really idle. It's constantly clocked up now.


----------



## veaseomat

Quote:


> Originally Posted by *boredmug*
> 
> Just curious what your load temps are like?


40 idle, 50 load. almost spot on. http://www.3dmark.com/3dm/7375405


----------



## boredmug

Quote:


> Originally Posted by *veaseomat*
> 
> 40 idle, 50 load. almost spot on. http://www.3dmark.com/3dm/7375405


Nice. What kind of rads? You get those temps with a 9590?


----------



## electro2u

Quote:


> Originally Posted by *ViRuS2k*
> 
> any of you guys noticed that our 295x2`s are not UEFI compatible bios
> check latest CPUZ and it shows we have no UEFI bios :/
> 
> going to be a pisstake when windows 10 comes cause i wanted to use UEFI mode.
> 
> also how do you guys get more voltage than 100+ that afterburner sets im on watercooling custom so would help to have more volage so i can clock higher is there a method to unlock higher voltages with afterburner :/
> 
> also is there any bios`s out there thats higher than 1030/1300 at default :/
> 
> and i dont think there is any R9 bios tweaking tools out there to tweak our own bios`s yet either and probably never will lol


MSI sent me a UEFI BIOS for mine upon request, but I couldn't get it working. I tried a few different times; I could flash the Sapphire OC BIOS just fine, but in the end I bricked one of the BIOS sets (the little switch selects between two independent sets, each with a master and a slave BIOS). I needed to boot with another video card in order to unbrick it, which was a pain in the ass.


----------



## veaseomat

Quote:


> Originally Posted by *boredmug*
> 
> Nice. What kind of rads? You get those temps with a 9590?


My 9590 is under an LD PC-V2 phase-change cooler, that's why. The rads are nothing special: two XSPC 240mm and a Magicool 120mm. I also got the wrong fans for them, Corsair AF Quiet Edition; I should have got Corsair SP fans.


----------



## cennis

I am looking to let go of my 295x2 and aquacomputer waterblock and active backplate if anyone is interested, pm me


----------



## Intelligents

Quote:


> Originally Posted by *GreenGoblinGHz*
> 
> My asus amd 295x2. Koolance block. Modified XSPC backplate.
> High load temps 48'c - 52'c (furmark 15 min burn got it to 52'c ).
> 
> Stock aio kept on pushing near 75'c, so this block made a huge difference.
> Backplate is passive but I do got quite good quality thermal pads on it.


What thermal pads do you recommend? I need to pick up a new set before a slap my EK block on mine.


----------



## GreenGoblinGHz

I suggest Fujipoly's Extreme pads. I'm personally using them and really have nothing bad to say about them (compared to some low-quality thermal pads). The link below is to FrozenCPU.

http://www.frozencpu.com/cat/l3/g8/c487/s1797/list/p1/Thermal_Interface-Thermal_Pads_Tape-Ultra_Extreme_Thermal_Pads-Page1.html

Sincerely,
Druizza


----------



## Intelligents

Quote:


> Originally Posted by *GreenGoblinGHz*
> 
> I suggest Fujipoly's Extreme pads. Personally usin em, and rly got nothing bad to say about em (compared to some low quality thermal pads..)
> Link is to Frozencpu.
> 
> http://www.frozencpu.com/cat/l3/g8/c487/s1797/list/p1/Thermal_Interface-Thermal_Pads_Tape-Ultra_Extreme_Thermal_Pads-Page1.html
> 
> Sincerely :
> Druizza


Great, I will definitely check them out. Thank you much.


----------



## gatygun

$120 for thermal pad tape


----------



## Intelligents

Quote:


> Originally Posted by *gatygun*
> 
> 120 dollars for a thermal pads tape


I saw that, but I think it's for an entire sheet or something.


----------



## xer0h0ur

For what it's worth, I can run 1700MHz clocks with the Fujipoly Ultra Extreme pads on my 295X2. Those pads have a godly thermal conductivity rating.


----------



## gatygun

Quote:


> Originally Posted by *Intelligents*
> 
> I saw that, but I think it's for an entire sheet or something.


Ah, you are right, there are smaller ones. Was already thinking god damn lol


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> For what its worth, I can run 1700MHz clocks with the Fujipoly Ultra Extreme pads on my 295X2. Those pads have a godly thermal conductivity rating.


WOW!

What's the performance advantage (%) over stock at that speed; did you benchmark?


----------



## electro2u

That can't be right...? Can it? Xer0, is that a typo? I capped out at 1250 even under water with Fujipoly pads. It's a typo.

Even a 290x has only gone to around 1550MHz on LN2 at -140C lol


Crazy stuff.


----------



## xer0h0ur

Whoa whoa whoa. I was talking about the vRAM. I didn't think anyone used fuji pads on the GPU. That would be nuts.


----------



## Alex132

Pretty sure he means 1700mhz on memory.


----------



## electro2u

Quote:


> Originally Posted by *xer0h0ur*
> 
> Whoa whoa whoa. I was talking about the vRAM. I didn't think anyone used fuji pads on the GPU. That would be nuts.


Oh haha. I was thinking VRMs.








I used the Gelid extreme on the die and RAM chips because Aquacomputer.


----------



## Elmy

I use the Fujipoly pads too... got about a 5c-7c improvement over the EK thermal pads.


----------



## xer0h0ur

Aye, not all blocks call for using pads.


----------



## electro2u

My single stock 295x2 with an Intel 4820k (4 core proc)


QuadCF with 2x295x2 and an FX proc. presumably at 5Ghz 1120/1625MHz


Someone's 2x295x2 QuadFire rig with a 4790k 1100/1500Mhz









Nacho's 2x 290x with some sort of Intel Processor, I forget. 1400/1750


----------



## Sgt Bilko

Quote:


> Originally Posted by *electro2u*
> 
> My single stock 295x2 with an Intel 4820k (4 core proc)
> 
> 
> QuadCF with 2x295x2 and an FX proc. presumably at 5Ghz 1120/1625MHz
> 
> 
> Someone's 2x295x2 QuadFire rig with a 4790k 1100/1500Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nacho's 2x 290x with some sort of Intel Processor, I forget. 1400/1750


Can't use Firestrike to gauge performance with an FX CPU, they get gimped in the combined test. That said, I managed 13,216 with a 295x2 + CF 290s.


----------



## BootPirate

Quote:


> Originally Posted by *electro2u*
> 
> My single stock 295x2 with an Intel 4820k (4 core proc)
> 
> 
> QuadCF with 2x295x2 and an FX proc. presumably at 5Ghz 1120/1625MHz
> 
> 
> Someone's 2x295x2 QuadFire rig with a 4790k 1100/1500Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nacho's 2x 290x with some sort of Intel Processor, I forget. 1400/1750


How? I only get 7,469 on Extreme. Stock 295x2 and Intel 4790


----------



## electro2u

Quote:


> Originally Posted by *BootPirate*
> 
> How? I only get 7,469 on Extreme. Stock 295x2 and Intel 4790


I don't know







There's an optimal driver setting for 3DMark in CCC which is legal for valid scores, but it doesn't make a huge difference. Is your tessellation on, maybe? Throttling? I used to throttle mine easily on stock cooling, but water helped a ton. I don't have one anymore.

My poor single 980 hits right around 7,000 on Firestrike Extreme.
http://www.3dmark.com/3dm/7391980

I had 2 of them and sold one. In SLI I recorded a 13,718 at some point. Edit: sorry, that's wrong. 11,332 in Extreme.
http://www.3dmark.com/fs/4142969
http://www.3dmark.com/fs/4142969


----------



## GreenGoblinGHz

My 3DMarks (basic, Extreme and Ultra). Using an R9 295x2 + R9 290 (flashed with the Asus PT1 VGA BIOS) with an i5-4690k and an i7-4790k. Also a few test runs with FX chips and 990FX mobos.

Mobos (Intel chips: i5-4690k and i7-4790k): Asus Z97-PRO and ASRock Z97 Extreme9.

Mobos (AMD chips: FX-8350 and FX-9590): Asus Crosshair Formula-Z, Asus Sabertooth (gen2 and gen3), ASRock Extreme9 and Extreme4.

Some results :









Imo the gap between the FX-9590 + R9 295x2 and the i5-4690k/i7-4790k is just ridiculous.
Nowadays I prefer Heaven, Valley and RealBench. 3DMark has trouble from time to time, and Nvidia even got caught altering their driver to score higher in 3DMark benches. Not long ago Futuremark announced that AMD GPUs score lower than they should, and that they are fixing the issue...
I really like running benchmarks to see what my rig can really do when I push it. 3DMark was one of my favorite benchmarks; Ultra and Extreme Firestrike are still fun. Normal Firestrike is a bit boring with a 295x2 and up...

*Still, FX chip vs Intel chip, same GPUs:
13k with the AMD chip, 21.5k with the Intel chip.*


----------



## kayan

Quote:


> Originally Posted by *GreenGoblinGHz*
> 
> My 3DMarks (basic, Extreme and Ultra). Using an R9 295x2 + R9 290 (flashed with the Asus PT1 VGA BIOS) with an i5-4690k and an i7-4790k. Also a few test runs with FX chips and 990FX mobos.
> 
> Imo the gap between the FX-9590 + R9 295x2 and the i5-4690k/i7-4790k is just ridiculous.
> Nowadays I prefer Heaven, Valley and RealBench. 3DMark has trouble from time to time, and Nvidia even got caught altering their driver to score higher in 3DMark benches. Not long ago Futuremark announced that AMD GPUs score lower than they should, and that they are fixing the issue...
> I really like running benchmarks to see what my rig can really do when I push it. 3DMark was one of my favorite benchmarks; Ultra and Extreme Firestrike are still fun. Normal Firestrike is a bit boring with a 295x2 and up...
> 
> *Still, FX chip vs Intel chip, same GPUs:
> 13k with the AMD chip, 21.5k with the Intel chip.*


I had the same thing: the difference between my 9370 running at 5GHz and the current 5820 running at stock was absolutely ridiculous in 3DMark. I don't remember the exact number, but it was a lot of points. I wonder if, because of the way W10 is being designed, the FX octa-cores will start to provide even more bang for the buck.

On a second note, who is selling their 295x2 for a Fury or Fury X?


----------



## Orivaa

Quote:


> Originally Posted by *kayan*
> 
> On a second note, who is selling their 295x2 for a Fury or Fury X?


If the performance eclipses the 295x2 and I can procure the money, sure.


----------



## kayan

Quote:


> Originally Posted by *Orivaa*
> 
> If the performance eclipses the 295x2 and I can procure the money, sure.


I believe it's something like up to 60% more performance than a 290x. So, better in games where the xfire profiles suck or are non-existent, but worse if the game scales well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kayan*
> 
> I believe it's something like up to 60% more performance than a 290x. so, better than in games where the xfire profiles suck/non-existent, but worse if the game scales well.


295x2 will be faster than the Fury X IN EVERY SINGLE SCENARIO

Keep your cards, or sell them to me for $500


----------



## GreenGoblinGHz

I'm also curious... how many of you are gonna hop on the Fury X train once it comes? Or, even better, the upcoming dual GPU: the Radeon R9 Fury X2?
Personally... I think I'm hopping to HBM memory. The R9 Fury X2 will be great, because imo the R9 295X2 has been/is great. Under an aftermarket block, imo, it's an easy 15-30% overall performance hop.
I'm a lazy scroller, so posting one pic. Added EK's UV blue coolant. Backplate is a modified XSPC with Fujipoly Extreme pads. Full-cover block is Koolance. CPU is under an EK Supremacy EVO. Gonna shorten the tubes and add one more 360 rad to the loop (also got an R9 290 with an EK block sitting around doing nothing atm; the 295x2 is plenty).


----------



## joeh4384

I love shiny new PC parts, but I can't justify moving from a 295x2 to a Fury or Fury X2 at this moment. The only performance upgrade would be two Furies, which would be epic for sure, but expensive. Plus, without DVI I would need to replace one of my Korean monitors too. I do want a 1440p 144Hz FreeSync monitor as well, so maybe I will get one of those first and hop on the Fury train when prices eventually fall.


----------



## Orivaa

I don't think the HBM argument is good enough for 2 Furies, as we are getting HBM2 next year. Unless DirectX 12 makes Crossfire absolutely amazing.


----------



## Intelligents

I'm personally just waiting for 295x2s to take a bit of a price dip with these new releases so I can get a second for quadfire


----------



## xer0h0ur

Simply put, the card lacks too much for me to change out my hardware. No HDMI 2.0, no DP 1.3, no DVI port, HBM1 limiting it to 4GB, and to top it off AMD went full ****** by apparently locking AIBs out of modifying the Fury X's reference design. So if that is true, no air-cooled Fury X and no custom-I/O versions adding a DVI port.

Frankly the first card to use DP 1.3 for 4K @ 120Hz is going to get my wallet spreading wide for it.


----------



## kayan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 295x2 will be faster than the Fury X IN EVERY SINGLE SCENARIO


Except it won't be in every scenario. There are games where xfire just flat out doesn't work.


----------



## veaseomat

I threw together a video of my PC rebuild with the Koolance waterblocks. Just uploaded it, so it might be a few hours until the full 1080p/60fps is available.


----------



## Roxycon

does this card support DP 1.2?


----------



## xer0h0ur

The 295X2? It's DP 1.2a.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kayan*
> 
> Except it won't be in every scenario. There are games where xfire just flat out doesn't work.


I don't personally have any games where CF doesn't work except for Skyrim, and the one card blows the game away anyways.

All the latest AAA titles benefit from CF, and more than just a little bit....


----------



## gatygun

They should make a card with 3x Fury X chips, now that the card is smaller. That's gonna be something to upgrade towards.

I think atm the 295x2 is still the sweet spot; I would sure as hell wait for the next revision and go for that.


----------



## xer0h0ur

LMAO, a triple GPU card would be some sort of atrocity.


----------



## remedy1978

First off, I want to thank this forum for the advice. I switched my R295X2 from a rear exhaust sucking hot air out of the case to a front intake with push/pull. Temps dropped from 74 to 64. I also think my case (NZXT H440) wasn't helping. Once I removed the front fascia, that helped dramatically too. I have a Corsair 450D on order, so I will be returning the H440.

On another note, I was playing Crysis 3 and noticed that even though the temps were around 64, the 1st GPU would clock up and down from 1018MHz to around 992 or so. The 2nd GPU stayed constant. This had nothing to do with the load, as the clock speed changed independently of it. Is this behavior normal? Or should I see a consistent 1018MHz on both GPUs?


----------



## TooManyAlpacas

Quote:


> Originally Posted by *remedy1978*
> 
> First off, I want to thank this forum for their advice. I switched my R295X2 from a rear exhaust sucking hot air out of the case to a front intake with push/pull. Temps dropped from 74 to 64. I also think my case (NZXT H440) isn't helping either. Once I removed the front fascia, that helped dramatically also. I have a Corsair 450D on order so I will be returning the H440.
> 
> On another note, I was playing Crysis 3 and notice even though the temps were around 64, the 1st GPU would clock up and down from 1018mhz to around 992 or so. The 2nd GPU stayed constant. This had nothing to do with the load as the clock speed changed independent of the load. Is this behavior normal? Or should I see a consistant 1018mhz on both GPUs?


I also have a H440 and I have my CPU 280mm rad in the front with the top intaking air. My R9 295x2 is in push pull exhausting air and I'm getting around the same temps. Push/Pull helps a lot


----------



## gatygun

Quote:


> Originally Posted by *xer0h0ur*
> 
> LMAO, a triple GPU card would be some sort of atrocity.


Tried to make one rofl


----------



## rakesh27

Guys,

I was checking Overclockers and I see the 390X is out. When I checked the specs, it looks the same as the 290X: same 2816 stream processors; only the memory clock and core clock have a little boost.

Am I missing something here? Since I own a 295x2 myself, with a 290X as well, is there any point in upgrading to 2x 390X?

Seems like they are rebranded 290X 8GB versions.

Can someone please enlighten me, thanks...


----------



## kayan

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> I was checking Overclockers and i see the 390x is out, when i checked the specs it looks the same as the 290x, same stream processors 2816, only the memory clock and cpu clock has alittle boost.
> 
> Am i missing something here, since i or anyone owns a 295x2 my self with a 290x as well, is there any point of upgrading to 2 x 390x...
> 
> Seems like they are like rebranded 290x 8gb versions.
> 
> Can somone please enlighten me, thanks...


They are rebranded 290x's with more vram. If you want an "upgrade" you'll need to wait until next week for the Fury. Or next month for the Fury X.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> I was checking Overclockers and i see the 390x is out, when i checked the specs it looks the same as the 290x, same stream processors 2816, only the memory clock and cpu clock has alittle boost.
> 
> Am i missing something here, since i or anyone owns a 295x2 my self with a 290x as well, is there any point of upgrading to 2 x 390x...
> 
> Seems like they are like rebranded 290x 8gb versions.
> 
> Can somone please enlighten me, thanks...


They are a refresh of the 290/x series.

More vram, higher clock speeds and some small refinements made here and there.

They aren't worth moving to from a 295x2 or 290x; we have to wait another week until Fury becomes available.


----------



## joeh4384

I want them to release the new 300 series drivers for 200 series cards.


----------



## joeh4384

Quote:


> Originally Posted by *remedy1978*
> 
> First off, I want to thank this forum for their advice. I switched my R295X2 from a rear exhaust sucking hot air out of the case to a front intake with push/pull. Temps dropped from 74 to 64. I also think my case (NZXT H440) isn't helping either. Once I removed the front fascia, that helped dramatically also. I have a Corsair 450D on order so I will be returning the H440.
> 
> On another note, I was playing Crysis 3 and notice even though the temps were around 64, the 1st GPU would clock up and down from 1018mhz to around 992 or so. The 2nd GPU stayed constant. This had nothing to do with the load as the clock speed changed independent of the load. Is this behavior normal? Or should I see a consistant 1018mhz on both GPUs?


You should see a constant clock. Download afterburner and increase the power limit to +25. I had to do this or my core clocks would fluctuate.

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


----------



## Orivaa

Quote:


> Originally Posted by *kayan*
> 
> They are rebranded 290x with more vram. If you want an"upgrade" you'll need to wait until next week for the Fury. Or next month for the Fury X.


That's the other way around. The Fury X will be released next week, the standard Fury next month.


----------



## rakesh27

Thanks guys, phew, for a minute there I was disappointed and about to jump the gun and make a purchase...

One thing though: are the 295x2 and 290x fully DX12 compliant?

I'll wait for the Fury to come out, wait a while until they settle in and we can see benchmarks, then either make the purchase or see what Nvidia has...

I hope the 200 series is DX12 compliant.

Thanks again guys, all the best


----------



## kayan

Whoops, thanks for correcting that. Was doing from a sleep deprived memory while at physical therapy this AM. Haha.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rakesh27*
> 
> Thanks guys, phew for a min there i was disappointed or jump the bullet and make a purchase...
> 
> One thing though are the 295x2 and 290x fully DX12 compliant...
> 
> i wait for the fury to come out, wait a while until the settle in and we can see benchmarks then either make the purchase or see what Nvidia has...
> 
> I hope the 2 series are DX12 compliant
> 
> Thanks again guys you all the best


Yes, they will work with DX12 no problems


----------



## Agiel

I just came, man, when I saw your hardware... love those big, big monsters that I could never ever afford to have.


----------



## Orivaa

I do not think the 200 series is fully DirectX 12 compliant, but the 300 series should be.


----------



## rakesh27

I should have just googled it, doh...

What I've found: the 295x2 and 290x are DX12 compliant but not DX12.1; I think it relates to which GCN architecture you have.

Some places say DX12.1 won't be utilized by many games, hardly any if at all.

I was asking these questions so I can judge how long to prolong my 300 series upgrade. It would be good to upgrade once the 300s have fully settled in; it's a win for all of us.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Orivaa*
> 
> I do not think the 200 series is fully DirectX 12 compliant, but the 300 series should be.


A few people have been concerned about this, found this a while ago


----------



## SAFX

Fan on my card is making a persistent grinding sound; it's audible above all system fans combined. Very annoying (I tried ignoring it, but can't).

I had plans to apply Arctic thermal paste on the gpus this weekend. Since the card will be disassembled, is there anything I can/should do about the fan?


----------



## remedy1978

Quote:


> Originally Posted by *joeh4384*
> 
> You should see a constant clock. Download afterburner and increase the power limit to +25. I had to do this or my core clocks would fluctuate.
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


Worked perfectly. Can I run two three-pin fans with a splitter off the GPU?


----------



## Intelligents

Quote:


> Originally Posted by *remedy1978*
> 
> Worked perfectly. Can I run two three pin fans with a spliter off the GPU?


This didn't work when I tried it. I ended up just running the splitter to the mobo.


----------



## joeh4384

Quote:


> Originally Posted by *remedy1978*
> 
> Worked perfectly. Can I run two three pin fans with a spliter off the GPU?


I wouldn't run a splitter on the GPU. I have one fan connected to the GPU and the other connected to my R5's fan controller.


----------



## D3AD PIX3L

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> Thanks for all your help. I requested another RMA. Going to see if it works. It sounds like if there is still an issue they will refund me then I can use that $$ for another PSU.. I am very impressed with Corsair's customer service.
> 
> Sent from my Nexus 5 using Tapatalk


Update - Received the second RMA unit AX860i and it's still buzzing under load.







Requesting a refund and awaiting a response from Corsair. Going to be taking a look at CM V1000 per Alex132's recommendation.


----------



## wermad

My twin Vesuvius' have a new home:



Spoiler: Warning: Spoiler!


----------



## boredmug

Quote:


> Originally Posted by *wermad*
> 
> My twin Vesuvius' have a new home:
> 
> 
> 
> Spoiler: Warning: Spoiler!


Holy **** that thing is amazing!


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> My *tiny* Vesuvius' have a new home:
> 
> 
> 
> Spoiler: Warning: Spoiler!


ftfy

and dayum.....soooo many fans


----------



## TooManyAlpacas

Quote:


> Originally Posted by *wermad*
> 
> My twin Vesuvius' have a new home:
> 
> 
> 
> Spoiler: Warning: Spoiler!


Holy **** that is huge. And talk about fans that is amazing


----------



## Roaches

Literally the size of a fully decked server rack! How are temps btw?


----------



## xer0h0ur

Quote:


> Originally Posted by *gatygun*
> 
> Tried to make one rofl


WAT!? Where in the world did that surface from?


----------



## gatygun

Quote:


> Originally Posted by *wermad*
> 
> My twin Vesuvius' have a new home:
> 
> 
> 
> Spoiler: Warning: Spoiler!


You need to post temps on the CPU/GPUs haha, this is amazing.
Quote:


> Originally Posted by *xer0h0ur*
> 
> WAT!? Where in the world did that surface from?


From my pro Paint skills; I had nothing to do haha


----------



## xer0h0ur

HAHAHA touche good sir, touche.


----------



## fat4l

Guys, what do you think is the price for the STOCK COOLER? As my card is watercooled, I don't use the stock cooler, and one guy is asking me for a price. Any ideas?
(Yes, it's the stock AIO watercooling cooler for the 295X2)


----------



## electro2u

Quote:


> Originally Posted by *fat4l*
> 
> Guys, what do u think is the price for the STOCK COOLER ? As my card is watercooled, I dont use the stock cooler and one guy is asking me for a price. Any ideas?
> (Yes its the stock aio watercooling cooler for 295X2)


Shroud too?


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> Guys, what do u think is the price for the STOCK COOLER ? As my card is watercooled, I dont use the stock cooler and one guy is asking me for a price. Any ideas?
> (Yes its the stock aio watercooling cooler for 295X2)


If you're interested in selling the illuminated logo from the shroud of the cooler let me know.


----------



## wermad

Quote:


> Originally Posted by *fat4l*
> 
> Guys, what do u think is the price for the STOCK COOLER ? As my card is watercooled, I dont use the stock cooler and one guy is asking me for a price. Any ideas?
> (Yes its the stock aio watercooling cooler for 295X2)


Keep it, resale value is always higher if the stock cooler is included. What if you gotta RMA? Then you're SOL on a very expensive card.


----------



## xer0h0ur

Hook a brutha up with one of them illuminated logos though, I can even send you mine which burned out in like two days. Not like they would deny an RMA because the illuminated logo wasn't turning on lol.


----------



## veaseomat

Wanted to show off my recent firestrike ultra and extreme scores with the new koolance waterblocks.

ultra: http://www.3dmark.com/3dm/7433939?

extreme: http://www.3dmark.com/fs/5161951


----------



## wermad

Cards are nice and cool, hovering in the low-to-mid 30s w/ some light use. Better than the low 40s in the old case.


----------



## xer0h0ur

Quote:


> Originally Posted by *veaseomat*
> 
> Wanted to show off my recent firestrike ultra and extreme scores with the new koolance waterblocks.
> 
> ultra: http://www.3dmark.com/3dm/7433939?
> 
> extreme: http://www.3dmark.com/fs/5161951


Smoking me on the graphics score. Tri-fire conservative overclocks: http://www.3dmark.com/fs/5055862


----------



## SAFX

*VRMs hit 150c!*

Did some benchmarking earlier today with AIDA64. Ran for 11 minutes at 100%, GPU temps stable at 65c, no throttling whatsoever, but the VRMs hit 150c, measured with a laser thermometer. I don't have any fans blowing around the card, or near the VRMs for that matter, just rad intakes; all other fans exhaust.
Are those VRM temps cause for concern?


----------



## electro2u

Quote:


> Originally Posted by *SAFX*
> 
> *VRMs hit 150c!*
> 
> Did some benchmarking earlier today with AIDA64. Ran for 11 minutes at 100%, gpu temps stable at 65c, no throttling whatsoever, but VRM's hit 150c, tested with laser thermometer. I don't have any fans blowing around the card or near the vrms, for that matter, just rad intakes; all other fans outflow.
> Are those vmrm temps cause for concern?


Good god.

Yes.

Are you sure the thermometer is accurate? That just seems out of touch with anything resembling normality.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> *VRMs hit 150c!*
> 
> Did some benchmarking earlier today with AIDA64. Ran for 11 minutes at 100%, gpu temps stable at 65c, no throttling whatsoever, but VRM's hit 150c, tested with laser thermometer. I don't have any fans blowing around the card or near the vrms, for that matter, just rad intakes; all other fans outflow.
> Are those vmrm temps cause for concern?


VRM limit is around 125'c (stick below 100'c pref.), are you sure you're not reading 'F?

Was it way too hot to touch?


----------



## xer0h0ur

Christ on a cracker. Dassa no goo. Like alex said, you sure you weren't reading in Fahrenheit?


----------



## SAFX

Quote:


> Originally Posted by *electro2u*
> 
> Good god.
> 
> Yes.
> 
> Are you sure the thermometer is accurate? That just seems out of touch with anything resembling normality.


But why was the card not throttling?...the clocks are straight as an arrow,

...oh man, what an idiot I am







, I was using *Fahrenheit!*, almost had a heart attack!
Just ran a quick test, vrm at 70c, much better
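For anyone else who mixes the scales up, the sanity check is two lines of arithmetic (a standalone sketch, nothing card-specific):

```python
def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

print(round(f_to_c(150), 1))    # 65.6 -> the scary "150" was really ~65.6C
print(round(125 * 9 / 5 + 32))  # 257  -> an actual 125C VRM ceiling would read 257F
```

So a 150°F reading sits right next to the 65c core temp from the same run, which is why the clocks never throttled.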


----------



## electro2u

Quote:


> Originally Posted by *electro2u*
> 
> Good god.
> 
> Yes.


Quote:


> Originally Posted by *SAFX*
> 
> But why was the card not throttling?...the clocks are straight as an arrow,
> 
> ...oh man, what an idiot I am
> 
> 
> 
> 
> 
> 
> 
> , I was using *Fahrenheit!*, almost had a heart attack!


Ahahaha. 150C would melt the card.
But... this should make you squirm a little:


----------



## Alex132

If we cannot measure the VRM temperature, how does the GPU know when to ramp up the VRM fan? Is it based on GPU load + heat + guesstimates? Why not just use 2-4 VRM temperature sensors and average them out?


----------



## SAFX

Quote:


> Originally Posted by *electro2u*
> 
> Ahahaha. 150C would melt the card.
> But... this should make you squirm a little:


I was hoping for an explosion or something at the end


----------



## BootPirate

Quote:


> Originally Posted by *electro2u*
> 
> Ahahaha. 150C would melt the card.
> But... this should make you squirm a little:


Is this normal? My card does get up to 70C when I'm gaming, and the vram is extremely hot to touch.


----------



## SAFX

Quote:


> Originally Posted by *BootPirate*
> 
> Is this normal? My card does get up to 70C when I'm gaming, and the vram is extremely hot to touch.


that's totally normal, but I was hoping for an explosion in that video, *electro2u* let us down! I really want to see this card explode into oblivion, under load, of course


----------



## SAFX

Why does ULPS randomly re-enable itself? I disabled it about 2 weeks ago after refreshing drivers, but upon inspecting the registry, it's enabled again.

I decided to write a script that runs on startup to disable all entries; I've got about 25 "EnableUlps" values. I'm assuming it's OK to disable all of them, and does it require a restart?

Code:

reg add "HKLM\SYSTEM\ControlSet001\services\amdkmdag" /f /v "EnableUlps" /t REG_DWORD /d 0
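Since ULPS apparently reappears under roughly 25 different keys, hard-coding ControlSet001 alone misses most of them. Here is a rough sketch of the startup-script idea (my own, not from the thread, and untested against every driver layout): it walks HKLM\SYSTEM with the standard-library `winreg` module and zeroes every `EnableUlps` it can reach. Windows-only; run elevated, and a reboot is still needed.

```python
# Sketch (assumptions: Windows, elevated prompt, AMD driver stores ULPS
# flags as REG_DWORD values named "EnableUlps" somewhere under HKLM\SYSTEM).
import winreg


def zero_enable_ulps(root=winreg.HKEY_LOCAL_MACHINE, path="SYSTEM"):
    """Recursively set every EnableUlps value under `path` to 0.
    Returns the number of values changed."""
    changed = 0
    try:
        key = winreg.OpenKey(root, path, 0,
                             winreg.KEY_READ | winreg.KEY_SET_VALUE)
    except OSError:
        return 0  # no write access here; skip this key (and its children)
    with key:
        try:
            if winreg.QueryValueEx(key, "EnableUlps")[0] != 0:
                winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)
                changed += 1
        except OSError:
            pass  # no EnableUlps value on this particular key
        i = 0
        while True:
            try:
                sub = winreg.EnumKey(key, i)
            except OSError:
                break  # ran out of subkeys
            changed += zero_enable_ulps(root, path + "\\" + sub)
            i += 1
    return changed


if __name__ == "__main__":
    print(f"zeroed {zero_enable_ulps()} EnableUlps values; reboot to apply")
```

Saved as a .py and pointed at by Task Scheduler at logon, it should survive driver refreshes flipping ULPS back on.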


----------



## Alex132

Use CCleaner to clean the registry from time to time, it really helps.

And yes, it requires a restart. And there is only 1 for me; it stays disabled. I have it disabled in both MSI AB and a reg edit (did the reg edit first). Not sure if that means anything.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Use CCleaner to clean the registry from time to time, it really helps.
> 
> And yes, it requires a restart. And there is only 1 for me, it stays disabled. I have it disabled in both MSI AB and a reg edit (did the rev edit first). Not sure if that means anything.


I thought I had everything configured right. What are your settings in MSI? Does this look right?


----------



## Sgt Bilko

Quote:


> Originally Posted by *veaseomat*
> 
> Wanted to show off my recent firestrike ultra and extreme scores with the new koolance waterblocks.
> 
> ultra: http://www.3dmark.com/3dm/7433939?
> 
> extreme: http://www.3dmark.com/fs/5161951


I want your Phase change unit









FX-9590 under water and 295x2 + 2x XFX DD R9 290's Stock cooling

http://www.3dmark.com/compare/fs/3232933/fs/5162209


----------



## veaseomat

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I want your Phase change unit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FX-9590 under water and 295x2 + 2x XFX DD R9 290's Stock cooling
> 
> http://www.3dmark.com/compare/fs/3232933/fs/5162209


haha omg it's you! I actually reinstalled older drivers so I could validate above you lmao. GOTEEEM ; )


----------



## Sgt Bilko

Quote:


> Originally Posted by *veaseomat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I want your Phase change unit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FX-9590 under water and 295x2 + 2x XFX DD R9 290's Stock cooling
> 
> http://www.3dmark.com/compare/fs/3232933/fs/5162209
> 
> 
> 
> haha omg it's you! I actually reinstalled older drivers so I could validate above you lmao. GOTEEEM ; )
Click to expand...

lol, Nice work mate!









I might have to see if I can borrow a 290x somewhere to take back Ultra









You've well and truly beaten me in Firestrike (I can't get close to your CPU speed), but Extreme... maybe. If I end up grabbing another 290x I'll give it a run sometime









Either way you pushed out some monster numbers, congrats


----------



## ZARuslan

Good day! Has anyone tried placing the radiator below the GPU? Do the pumps still manage to pump water? I think the waterloop is filled 100%, but who knows? Thanks!


----------



## CapitanPelusa

Question to R9 295X2 owners: does the 290x/295x2 have chroma subsampling support in order to output 4K@60Hz over HDMI 1.4, similar to the Keplers?


----------



## Alex132

Quote:


> Originally Posted by *ZARuslan*
> 
> Good day! Tried someone to place radiator below gpu? Pumps still get chance to pump water? I thinks thats waterloop filled for 100%, but who knows? Thanks!


Not a good idea. The radiator integrates the reservoir tank too, so you'd be placing the collection point for air bubbles right at the pumps instead of at the reservoir. I.e., don't do that; it will shorten the lifespan of the pumps rather quickly.


----------



## fat4l

Quote:


> Originally Posted by *electro2u*
> 
> Shroud too?


Yes, the stock AIO cooler + shroud... Is ~$100 (£70) a good price?


----------



## electro2u

Quote:


> Originally Posted by *fat4l*
> 
> yes stock aio cooler + shoud...Is ~100$ (70£) a good price ?


Was what I was going to suggest.

See if you can pull off the LED unit for Xer0Hour and make an extra $25?


----------



## xer0h0ur

Quote:


> Originally Posted by *CapitanPelusa*
> 
> Question to R9 295X2 Owners: Does the 290x/295x2 have Chroma Subsampling support in order to output at [email protected] over HDMI 1.4? similar to the Keplers?


I don't know what sort of nonsense that is, but HDMI 1.4 can't carry a 60Hz 4K signal. There is a reason we're on HDMI 2.0 at the moment. That will carry a 4K 60Hz signal. 1.4 is limited to 30Hz.


----------



## CapitanPelusa

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know what sort of nonsense that is but HDMI 1.4 can't carry a 60Hz 4K signal. There is a reason were on HDMI 2.0 at the moment. That will carry a 4K 60Hz signal. 1.4 is limited to 30Hz.


I understand your confusion. Normally 4K 60Hz via HDMI does require HDMI 2.0.

However, Nvidia patched in driver support for chroma subsampling on their 600 and 700 lines of cards, where the video card outputs 4K at 60Hz but with 4:2:0 color compression, thus staying under the bandwidth constraints of HDMI 1.4. I currently have 780s (which don't have HDMI 2.0, but rather 1.4) in SLI and game at 4K 60Hz on my 50-inch TV, and I was planning on upgrading to a Fury X, but the Fury X line's lack of HDMI 2.0 support derailed that plan.

I was just wondering if the AMD 290 line also does chroma subsampling, to at least play games at 4K@60Hz via HDMI 1.4.
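To put rough numbers on why 4:2:0 squeezes under the 1.4 ceiling (my own back-of-the-envelope figures, not from the thread: active pixels only, blanking ignored, and ~8.16 Gbit/s taken as the video payload HDMI 1.4 has left after 8b/10b coding on its 10.2 Gbit/s link):

```python
GBIT = 1e9

def video_bitrate(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    """Raw active-pixel bitrate in bits/s (blanking intervals ignored)."""
    return width * height * fps * bits_per_pixel

HDMI14_USABLE = 8.16 * GBIT  # ~10.2 Gbit/s TMDS minus 8b/10b overhead

full_444 = video_bitrate(3840, 2160, 60, 24)  # 8-bit RGB / YCbCr 4:4:4
sub_420 = video_bitrate(3840, 2160, 60, 12)   # 8-bit YCbCr 4:2:0 averages 12 bpp

print(f"4:4:4 needs {full_444 / GBIT:.1f} Gbit/s")  # ~11.9 -> over the 1.4 budget
print(f"4:2:0 needs {sub_420 / GBIT:.1f} Gbit/s")   # ~6.0  -> fits with room to spare
```

Halving the average bits per pixel is exactly what makes 4K60 squeak through a 1.4 link, at the cost of chroma resolution.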


----------



## tsm106

Quote:


> Originally Posted by *CapitanPelusa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I don't know what sort of nonsense that is but HDMI 1.4 can't carry a 60Hz 4K signal. There is a reason were on HDMI 2.0 at the moment. That will carry a 4K 60Hz signal. 1.4 is limited to 30Hz.
> 
> 
> 
> I understand your confusion. Normally 4K 60Hz via HDMI does require HDMI 2.0.
> 
> However, nVidia patched in driver support for chroma subsampling on their 600 and 700 series, where the card outputs 4K at 60Hz but with *4:2:0* color compression, thus keeping under the bandwidth constraints of HDMI 1.4. I currently have 780s (which don't have HDMI 2.0, but rather 1.4) in SLI and game at 4K 60Hz on my 50-inch TV. I was planning on upgrading to a Fury X, but the Fury X line's lack of HDMI 2.0 support derailed that plan.
> 
> I was just wondering if the AMD 290 series also does chroma subsampling, to at least play games at 4K @ 60Hz via HDMI 1.4.
Click to expand...

That's not real 4K; it's heavily reduced settings. And on the 970/960 (whatever), they used the same reduced setting and called that HDMI 2.0, which is basically fraud. Only the 980 Ti has the SiI chip for full-featured 4K@60Hz.

http://www.siliconimage.com/Company/News_and_Events/Press_Releases/2014_01_07_-_Silicon_Image_Announces_World_s_First_Full-Bandwidth_Dual-Mode_HDMI%C2%AE_2_0/MHL%C2%AE_3_0_IC_with_HDCP_2_2/

AMD's Fury will not have a SiI chip, but at least they do not lie about it. Apparently the workaround is to wait for the SiI-chip-powered adapters to go from DP to HDMI 2.0. It's annoying for sure, but I suppose that is better than them lying to us about what port it truly is.


----------



## xer0h0ur

Quote:


> Originally Posted by *CapitanPelusa*
> 
> I understand your confusion. Normally 4K 60Hz via HDMI does require HDMI 2.0.
> 
> However, nVidia patched in driver support for chroma subsampling on their 600 and 700 series, where the card outputs 4K at 60Hz but with 4:2:0 color compression, thus keeping under the bandwidth constraints of HDMI 1.4. I currently have 780s (which don't have HDMI 2.0, but rather 1.4) in SLI and game at 4K 60Hz on my 50-inch TV. I was planning on upgrading to a Fury X, but the Fury X line's lack of HDMI 2.0 support derailed that plan.
> 
> I was just wondering if the AMD 290 series also does chroma subsampling, to at least play games at 4K @ 60Hz via HDMI 1.4.


Dang, well, you learn something every day. I had never heard of chroma subsampling till now. Just for the record though, we're still waiting on release to know with any certainty whether or not the HDMI port is 2.0. The AMD rep that said it wasn't isn't reliable at all, so I don't take his word for it. As an aside, there will soon be DP to HDMI 2.0 adapters, so you're not completely left hanging if it does turn out the port is 1.4.


----------



## xer0h0ur

Quote:


> Originally Posted by *tsm106*
> 
> That's not real 4K; it's heavily reduced settings. And on the 970/960 (whatever), they used the same reduced setting and called that HDMI 2.0, which is basically fraud. Only the 980 Ti has the SiI chip for full-featured 4K@60Hz.
> 
> http://www.siliconimage.com/Company/News_and_Events/Press_Releases/2014_01_07_-_Silicon_Image_Announces_World_s_First_Full-Bandwidth_Dual-Mode_HDMI%C2%AE_2_0/MHL%C2%AE_3_0_IC_with_HDCP_2_2/
> 
> AMD's Fury will not have a SiI chip, but at least they do not lie about it. Apparently the workaround is to wait for the SiI-chip-powered adapters to go from DP to HDMI 2.0. It's annoying for sure, but I suppose that is better than them lying to us about what port it truly is.


Holy crap, how have they not been sued for that yet?


----------



## kayan

So guys, I have a quick question. I've got a 295x2, as you know, and my wife has a 290X. Could I throw them together on my V1000 for some Firestrike benchies if nothing is overclocked, or is that just asking for trouble?


----------



## boredmug

Quote:


> Originally Posted by *kayan*
> 
> So guys, I have a quick question. I've got a 295x2, as you know and my wife has a 290x. Could I throw them together on my v1000 for some Firestrike benchies, if nothing is overclocked, or is that just asking for trouble?


Sure. A lot of people use that setup for tri-fire. Just match the clocks with Afterburner or Trixx.


----------



## kayan

Quote:


> Originally Posted by *boredmug*
> 
> sure.. A lot of people use that setup for tri-fire. Just match the clocks with afterburner or trixx


Is my PSU up to the task I guess is what I'm asking?


----------



## boredmug

Quote:


> Originally Posted by *kayan*
> 
> Is my PSU up to the task I guess is what I'm asking?


Lol. Well, if it's just to run a few benches, it might be OK. I run two 290Xs in crossfire on an 850W PSU, so you may be fine.


----------



## xer0h0ur

I draw over 1000W sometimes tri-fired. I never did use my kill-a-watt meter to check overclocked power draw.


----------



## tagaxxl

Hey guys... can you tri-fire an R9 295X2 with an R9 390X (rebranded R9 290X)?


----------



## wermad

It should technically be feasible, but ultimately it's up to AMD and drivers. I'm sure someone w/ both will try it.


----------



## rakesh27

Guys,

As you know, the AMD Fury X is coming out this week: 4096 stream processors, 4096-bit (impressive), 4GB GDDR5 (not so good, methinks). I've come into a bit of money; is it worth the upgrade?

I already own a PowerColor R9 295X2 running with a Sapphire 290X. Would the newer card be better than 3 GPUs that have 12GB of RAM (I know technically they each have 4GB)?


----------



## Alex132

It'd be better to grab a 980 Ti Windforce if you want a single powerful GPU.

Otherwise stick with your current GPU setup. It is faster than both options.


----------



## Mega Man

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boredmug*
> 
> sure.. A lot of people use that setup for tri-fire. Just match the clocks with afterburner or trixx
> 
> 
> 
> Is my PSU up to the task I guess is what I'm asking?
Click to expand...

Just try it, but probably not.

Worst case, your PC will shut down (google OCP: over-current protection).


----------



## rakesh27

Thanks guys, I think I will stick with what I have.

The only thing I will do is put the 290X under an all-in-one water cooling solution for better temps.


----------



## fat4l

Have you guys tried AMD Catalyst™ Display Driver version 15.15, the official driver for the R9 300 series?

Also, I decided to keep my 295X2 but let the cooler go.

What 1440p 144/120Hz FreeSync monitor do you suggest? Are there any TN models?


----------



## Dagamus NM

Catalyst 15.15 installed without issue for me. It runs kind of wonky, though. I ran both Heaven and Valley and saw very low utilization on cores 2-4 (dual R9 295X2). Frame rates went as low as 31fps and there are definite stutter spots. I do not appear to be hitting either a current or thermal limit.

Either way, it runs strong for the first two thirds and then falls apart. That said, I was able to play Dying Light after turning off AA, so that was good.

Water blocks should be in at some point this week, so those should help. I don't know about this Lepa 1600 power supply though. It is supposed to be good enough, but it just seems like these cards want more. I am afraid to raise the power limits.


----------



## Orivaa

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> As you know, the AMD Fury X is coming out this week: 4096 stream processors, 4096-bit (impressive), 4GB GDDR5 (not so good, methinks). I've come into a bit of money; is it worth the upgrade?
> 
> I already own a PowerColor R9 295X2 running with a Sapphire 290X. Would the newer card be better than 3 GPUs that have 12GB of RAM (I know technically they each have 4GB)?


It's 4GB HBM, not GDDR5. HBM is vastly superior.


----------



## SAFX

Anyone try the new 15.6 beta driver?


----------



## Alex132

Quote:


> *AMD Catalyst™ 15.6 Beta for Windows®*
> 
> This driver release included optimizations for Batman™: Arkham Knight.
> This article provides information on the latest posting of the AMD Catalyst™ Software Suite, AMD Catalyst™ 15.6 Beta.
> This particular software suite updates the AMD Catalyst™ Display Driver and the AMD Catalyst™ Control Center. This unified driver has been updated, and is designed to provide enhanced performance and reliability.


http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## Alex132

How do you guys apply TIM on these GPUs? I was thinking a pea-sized drop, but I'm worried it won't cover the whole die - so maybe more of a line shape?


----------



## joeh4384

Quote:


> Originally Posted by *SAFX*
> 
> Anyone try the new 15.6 beta driver?


Nothing special, and CrossFire and SLI are completely broken in the game too. It doesn't include any of the Windows 10 improvements or 300 series improvements.


----------



## Mega Man

Quote:


> Originally Posted by *Alex132*
> 
> How do you guys apply TIM on these GPUs? Was thinking pea-drop size, but worried it won't cover the whole die - so more along a line-shape?


I always use the pea method, no issues.


----------



## LegacyLG

how is quadfire on the r9 295x2 guys?


----------



## Dagamus NM

Demanding is the first word that comes to mind.

Followed by awesome. E-Peen extravaganza.

Can you be more specific with what you want to know?


----------



## Themisseble

hello,

questions
How can the R9 295X2 pull more than 400W from only 2x 8-pin connectors?
I assume that:
1x 8-pin = max 150W
2x 8-pin = 300W
+ board = 75W
total = 375W

So the average should be around 350W? Plus 50-100W for the water cooling pump?
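The spec arithmetic above is right; a quick sketch of how those by-the-book limits compare to the card's commonly cited ~500W rated board power (spec limits per the PCIe specification; the per-connector figure assumes the slot supplies its full 75W):

```python
# PCIe power-spec budget vs. what a 295X2 is rated to draw.
# 500W is the rated board power; reviews measured roughly
# 420-450W in typical gaming loads.

SLOT_W      = 75     # PCIe x16 slot, spec limit
EIGHT_PIN_W = 150    # per 8-pin PEG connector, spec limit

spec_budget = SLOT_W + 2 * EIGHT_PIN_W   # 375W "by the book"
card_power  = 500                        # W, rated board power

excess        = card_power - spec_budget   # how far past spec the card goes
per_connector = (card_power - SLOT_W) / 2  # if the slot supplies its full 75W

print(spec_budget)    # 375
print(excess)         # 125
print(per_connector)  # 212.5 W per 8-pin, well past the 150W rating
```

The 150W figure is a spec ceiling, not a physical one: with a decent PSU and cabling, the 8-pin connector's pins can carry well over 150W, which is how the card gets away with it.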


----------



## joeh4384

Quote:


> Originally Posted by *Themisseble*
> 
> hello,
> 
> questions
> How can R9 295X2 pull more than 400W from only 2x8pin connector?
> I assume that 1x 8pin = max 150W
> 2x8 pin = 300W
> +
> board = 75W
> all = 375W
> 
> So avg should be around 350W? + water cooling 50-100W?


AMD laughs at PCIe power standards.


----------



## Themisseble

Could you explain in more detail, please?


----------



## Alex132

Quote:


> Originally Posted by *Themisseble*
> 
> please better explanation


It pulls way more. If you read the description from AMD, it says it can pull ~500W easily.

If you're trying to use this card on a budget PSU - don't. There are recommended PSUs in the first post; an EVGA G2/P2 850-1000W or Cooler Master V850-V1000 would be my recommendation.

And don't use the daisy-chained PCIe 8-pin cables from the Cooler Master PSUs; make sure each PCIe plug on the card gets its own entire cable.


----------



## Themisseble

Quote:


> Originally Posted by *Alex132*
> 
> It pulls way more. If you read the description from AMD it says it can pull ~500w easily.
> 
> If you're trying to use this card on a budget PSU - don't. There are recommended PSUs in the first post, an EVGA G2/P2 850-1000w or CoolerMaster Vantec 850-1000w would be my recommendation.
> 
> And don't use the daisy-chained PCIE 8pin cables from the CoolerMaster PSUs, make sure each PCIE slot is using its own entire cable.


After more research, I found info about it:
http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review/2

The R9 295X2 will not pull 500W; 500W is actually the maximum for the stock design. Normal power consumption is 420-440W.

So anyone who is saying that the stock-design R9 295X2 pulls more than 500W is wrong.


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> It pulls way more. If you read the description from AMD it says it can pull ~500w easily.
> 
> If you're trying to use this card on a budget PSU - don't. There are recommended PSUs in the first post, an EVGA G2/P2 850-1000w or CoolerMaster Vantec 850-1000w would be my recommendation.
> 
> *And don't use the daisy-chained PCIE 8pin cables from the CoolerMaster PSUs, make sure each PCIE slot is using its own entire cable*.


I ran the V1000 w/ a single cable w/ the "daisy chained" connector. This really comes down to single- vs. multi-rail units. The V1000 is single rail, and both my cards survived bench testing.

With both cards (and using a G1600), I pulled 1200W max at the wall (stock CPU and GPUs).


----------



## Themisseble

So 1200W for quadfire with water cooling, for all components (monitor etc.)?

If that's true, then the R9 295X2 is pretty good.
If it's for the PC only, then 89% eff. = ~1070W (probably ~410W for each GPU).
Not bad at all. Thanks for the info.


----------



## wermad

Stock, one pump and a few fans. Once you start overclocking and adding your other components, for crossfire I would recommend a 1350W to get started, 1500-1700W+ to be on the safe side.


----------



## Alex132

Quote:


> Originally Posted by *Themisseble*
> 
> So 1200W for quadfire with water cooling, for all components (monitor etc.)?
> 
> If that's true, then the R9 295X2 is pretty good.
> If it's for the PC only, then 89% eff. = ~1070W *(probably ~410W for each GPU)*.
> Not bad at all. Thanks for the info.


No, figure around 270-300W to be safe for each GPU, plus overclocking headroom. So that's 4 x 270-300 = ~1080-1200W, + CPU/other = at least a ~1350W PSU for OC'ing etc.

And that is not how efficiency works. Efficiency is applied on the wall side of the PSU. If your internal system is drawing 1000W and the PSU is 80% efficient, then it will be drawing 1250W from the wall. Does that mean you need a 1250W PSU? No, because 1000W is being drawn from the PSU, not 1250W.
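That efficiency relationship can be put in numbers (a sketch using the example figures from this exchange):

```python
# PSU efficiency relates the DC load (what the components draw)
# to the AC draw at the wall. You size the PSU for the DC load;
# the efficiency loss is paid at the wall as heat.

def wall_draw(dc_load_w, efficiency):
    """AC power pulled from the outlet for a given DC load."""
    return dc_load_w / efficiency

# 1000W DC load on an 80%-efficient PSU:
print(wall_draw(1000, 0.80))   # 1250.0 W at the wall

# Going the other way: a 1200W wall reading on a ~89%-efficient unit
# implies roughly 1068W of actual DC load.
print(1200 * 0.89)
```

So a kill-a-watt reading always overstates what the PSU itself has to deliver; the meter sees the load plus the conversion loss.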


----------



## wermad

By GPU, he's probably referring to a single 295X2 card: 450-500W for each card.


----------



## motokill36

Hi all,
Just got a 4K screen and am considering getting one of these.
I've had R9 290s in the past.
What is this GPU like on a 4K screen as far as smoothness goes?

Or is it a driver nightmare?

I can get one of these for 400 pounds now, and it looks like a good deal.

Thanks all


----------



## GreenGoblinGHz

Anyone got info on how CrossFire works with the R9 300-series cards? The 390X is a rebranded 290X, so CF with the 295X2 should work?
I have used CF with 295X2 + 290, and 295X2 + 290X - both worked fine.

Now I'm tempted to get the 300-series variant of the 290X to gain 4GB more VRAM... from 4GB to 8GB, using the 300-series card as the top card.

I think it should work, but I haven't found any info about it.

Share your info if you know more.


----------



## y4h3ll

Can someone tell me why, at 190%+ resolution scale in BF4, I get stuttering on explosions? R9 295X2 at stock, 160°F max temps, crossfire working correctly; FX-9590 at 4.7GHz, vcore 1.884; 32GB DDR3-1600; Gigabyte UD5 mobo; 1300W PSU; two SSDs, with the game on the primary SSD in RAPID mode. The only thing I can think of is that it's in a PCIe 2.0 slot, but I'm running 1080p. And it's not like a normal FPS drop; it's like a rubber-band effect, but my ping averages 13-25ms and FPS is always 60, even at 190%. I have researched it with no good definitive answers. Win 8.1 Enterprise.


----------



## GreenGoblinGHz

Do you use Mantle with BF4? My experiences with the 295X2 and BF4 have been very positive, using the 14.5 beta CCC at the moment.
I did have problems too: a 1250W PSU wasn't enough for the 295X2 + 290X (chip is an i7-4790K, OC'd to 4998MHz). Also, BF4 nearly got my GPUs to the 75°C thermal limit; an after-market block dropped temps 20°C+, and now BF4 gets the GPUs up to only 52°C.
Try with Mantle on.


----------



## y4h3ll

Quote:


> Originally Posted by *GreenGoblinGHz*
> 
> Do you use Mantle with BF4? My experiences with the 295X2 and BF4 have been very positive, using the 14.5 beta CCC at the moment.
> I did have problems too: a 1250W PSU wasn't enough for the 295X2 + 290X (chip is an i7-4790K, OC'd to 4998MHz). Also, BF4 nearly got my GPUs to the 75°C thermal limit; an after-market block dropped temps 20°C+, and now BF4 gets the GPUs up to only 52°C.
> Try with Mantle on.


I do run mantle. Lol.


----------



## rdr09

Quote:


> Originally Posted by *y4h3ll*
> 
> I do run mantle. Lol.


Lower the res scale a bit, from 190% to 150%; higher scales exact a toll on the CPU as well. Monitor your CPU usage using Afterburner. Set the pagefile to auto and disable ULPS if you haven't yet.
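To see why dropping the scale helps so much: BF4's resolution scale multiplies each axis, so the rendered pixel count grows roughly with the square of the scale (a sketch of the arithmetic, using the 190% and 150% figures from the advice above):

```python
# BF4's "resolution scale" multiplies each axis, so rendered pixels
# (and GPU load) grow roughly with the square of the scale setting.

def rendered_pixels(width, height, scale_pct):
    s = scale_pct / 100
    return round(width * s) * round(height * s)

base  = rendered_pixels(1920, 1080, 100)   # native 1080p: 2,073,600 px
at190 = rendered_pixels(1920, 1080, 190)   # 3648x2052: ~7.5 million px
at150 = rendered_pixels(1920, 1080, 150)   # 2880x1620: ~4.7 million px

print(f"190% renders {at190 / base:.1f}x the pixels of native 1080p")
print(f"dropping 190% -> 150% cuts the pixel load by {1 - at150 / at190:.0%}")
```

At 190% the cards are pushing roughly 3.6x the pixels of native 1080p, which is why explosions (heavy overdraw) are where the frame-time spikes show up first.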


----------



## y4h3ll

Quote:


> Originally Posted by *rdr09*
> 
> lower the rez scale a bit from 190 to 150%. higher will exact toll on the cpu as well. monitor your cpu usage using afterburner. set pagefile to auto and disable ulps if not yet.


This is true, but it's not the problem. FX-9590 at 4.7GHz, vcore raised from 1.55 to 1.88, 65% usage at 185% res; nothing seems to be maxed. My mobo is the Gigabyte GA-990FXA-UD5.


----------



## y4h3ll

Quote:


> Originally Posted by *rdr09*
> 
> lower the rez scale a bit from 190 to 150%. higher will exact toll on the cpu as well. monitor your cpu usage using afterburner. set pagefile to auto and disable ulps if not yet.


Also, ULPS is off; all necessary steps have been taken to get optimum performance. You think it could be PCIe 2.0?


----------



## rdr09

Quote:


> Originally Posted by *y4h3ll*
> 
> Also upls is off, all nessicary steps have been done to get optimum performance. You think it could be pcie 2.0?


I doubt PCIe 2.0 affects much. Use HWiNFO64 or AB to see usage. Here is my i7 @ 4.5GHz, HT off, with a single 290 in BF4, 1080p and 130% scale . . .

edit: That 1.55 vcore is kinda high; keep the CPU temp below 60°C if possible. HWiNFO will show that too. That was MP 64; SP is not much of a load on the CPU.

You can run Firestrike to compare with others.


----------



## y4h3ll

Quote:


> Originally Posted by *rdr09*
> 
> i doubt the 2.0 affects much. use HWINFO64 or AB to see usage. Here is my i7 @ 4.5GHz HT off with a single 290 in BF4, 1080 and 130% scale . . .
> 
> 
> 
> edit: that 1.55 vcore is kinda high. keep cpu temp below 60C if possible. HWINFO will show that too. That was MP 64. SP is not much on the CPU.
> 
> You can run Firestrike to compare with others.


1.55 is stock for the FX-9590; 4.7GHz was unstable at 1.55 vcore, so I OC'd to 1.88, stable for 4.7-5.0GHz, but I had no reason to run it that high. I have an H100i for cooling. I get about 100-110°F max on my CPU at 90-100% load.


----------



## wermad

Guess I'll be keeping my 295X2 for a bit longer. Fury X was not the bruiser, and I knew it wasn't gonna be; all the signs were there, and I kept telling folks it was getting less and less optimistic with all the crap that's going on with AMD. Might go w/ the Ti next year when it comes time to retire my 295X2s. If anything, quad 390X 8GB (or 290X 8GB) if I find some w/ blocks available for them.

Almost done with my new case/setup. Just need to finish the second rig and I'll start cranking up the clocks. I am a little worried about my extensions. I picked up some inexpensive Kingwin 6-pin extensions. They're 18AWG and, tbh, they should work as it's just a simple extension. I'm still looking out for some EVGA or Corsair cables/extensions I can swap to, but I may just buy some 16AWG wire and make some custom lines. I did run Metro LL for an hour and did not have any issues.


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *y4h3ll*
> 
> 1.55 is stock for fx-9590 4.7ghz was unstable for 1.55 vcore so i oc to 1.88 stable for 4.7-5.0ghz but had no reason to run it that high. I have h100i for cooling. I get about 100f 110f max on my cpu at 90%-100% load.


If all cores are hitting above 90%, that will cause the minimum fps to drop so far you'd feel it or see it during a large fight or huge explosions. 1.88v is a lot of voltage; I didn't realize it could take so much. My Phenom's max is 1.47v to keep under 55°C on water.


----------



## y4h3ll

No, what I am saying is that BF4 is about 65% load average at 185% res, lol... It's not the CPU; if it was, I would OC my CPU more, but it's not the problem. If it was, I would have gone with a better solution than what I've got, haha.


----------



## y4h3ll

Okay, I was wrong; I run at 1.524 OC'd, and stock is 1.472 or something like that. My bad, man, I was in class and we are doing CFM and all that crap, and all the .00s in class got me thrown off.


----------



## y4h3ll

Quote:


> Originally Posted by *wermad*
> 
> Guess I'll be keeping my 295x2 for a bit longer. Fury X was not the bruiser I knew it wasn't gonna be. All the signs were there and I kept telling folks, its getting less and less optimistic with all the crap that's going on with amd. Might go w/ Ti next year when it comes time to retire my 295x2s. If anything, quad 390X 8gb if (or 290X 8gb) if I find some w/ blocks available to them.
> 
> Almost done with my new case/setup. Just need to finish the second rig and I'll start cranking up the clocks. I am a little worried about my extensions. I picked up some inexpensive Kingwin 6-pin extensions. They're 18awg and tbh, it should work as its just a simple extension. I'm still looking out for some EVGA or Corsair cables/extensions I can swap to but I may just buy some 16awg wire and make some custom lines. I did run Metro LL for an hour and did not have any issues.
> 
> 
> Spoiler: Warning: Spoiler!


What case is that!


----------



## Orivaa

Quote:


> Originally Posted by *y4h3ll*
> 
> What case is that!


Caselabs TX10. It's gargantuan.


----------



## wermad

Quote:


> Originally Posted by *y4h3ll*
> 
> What case is that!
> 
> Quote:
> 
> 
> 
> Originally Posted by *Orivaa*
> 
> Caselabs TX10. It's gargantuan.
Click to expand...

TX10-D + pedestal


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> If all cores are hitting above 90%, then that will cause the minimum fps to drop so far you'd feel it or see it during large fight or huge explosions. 1.88v is a lot of voltage. didn't realize it can take so much. My phenom's max is 1.47v to keep under 55C on water.


Trust me, he is not running 1.88 on a 9590 on an H100i


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> If all cores are hitting above 90%, then that will cause the minimum fps to drop so far you'd feel it or see it during large fight or huge explosions. 1.88v is a lot of voltage. didn't realize it can take so much. My phenom's max is 1.47v to keep under 55C on water.
> 
> 
> 
> Trust me, he is not running 1.88 on a 9590 on an H100i
Click to expand...

He wasn't; 1.524v, IIRC.

I'm at 1.488-1.5v under load for 5.0GHz daily.


----------



## y4h3ll

Quote:


> Originally Posted by *Orivaa*
> 
> Caselabs TX10. It's gargantuan.


Well, if anyone has not looked into that case, it's about $800, lol. Will it fit more than one board? Just wondering; I need to clear up some room, and putting my woman's rig in that case would do so.


----------



## wermad

The D model is more expensive, but it gives you the dual-system option: you get one standard ATX-layout mb tray and a second reverse-ATX layout in the same chassis. CL has recently added the option to go w/ dual XL-ATX as opposed to forcing you to get dual HP-ATX (think EVGA SR-2 motherboard); HP-ATX is a very tiny market, so they finally made the XL-ATX option for the D model. It's a slightly cheaper but still pretty pricey option (~$200-220). I was extremely lucky to find one on the second-hand market locally. You may find one once in a very long while, but shipping can get pricey; I was just a 30-minute drive from the previous owner and I had a large SUV to pick it up and bring her home. The price was within my budget (albeit at the extreme maximum end), and it's very rare to find one, so I took the leap, and I love it so far. I did have to shell out a few more hundred for the accessories I ended up needing, so it's still very expensive. If you stick w/ the CL matte black (mine was custom powder coated in two colors), for a dual system with the accessories you're gonna be well over $1k. Mine came with the pedestal, which typically adds another couple hundred to the total.



Spoiler: Warning: Spoiler!


----------



## detunedstring

Quick question, guys; the answer will help me become a member of the club.

I will be adding a 295X2 to my system soon. My question is: can I put the 295X2 as the secondary card to my existing 290X? Meaning, the 290X in the first slot and the 295X2 in the third?

They're both PCIe 3.0 x16 slots individually, x8/x8 when both are populated. The way my water loop is set up might make it complicated to put the 295X2 in the top slot.

Thanks for any advice you might have.


----------



## electro2u

Quote:


> Originally Posted by *detunedstring*
> 
> Quick question guys. Will help me become a member of the club
> 
> I will be adding a 295x2 to my system soon. My question is, can i put the 295x2 as my secondary card to my existing 290x? Meaning, the 290x in the first slot, and the 295x2 in the third?
> 
> They're both pci e 3.0 x16 slots individually, x8x8 when both populated. The way my water loop is set up might make things complicated to put the 295x2 on the top slot.
> 
> Thanks for any advice you might have.


I've had 295x2+290x in both configurations on Z97 and X79


----------



## Mega Man

Quote:


> Originally Posted by *GreenGoblinGHz*
> 
> Any1 got info how crossfire works with r300-series cards ? Its rebranded R290x, so CF with R295x2 should work ?
> I have used CF with 295x2 + 290 , and 295x2 + 290x - both worked fine.
> 
> Now Im tempted to get 300-series variant of 290x to gain 4gb more vram...from 4gb to 8gb. Usin the 300-series card as top card....
> 
> I think it should work but havent found any info about it....
> 
> Share your info if you know more


That is not the way it works; CrossFire mirrors VRAM rather than adding it, so you will only have 4GB, assuming you can CFX it.
Quote:


> Originally Posted by *y4h3ll*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> lower the rez scale a bit from 190 to 150%. higher will exact toll on the cpu as well. monitor your cpu usage using afterburner. set pagefile to auto and disable ulps if not yet.
> 
> 
> 
> This is true but not the problem, fx-9590 4.7ghz raised vcore from 1.55 to 1.88 65% at 185% res nothing seems to be maxed my mobo is amd GA-990FXA-UD5
Click to expand...

No way that is happening, sorry.
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *y4h3ll*
> 
> Also upls is off, all nessicary steps have been done to get optimum performance. You think it could be pcie 2.0?
> 
> 
> 
> i doubt the 2.0 affects much. use HWINFO64 or AB to see usage. Here is my i7 @ 4.5GHz HT off with a single 290 in BF4, 1080 and 130% scale . . .
> 
> 
> 
> edit: that 1.55 vcore is kinda high. keep cpu temp below 60C if possible. HWINFO will show that too. That was MP 64. SP is not much on the CPU.
> 
> You can run Firestrike to compare with others.
Click to expand...

1.55 is not really that high, assuming you can cool it. I keep mine at 1.6ish and another member runs 1.7 24/7; neither of our chips has had any issues for well over 2 years.
Quote:


> Originally Posted by *wermad*
> 
> The D model is more expensive, but it gives you the dual system option. You get one standard atx layout mb tray and a second reverse atx layout in the same chassis. CL has recently added the option to go w/ dual XL-ATX as opposed to forcing you to get dual HP-ATX (think EVGA SR2 motherboard). HP-ATX is a very tiny market, so they finally made the option for xl-atx on the D model. Its slightly cheaper but still pretty pricey option (~$200-220). I was extremely lucky to find one in the second hand market locally. You may find one once in a very long while but shipping can get pricey. I was just a 30 minute drive from the previous owner and i had a large suv to pick it up and bring her home. The price was within my budget (albeit, the extreme maximum end) and its very rare to find one. I took the leap and I love it so far. I did have to shell out a few more hundreds for the accessories I ended up needing so its still very expensive. If you stick w/ the CL matte black (mine was custom powder coated in two colors), for a dual system with the accessories, you're gonna be well over $1k. Mine came with the pedestal which typically adds another couple hundred to the total.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Not trying to brag, but the price only goes up from there. I plan on spending $3k on my TX10 when it is all said and done, and that is just the case and accessories, NOTHING else: no system parts, watercooling parts, etc.
Quote:


> Originally Posted by *detunedstring*
> 
> Quick question guys. Will help me become a member of the club
> 
> I will be adding a 295x2 to my system soon. My question is, can i put the 295x2 as my secondary card to my existing 290x? Meaning, the 290x in the first slot, and the 295x2 in the third?
> 
> They're both pci e 3.0 x16 slots individually, x8x8 when both populated. The way my water loop is set up might make things complicated to put the 295x2 on the top slot.
> 
> Thanks for any advice you might have.


yes


----------



## wermad

Quote:


> Originally Posted by *detunedstring*
> 
> Quick question guys. Will help me become a member of the club
> 
> I will be adding a 295x2 to my system soon. My question is, can i put the 295x2 as my secondary card to my existing 290x? Meaning, the 290x in the first slot, and the 295x2 in the third?
> 
> They're both pci e 3.0 x16 slots individually, x8x8 when both populated. The way my water loop is set up might make things complicated to put the 295x2 on the top slot.
> 
> Thanks for any advice you might have.
> 
> Quote:
> 
> 
> 
> Originally Posted by *electro2u*
> 
> I've had 295x2+290x in both configurations on Z97 and X79
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> yes
> 
> 
> 
> 
> 
> Click to expand...
Click to expand...

HardOcp did their quad-fire 295x2 review on a Maximus V Extreme, which runs 8x/8x 3.0 w/ two cards. There's plenty of bandwidth @ 8x 3.0. You're ready to roll









Quote:


> Originally Posted by *Mega Man*
> 
> not torying ot brag but the price only goes up from their, i plan on spending 3k on my tx10 when it is all said and done --- that is just the case and accessories, NOTHING else- not system parts, watecooling parts ect


I expect even more peds and accessories than Seross for your "mega" build


----------



## Mega Man

May God help me. So close to clicking buy, but really having to think about whether I should, with the baby... If my wife were not pregnant, I would not have this issue.

And my spelling...

This is what happens after a 14-hour day at work.


----------



## y4h3ll

Quote:


> Originally Posted by *Mega Man*
> 
> that is not the way it works, you will only have 4 gb assuming you can cfx it
> no way that is happening sorry
> 1.55 is not really that high, assuming you can cool it i keep mine at 1.6ish and another member 1.7 24/7 neither of our chips have had any issues for well over 2 years
> not torying ot brag but the price only goes up from their, i plan on spending 3k on my tx10 when it is all said and done --- that is just the case and accessories, NOTHING else- not system parts, watecooling parts ect
> yes


Guess you didn't see my repost: I'm running 1.524 vcore. I was getting something else mixed up with my vcore from class; don't ask, lol. I have not changed my vcore for 4 months. Here is a pic.

Sorry, doing this from my phone; I'm having problems trying to get my phone to quote the last post you made about me.


----------



## y4h3ll

Quote:


> Originally Posted by *Mega Man*
> 
> may god help me , so close to clicking buy, but really having to think if i should with the baby.... if my wife were not pregnant i would not have this issue
> 
> and my spelling .....
> 
> 
> 
> 
> 
> 
> 
> this is what happens after a 14 hour day at work


I understand. After dropping, give or take, $5k on my build and then my woman's... lol, only to find out she only goes to Facebook, when the idea was to share my gaming habit with me... I could have had a much sexier rig going on right now.


----------



## gatygun

Quote:


> Originally Posted by *y4h3ll*
> 
> I understand after dropping around give or take 5k on my built then my womans... lol to find out she only goes to facebook when the idea was to share my gaming habit with me.... I could have had a much sexier rig going on right now.


This is why I don't do women and live in a cave.

Infinite budget forever.


----------



## y4h3ll

Quote:


> Originally Posted by *gatygun*
> 
> This is why i don't do in woman and live in a cave.
> 
> Infinity budget forever.


Buhahaha ikr. Hey, she has a 290X; if I crossfire it with my 295X2, will I need PCIe 3.0, or is my 2.0 good enough?

Or better yet, what if I got another 295X2? Both on a 2.0 x16 lane.


----------



## Mega Man

Quote:


> Originally Posted by *y4h3ll*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> may god help me , so close to clicking buy, but really having to think if i should with the baby.... if my wife were not pregnant i would not have this issue
> 
> and my spelling .....
> 
> 
> 
> 
> 
> 
> 
> this is what happens after a 14 hour day at work
> 
> 
> 
> I understand after dropping around give or take 5k on my built then my womans... lol to find out she only goes to facebook when the idea was to share my gaming habit with me.... I could have had a much sexier rig going on right now.

haha i used to build her a double of mine, now i just give her hand-me-downs.
atm she has a 7970 and a 290x (2 rigs), more than overkill for her uses, but if she ever wants to game she can, maybe not at max, but pretty well.

i buy quadfire with each new gen. fury x will take a while for me, as i will have a baby soon and i will not let my funds get as low as i have in the past.

once my truck is paid off (2-3 more years, i don't remember exactly) i will finish getting out of debt extremely quickly, and then i won't have an issue upgrading for a long time


----------



## ramos29

gta 5 at launch was running ok: 60fps with everything on max
except msaa x4, and vegetation set to high, not ultra.
now after several updates the game is running like ****:
huge random fps drops and stutter.
i set everything to normal and the stutter/fps drops are even worse.
who else is experiencing this?!
every time i install a new driver i use ddu


----------



## ramos29

i fixed my problem by increasing the page file from 8 to 25 GB; the game was devouring 13 GB of page file and that was causing the stutter.
i still can't understand the relationship between the page file and the gpu. i have 12 GB of ram, it should be enough in games
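The fix above can be sanity-checked with rough arithmetic: on Windows, everything a game allocates has to fit inside the commit limit, which is roughly RAM plus page file. (This is a simplified sketch; real commit accounting is more involved, and the numbers are taken from the post above.)

```python
def commit_headroom_gb(ram_gb, pagefile_gb, peak_commit_gb):
    """Commit limit (RAM + page file) minus peak commit charge.

    Windows starts failing allocations once the commit charge hits the
    commit limit, which a game tends to surface as stutter or crashes.
    """
    return (ram_gb + pagefile_gb) - peak_commit_gb

# 12 GB RAM + 8 GB page file vs. ~25 GB peak commit -> 5 GB short
print(commit_headroom_gb(12, 8, 25))   # -5
# Raising the page file to 25 GB leaves ~12 GB of headroom
print(commit_headroom_gb(12, 25, 25))  # 12
```

So the GPU isn't really involved at all: the game simply needed more commit space than 12 GB RAM + 8 GB page file could provide.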


----------



## y4h3ll

Quote:


> Originally Posted by *ramos29*
> 
> i fixed my problem by increasing page file from 8 to 25gig, the game is devouring 13gig of pagefiles this was causing the stutter
> i still cant understand the relationship wetween pagefiles and the gpu, i have 12go of ram it should be enough in games


If you have an SSD, that should help. I don't, and I haven't had that problem except when the driver wasn't right, so I honestly couldn't tell you. What GPU are you using?


----------



## y4h3ll

Quote:


> Originally Posted by *y4h3ll*
> 
> If you have a SSD that should fix it. I don't and have not had that problem except when the driver wasn't correct. So I honestly couldn't tell you. What gpu are you useing?


I was having an issue with grass and still don't know if it is fixed. Is anyone else still having the problem where, in crossfire, grass would randomly spawn instantly, causing lots of shadows and an fps drop to 5-10 fps if I was lucky? This is for people who have (or have had) an R9 295X2, Windows 8.1 only. GTA 5 wasn't the only game doing this either: in DiRT Rally it was the flash effects ("dirt or mud or water on the windshield"), and BF4 used to do it with blood splatter. The ONLY thing that would fix this for me was Mantle. 1x1 never fixed it, and neither the default nor the optimized crossfire profiles worked either. If anyone could help, that would be great. I have gotten so tired of having the fastest card in the world while my games play like a GTX 470 with 512 MB of RAM. I mean, c'mon.


----------



## ramos29

texture pop-up is a problem known in gta 4, not 5!
the first time i ran bf4 in mantle mode, everything was quick. in both dx11 and mantle i had 60fps, but in mantle things felt faster: explosions, player movements... i don't know what happened next, mantle became just like dx11


----------



## y4h3ll

Quote:


> Originally Posted by *ramos29*
> 
> texture popup its a probleme known in gta 4 not 5!
> the first time i run bf4 in mantle mode, every thing was quick , both dx11 and mantle i had 60fps but in mantle things were faster explosions player movements..... i dont know what hapened next mantle become like dx11


Funny you say that, but I cannot get DX11 to run correctly in crossfire. As an example, none of my on-screen displays, including the Logitech OSD, will work with DX11 under crossfire. I also finally figured out I couldn't run 200% resolution scale because of thermal throttling. I rebuilt my rig because of unrelated problems, and in doing so I discovered an entire cat's worth of fur stuck in my rad, lol. I then put two 3000 RPM (+/-5%) fans in push/pull, plugged them into my Corsair Link, and there is a 25°F difference. From what I was told, thermal throttling kicks in around 75°C, yet now I don't get past roughly 71-72°C. So all that to say: push/pull is the cheapest method of temp control for the 295X2.

Lastly, not related to this post: does anyone know how far I can push the 5.0 GHz FX-9590, and what I should set the vcore to if I can push it a bit more? In BF4 maxed out at 200% scale, including 4x MSAA, I use roughly 70% CPU. Temps never jump past 101°F on my CPU under 90-98% load. I have it stable at 1.488 V at 4.7 GHz, but anything past that always gets me a blue screen. I have plenty of power, and the fans are on a custom curve so I can raise them another 20% or so at 100% load.


----------



## xer0h0ur

No one uses Fahrenheit to judge temperatures. Stick with Celsius. As for throttling on the 295X2, it begins at 74C.


----------



## y4h3ll

Quote:


> Originally Posted by *xer0h0ur*
> 
> No one uses Fahrenheit to judge temperatures. Stick with Celsius. As for throttling on the 295X2, it begins at 74C.


I understand; that was intended for those who do, and I included Celsius. If it helps, I believe my temps are roughly 65-67°C. Sorry to bother you with my imperial defaults.


----------



## xer0h0ur

Whoa there chief. No one is bothered. Just saying the universal method used. If you're experiencing throttling effects despite not reaching throttling temps then you may want to look into removing the cooler and changing the thermal pads used on the VRMs and PLX chip. If you already are going through that trouble then you might as well change the TIM on the GPU too and I would suggest having some spare vRAM thermal pads as you will likely tear up some of the pads when you remove the cooler.


----------



## BootPirate

Quick question: I saw this come up earlier and was interested in the answer, but I can't seem to find it again (it was quite a few pages back).
Should the radiator/reservoir be above the card, or is below alright?

Is this okay?
My current set-up in my case


Or should it look like this?


My temps seem fine (70°C), but will it affect the longevity of my pump?


----------



## wermad

The hoses are pretty robust so they won't kink easily. What you wanna make sure of is that the fan is being fed fresh/cool air as much as possible. The pump is strong enough to push through the loop regardless of where the radiator is located. I've seen these pumps power several blocks (and mind you, the 295x2 has two pumps!). More than likely, you'll end up selling the card before you run into issues with the pump. Just don't mess with it in any manner that would void your warranty (i.e., opening it up, chopping it up, etc.). If you have any issues, as long as your warranty is good, you'll be covered. Also, note that this pump and overall design have been used a ton in the CPU cooling industry, so don't fear it as brand-new technology.

If you get the urge to go full custom water, any of the venerable pumps such as the DDC, D5, and Jingway are strong enough for several components. My sole D5 is pushing through eight radiators, two GPUs, the CPU, and the mobo. Everyone kept telling me I needed two, or something crazy like an Eheim; I said nah. So far, so good.


----------



## NBrock

I swapped the TIM on the GPUs and got the fan adapter for the VRM fan so I can run it off the motherboard and control the fan speed. It dropped temps by about 15°C going from the stock TIM to Prolimatech PK-1. I am also able to OC quite a bit without temps getting crazy. Currently running 1100 core and 1300 mem.


----------



## y4h3ll

Quote:


> Originally Posted by *xer0h0ur*
> 
> Whoa there chief. No one is bothered. Just saying the universal method used. If you're experiencing throttling effects despite not reaching throttling temps then you may want to look into removing the cooler and changing the thermal pads used on the VRMs and PLX chip. If you already are going through that trouble then you might as well change the TIM on the GPU too and I would suggest having some spare vRAM thermal pads as you will likely tear up some of the pads when you remove the cooler.


I agree with you, lol. I didn't think anyone was upset. I even went back up and fixed the post.


----------



## xer0h0ur

Quote:


> Originally Posted by *BootPirate*
> 
> Quick question, I saw that it came up earlier and I was interested in the answer but I can't seem to find it again (it was quiet a few pages back).
> Should the radiator/reservoir be above the card, or is it alright bellow?
> 
> Is this okay?
> My current set-up in my case
> 
> 
> Or should it look like this?
> 
> 
> My temps seem fine (70C) But will it affect the longevity of my pump?


AMD recommends having the radiator above the height of the card but honestly the only reason they say that is so that customers don't hear the gurgling sounds or pump noise at startup and complain that something is wrong. So in other words if you have the radiator below the card's height you're likely going to hear noises at startup while the air bubbles are working their way through the pump.


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> The hoses are pretty robust so they won't kink easily. What you wanna make sure is your fan is being fed fresh/cool air as much as possible. The pump is strong enough to push through the loop regardless of where the radiator is located. I've seen these pumps actually power several blocks (and mind you the 295x2 has two pumps!). More then likely, you'll end up selling the card before you run into issues w/ the pump. Just don't mess with it in any manner that would void your warranty (ie, opening it up, chopping it up, etc.). If you have any issues, as long as your warranty is good, you'll be covered. Also, note that the pump and over all design has been used a ton by the cpu industry, so don't fear this as brand new technology.
> 
> If you get the urge to go full water custom, any of the venerable pumps such as ddc, d5, and jingway are strong enough for several components. My sole D5 is pushing through eight radiators, two gpu's, cpu, and mb. Everyone kept telling me i needed two or something crazy like an eheim, I said nah. So far, so good


This is correct. I use 4 ddcs but that is my personal pref. ( I could easily use 1 or 2 )

Heck my itx with 1gpu had 2 ddcs lol


----------



## NBrock

I have a question for you guys/gals....

Currently I use an Auria 27in 2560x1440 IPS monitor. Since I got the 295x2 I am debating on upgrading monitors. I am stuck between the Asus MG279Q (27 inch 144hz FreeSync IPS) or a 4k FreeSync IPS.

Do you think the 295x2 would be able to push 4K at 60 fps in current DX11 games (obviously not with AA, but on a 27-inch 4K panel I feel it wouldn't be needed)? I feel that with DX12 right around the corner, future games should run "better" at 4K than DX11 games, so it might be worth the move to 4K... What do you guys/gals think?

THANKS!!


----------



## wermad

When I was bench testing my two cards, both rads were sitting below the cards. Didn't hear or see anything abnormal.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Whoa there chief. No one is bothered. Just saying the universal method used. If you're experiencing throttling effects despite not reaching throttling temps then you may want to look into removing the cooler and changing the thermal pads used on the VRMs and PLX chip. If you already are going through that trouble then you might as well change the TIM on the GPU too and I would suggest having some spare vRAM thermal pads as you will likely tear up some of the pads when you remove the cooler.


hey, can you explain how to go about changing thermal pads?...I understand Fujipoly are the best?


----------



## veaseomat

Quote:


> Originally Posted by *NBrock*
> 
> I have a question for you guys/gals....
> 
> Currently I use an Auria 27in 2560x1440 IPS monitor. Since I got the 295x2 I am debating on upgrading monitors. I am stuck between the Asus MG279Q (27 inch 144hz FreeSync IPS) or a 4k FreeSync IPS.
> 
> Do you think the 295x2 would be able to push 4k 60fps in current games in DX11 (obviously not with AA...but on a 27inch 4k I feel it wouldn't be needed)? I feel that with DX12 right around the corner future games should run "better" at 4k than DX11 games so it might be worth the move to 4k... What do you guys/gals think?
> 
> THANKS!!


I have 295x2's in crossfire, and I think a single 295x2 would push 60 frames at 4K; you might have to tweak a few settings, but it should be doable. Remember dual GPUs don't have FreeSync support yet. I have the BenQ 1440p 144Hz FreeSync that I swap back and forth with; I currently use an LG 1440p ultrawide curved IPS (60Hz). I tried a Samsung UD590 (4K) back in December and returned it; I didn't like 4K. I think 1440p 144Hz is the sweet spot right now because it's got a nice balance of framerate and resolution, but that's up to you to decide. My friends run 4K monitors and TVs and love them.

Unrelated question: I started reading the other day that crossfire only works in fullscreen. How about fullscreen windowed? When I play my games in fullscreen windowed, the crossfire badge won't show up (when turned on), but GPU usage on all 4 is up. Does that mean it's working?


----------



## Mega Man

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Whoa there chief. No one is bothered. Just saying the universal method used. If you're experiencing throttling effects despite not reaching throttling temps then you may want to look into removing the cooler and changing the thermal pads used on the VRMs and PLX chip. If you already are going through that trouble then you might as well change the TIM on the GPU too and I would suggest having some spare vRAM thermal pads as you will likely tear up some of the pads when you remove the cooler.
> 
> 
> 
> hey, can you explain how to go about changing thermal pads?...I understand Fujipoly are the best?

remove the heatsink,

you will see the pads, remove them (i use an x-acto)

i wipe mine down with alcohol and a microfiber cloth.

then cut new pads, and remove the plastic backing (again, x-acto)

install


----------



## SAFX

Thanks, Mega Man!


----------



## SAFX

Quote:


> Originally Posted by *Mega Man*
> 
> remove the heatsink,
> 
> you will see the pads, remove them ( i use a xacto )
> 
> i wipe mine down with alcohol and a micro fiber cloth.
> 
> then cut new pads, and remove plastic ( again xacto )
> 
> install


When you say "remove the heatsink", you mean the copper coil between the water blocks?


----------



## gatygun

Quote:


> Originally Posted by *veaseomat*
> 
> I have 295x2's in crossfire, I think I single 295x2 would push 60 frames on 4k, might have to tweak a few settings but it should be doable. Remember dual GPUs dont have freesync support yet. I have the benq 1440p 144hz freesync that i swap back and fourth with, I currently use a LG 1440 ultrawide curved IPS (60hz). I tried a samsung UD590 (4k) back and december and returned it, I didn't like 4k. I think 1440p 144hz is the sweet spot right now because it's got a nice balance of framerates and resolution, but that's up to you to decide, my friends run 4k monitors and TVs and love them.
> 
> unrelated question, So I started reading crossfire only works when in fullscreen the other day, how about fullscreen windowed? When I play my games in fullscreen windowed the crossfire badge wont show up (when turned on), but gpu usage on all 4 are up, that means it's working right?


Any news on crossfire and freesync support?


----------



## wermad

Quote:


> Originally Posted by *SAFX*
> 
> When you say "remove the heatsink", you mean the copper coil between the water blocks?


The whole shroud/heatsink/pump assembly is screwed together. Just remove the screws from the PCB side and from the I/O plate. This will release the PCB and you can swap the pads.


----------



## veaseomat

Quote:


> Originally Posted by *gatygun*
> 
> Any news on crossfire and freesync support?


last time it was mentioned:
http://techreport.com/news/28193/amd-delays-freesync-support-for-multi-gpu-systems


----------



## gatygun

Quote:


> Originally Posted by *veaseomat*
> 
> last time it was mentioned:
> http://techreport.com/news/28193/amd-delays-freesync-support-for-multi-gpu-systems


Ah thanks.

This kinda makes me worried about current FreeSync monitors: what if they announce crossfire support but it requires a DP 1.3 solution instead of 1.2a, and they push out a new generation of screens? Kind of a hard pill to swallow.

Just thinking out loud, though.


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> hey, can you explain how to go about changing thermal pads?...I understand Fujipoly are the best?


For reference I would suggest downloading the waterblock instructions for an EK block so you can recognize the VRMs, mosfets, PLX chip etc etc. If memory serves me right the vRAM pads are 0.5mm, VRMs are 1.5mm and the PLX chip would be either 1.0mm or 1.5mm. Any surfaces that I am applying new pads to gets cleaned with q-tips and alcohol. Here are the instructions for reference: https://shop.ekwb.com/EK-IM/EK-IM-3831109869079.pdf


----------



## xer0h0ur

Quote:


> Originally Posted by *gatygun*
> 
> Ah thanks.
> 
> This kinda makes me worried about current freesync monitors, what if they announce crossfire support but you need a dp 1.3 solution instead of a 1.2a solution and push out new kind of screens. Kinda a hard thing to bite.
> 
> Just thinking out loud tho


That is not true. Freesync does not and will not require DP 1.3 to work in crossfire. Nothing exists with a DP 1.3 port and we have no idea when they will finally include it into a monitor or video card. The specification has been locked down for a while now but the scalers don't exist yet.


----------



## BootPirate

Quote:


> Originally Posted by *xer0h0ur*
> 
> AMD recommends having the radiator above the height of the card but honestly the only reason they say that is so that customers don't hear the gurgling sounds or pump noise at startup and complain that something is wrong. So in other words if you have the radiator below the card's height you're likely going to hear noises at startup while the air bubbles are working their way through the pump.


Okay, thanks!
I actually like that sound. Am I the only one?


----------



## xer0h0ur

Quote:


> Originally Posted by *BootPirate*
> 
> Okay, thanks!
> I actually like that sound. Am I the only one?


The way I see it, if your system is silent enough to hear the pump or coil whine at load, then you have a very quiet rig.


----------



## BootPirate

It's nice that you say that, because I have 9 case fans


----------



## xer0h0ur

Yeah I have a ton of fans but they are running at relatively low RPM for the sake of relative silence. I don't even get coil whine on my 295X2 anymore. Used to be light but now its just not there.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> For reference I would suggest downloading the waterblock instructions for an EK block so you can recognize the VRMs, mosfets, PLX chip etc etc. If memory serves me right the vRAM pads are 0.5mm, VRMs are 1.5mm and the PLX chip would be either 1.0mm or 1.5mm. Any surfaces that I am applying new pads to gets cleaned with q-tips and alcohol. Here are the instructions for reference: https://shop.ekwb.com/EK-IM/EK-IM-3831109869079.pdf


Is it worth doing? How much did your temps go down?


----------



## Dagamus NM

So I finally got Dying Light to work right. I went into the BIOS of my RIVE, gave the PCIe lanes more power and current, and tweaked a few other GPU-related settings on the mobo, and now everything plays well at 1440p.

All is right now.


----------



## kayan

Anyone having any luck with Witcher 3 not flickering anymore in x-fire?


----------



## y4h3ll

Quote:


> Originally Posted by *BootPirate*
> 
> Quick question, I saw that it came up earlier and I was interested in the answer but I can't seem to find it again (it was quiet a few pages back).
> Should the radiator/reservoir be above the card, or is it alright bellow?
> 
> Is this okay?
> My current set-up in my case
> 
> 
> Or should it look like this?
> 
> 
> My temps seem fine (70C) But will it affect the longevity of my pump?


Quote:


> Originally Posted by *xer0h0ur*
> 
> AMD recommends having the radiator above the height of the card but honestly the only reason they say that is so that customers don't hear the gurgling sounds or pump noise at startup and complain that something is wrong. So in other words if you have the radiator below the card's height you're likely going to hear noises at startup while the air bubbles are working their way through the pump.


My rad is in the bottom configuration but upside down. Is this safe? Lol.

Is there any danger in running the second configuration but with the rad upside down? Temps are good. Let me know.


----------



## Alex132

Running bubbles through the pump isn't really a good idea, but it should be fine. Just at least have the end tank at the bottom so it can't take in air while it's running.


----------



## Alex132

Does anyone else experience this?


----------



## gatygun

Quote:


> Originally Posted by *Alex132*
> 
> Does anyone else experience this?


Is your GPU or VRM getting too hot? It looks like what I get when I overclock too far, or when things heat up.
Quote:


> Originally Posted by *kayan*
> 
> Anyone having any luck with Witcher 3 not flickering anymore in x-fire?


Disabling AA in-game should fix that, I believe. If that doesn't work, I have no idea beyond that.


----------



## y4h3ll

Quote:


> Originally Posted by *Alex132*
> 
> Running bubbles through the pump isn't really a good idea, but it should be fine. Just at least have the end tank at the bottom so it can't take in air while it's running.


I have the lines at the bottom and the rad is elevated; wouldn't this prevent bubbles?


----------



## y4h3ll

Quote:


> Originally Posted by *Alex132*
> 
> Does anyone else experience this?


I have seen this before! Disable the overclock and RMA the card. This has happened twice in my life due to overvolting my GPUs. I may be wrong, but this will affect your games and continue to get worse: you will start to see lines and purple streaks in games. Even if it stops when you go back to stock, I would still replace the card before it's too late.


----------



## kayan

Quote:


> Originally Posted by *Alex132*
> 
> Does anyone else experience this?


Mine does this off and on, and it isn't overclocked. Usually when I turn vsync or AA on. Sometimes I get mad black horizontal lines on webpages.

Quote:


> Originally Posted by *y4h3ll*
> 
> I have seen this before! Disable overclock amd rma card. This happened two times in my live due to over voltage to my gpu's. I may be wrong but this will effect your games and continue to get worse. You will start to see lines and purple streaks in your games. Even if when you go back to stock I would still replace it before it's to late.


I tried requesting an RMA for my card; support told me it wasn't a problem with the card, but rather a faulty PSU. I can't get the problem to happen consistently, so I can't replicate it on purpose. It hasn't happened in about a month though, *crosses fingers*


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> Does anyone else experience this?


Before you go returning your GPU, check a different cable and maybe a different port (unless you are using dvi).

What connection to your monitor are you using?


----------



## kayan

Quote:


> Originally Posted by *Dagamus NM*
> 
> Before you go returning your GPU, check a different cable and maybe a different port (unless you are using dvi).
> 
> What connection to your monitor are you using?


Mini DP to DP. It has happened on two different mobos, AMD and Intel.


----------



## y4h3ll

Quote:


> Originally Posted by *Dagamus NM*
> 
> Before you go returning your GPU, check a different cable and maybe a different port (unless you are using dvi).
> 
> What connection to your monitor are you using?


He has a point: make sure all the connections on any monitors are clean. As far as the flickering goes, I always RMA, and those dill holes will accept the RMA whether or not there is a problem. REMEMBER: you are doing them a favor, not them doing you a favor.


----------



## Alex132

It's 'alright'. I know the issue. It's not the cable (both DVI and DP give this); it's actually using MSI Afterburner to overclock. It happens at stock clocks too: when I apply an overclock with MSI AB, no matter how small (even if the 'overclock' is stock clocks), it messes up like that. Something to do with the memory incorrectly going in and out of its 2D clocks when idle.

Apparently Trixx avoids this, but it only started happening recently, and it's getting worse and worse. Can't RMA the card, so yeah. I have to deal with not overclocking unless Windows 10 or a new driver fixes it.


----------



## PCModderMike

Quote:


> Originally Posted by *Alex132*
> 
> It's 'alright'. I know the issue. It's not the cable (both DVI and DP give this) it's using MSI Afterburner to overclock actually. That happens at stock, but when I apply an overclock with MSI AB - no matter how small (even if the 'overclock' is stock clocks) it messes up like that. Something to do with the memory incorrectly going in and out of 2D settings when idle.
> 
> Apparently Trixx avoids this - but this only started to happen since recently - and getting worse and worse. Can't RMA the card so yeah. Have to deal with not overclocking unless Windows 10 or a new driver fixes it.


Doubt any new drivers would fix that. From the type of behavior that card is exhibiting, it appears to be physically damaged, probably from overclocking.


----------



## Alex132

Quote:


> Originally Posted by *PCModderMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> It's 'alright'. I know the issue. It's not the cable (both DVI and DP give this) it's using MSI Afterburner to overclock actually. That happens at stock, but when I apply an overclock with MSI AB - no matter how small (even if the 'overclock' is stock clocks) it messes up like that. Something to do with the memory incorrectly going in and out of 2D settings when idle.
> 
> Apparently Trixx avoids this - but this only started to happen since recently - and getting worse and worse. Can't RMA the card so yeah. Have to deal with not overclocking unless Windows 10 or a new driver fixes it.
> 
> 
> 
> Doubt any new drivers would fix that. The type of behavior that card is exhibiting it appears to be physically damaged, probably from overclocking.

Mmm, but it only happens when I change the clocks via MSI AB; if I don't, then it doesn't happen. The clock change could be -200 core and -200 memory and it would still happen.


----------



## Dagamus NM

So what is the issue with using Trixx? I might have to give it a go. I cannot overclock my cards at all. Honestly, until I get my waterblocks I am terrified to even think of overclocking these. I also do not understand the power delivery on these things. Why were they built with only two 8-pin connectors? Can they even get enough power to bother attempting to overclock them? After seeing what some others have been through, I am happy that mine work at all.


----------



## NBrock

Mine works fine and I am able to overclock quite a bit after three pretty simple mods:
I got a fan cable adapter so I can plug the VRM fan into the motherboard and control its speed.
Since I had the cover off, I removed the pumps/waterblocks and put better TIM on the GPUs.
I put two Corsair SP High Performance series (not the Quiet Edition) fans in push/pull.

I run the VRM fan around 55% 24/7, and 100% when benching and trying higher clocks.

With those three mods, temps dropped quite a bit. My load temps with both GPUs going are a steady 70°C @ 1100 core and 1300 mem with no throttling. Without the overclock, temps never went above 64°C after the "mods".


----------



## elgreco14

Good score for this OC? http://www.3dmark.com/3dm/7383619

Btw, is a 1100/1350 and +13 mV OC safe to use 24/7? Core temp stays 67-68°C max.
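For context on how big that OC actually is: assuming the 295X2's reference clocks of 1018 MHz core and 1250 MHz memory (worth double-checking against your own card's stock readings), 1100/1350 works out to roughly an 8% bump on both:

```python
STOCK_CORE_MHZ = 1018  # 295X2 reference "up to" core clock (assumption)
STOCK_MEM_MHZ = 1250   # 1250 MHz GDDR5, 5.0 Gbps effective (assumption)

def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a percentage over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_percent(STOCK_CORE_MHZ, 1100), 1))  # ~8.1 (core)
print(round(oc_percent(STOCK_MEM_MHZ, 1350), 1))   # 8.0  (memory)
```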


----------



## Mega Man

Quote:


> Originally Posted by *Alex132*
> 
> Does anyone else experience this?


no i have not, but imo that is not a driver issue, although you seem to think it is, or you just like to blame it on amd, as you wrote "amd drivers ftw" as the title.

how do you know it isn't an issue with AB? because that is what it sounds like.

is AB updated? are your drivers up to date? if either is not, then you either need to upgrade both to the newest, or downgrade the newer one to match when the other was released
Quote:


> Originally Posted by *y4h3ll*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> Before you go returning your GPU, check a different cable and maybe a different port (unless you are using dvi).
> 
> What connection to your monitor are you using?
> 
> 
> 
> He has a point, make sure all connections of any monitors are clean. As far as the flickering I always rma and thoes dill hopes will accept the rma weather or not there is a problem. REMEMBER you are doing them a favor, Not them doing you a favor.

no, you're not doing anyone a favor. rmas only raise the cost of the product for everyone; it hurts all buyers (the company does not pay for rmas, the price you pay for a video card already has the rma % built in).

the more rmas, the more next-gen pricing goes up. people all seem to think theft (not saying this is theft, or that it isn't) or other things don't hurt anyone, but they do, and i am sick of this attitude


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Does anyone else experience this?
> 
> 
> 
> 
> 
> 
> 
> 
> no i have not, but imo that is not a driver issue, although you seem to think it is or you just like to blame it on amd, as you wrote " amd drivers ftw " as the title,
> 
> how do you know it isnt an issue with AB? which is what it sounds like,
> 
> is ab updated? are your drivers up to date ? if either are not then you either need to upgrade both to newest, or downgrade newest to match the time the other was released

Title was a joke-reference to something a friend and I say all the time








The AMD drivers aren't bad, they just could be better. My crashes/BSoDs due to drivers have been at an all-time high with AMD though.

I think I last counted 57 software BSoDs in 3 months with my 295X2, compared to 1 in 3 years with my 690. And those are BSoDs, not just driver freezes (I can't count those with Windows).

I do think it's both? MSI AB never did this before on this scale, and I don't recall seeing it with 15.4. I had seen it happen once or twice maybe, but it's getting worse and worse. Today I lost signal from my GPU twice randomly, once in game and once at the desktop, but no weird glitches like in that video since I haven't touched MSI AB. So that does seem to be the cause.

NOTE: I don't overclock for anything but benchmarks, and I haven't run benchmarks on this card in 3+ weeks. On 15.5 drivers, MSI AB 4.1.1, latest updated Windows 7 x64.

I really, really, really, really hope this graphics card isn't dying.


----------



## Dagamus NM

Damn, I just typed a long response then checked another site to validate something I was saying and everything disappeared.

So I was so excited to get my x2 blocks in today that I was singing an ek-fc r9 295x2 waterblock song. It was just saying what the text on the box said to the beat of a Die Antwoord song. This block is absolutely gorgeous, probably my second favorite looking block that I have ever had second to the thermospheres I have (useless but gorgeous). Probably only the fury X block looks nicer.

Anyhow, acetal and nickel with an acetal triple link with the center section blocked off so that I can use both pcie 3.0x16 lanes on my motherboard.

Now I have a dilemma. I am not sure which setup I want to match up to which GPUs.

The first setup is a RIVE with copper/acetal blocks (the LE version, so only acetal is visible when installed), an Intel 3930K with an EK Supremacy EVO Elite nickel block, 64GB of G.Skill 2133MHz RAM with EK Monarch blocks in nickel/acetal, and two 480GB Intel 730 series SSDs in RAID 0.

The second setup is a Gigabyte UD7 990FX with a 9590 and an EK nickel/acetal block. (The only UD7 blocks I can find are in FCPU's inventory, so they basically don't exist.) This board has 32GB of PNY 2133MHz RAM with a Monarch waterblock in nickel/acetal and two 480GB Intel 730 series SSDs in RAID 0.

The GPUs to match up are the two R9 295X2s with the waterblocks mentioned above and four MSI 280X 6GB cards with EK VGA Supremacy HWBot blocks.

Any advice on which GPUs you would put with which build, and why?


----------



## Mega Man

Most will say Intel, since of course that's the only one that can fully use them; the AMD will bottleneck.

I, on the other hand, will say: whichever rig you want.

I much prefer my AMD.

And I will be retubing and using stock OEM cables until I get the time to finish sleeving and making hard tubes, as seriously, I am about to throw the 3930K out the door.

It does fine in games and other things,

but in Windows, dang, it is slow. I feel like I am on a Pentium D.

Before anyone asks: I pass Prime with flying colors, and ANY other stress test (4.8GHz, 2400 CL10 quad channel). Nearly anyone who has used both an Intel (pick one - any one) and an AMD (Piledriver, reasonably clocked and stable) will say that the AMD is far faster in Windows - aka it "feels" faster.

I have decided that I will be making a 290X quadfire build (as I have four sitting on the shelf) with one of my 83xx chips.


----------



## DividebyZERO

Quote:


> Originally Posted by *Mega Man*
> 
> most will say intel, as of course that is the only one who can use it-- the amd will bottle neck
> 
> i on the other hand will say- which ever rig you want
> 
> i much prefer my AMD
> 
> and i will be retubing and using stock OEM cables till i get the time to finish sleeving and making hard tubes as seriously i am about to throw the 3930k out the door.
> 
> it does fine in games and other things,
> 
> but in windows, dang it is slow. i feel like i am on a Pent D
> 
> before anyone asks, i pass prime with flying colors and ANY OTHER stress ( 4.8ghz-2400cl10 quad channel ) . near anyone with a intel ( pick one - anyone ) and anyone with a amd ( pile driver reasonably clocked and stable ) , will say that the amd is far faster in windows --- aka it "feels" faster
> 
> i have decided that i will be making a 290x quadfire ( as i have 4 sitting on the shelf ) with one of my 83xx


My 9590 was faster for me in Windows as well, up until I got this SR2. To me they are close now, but the 9590 still seems to have given me the best Windows experience (I've used up to X79).


----------



## BootPirate

Quote:


> Originally Posted by *Alex132*
> 
> Title was a joke-reference to something a friend and I say all the time
> 
> 
> 
> 
> 
> 
> 
> 
> The AMD drivers aren't bad, they just could be better. My crashes/BSoDs due to drivers has been at an all time high with AMD though.
> 
> I think I last counted 57 software BSoDs. in 3 months with my 295X2 compared to 1 in 3 years with my 690. And those are BSoDs - not just driver freezes. (I can't count those with Windows).
> 
> I do think it's both? MSI AB never did this before on this scale. And I don't recall seeing it with 15.4. I had seen it happen once or twice maybe, but it's getting worse and worse. Today I lost signal from my GPU twice randomly - once in game and once at desktop - but no weird glitches like in that video since I haven't touched MSI AB since. So that does seem to be the cause.
> 
> NOTE: I don't overclock for anything but benchmarks. And I haven't run benchmarks on this card in 3+ weeks. On 15.5 drivers, 4.1.1 MSI AB, latest updated W7 x64.
> 
> I really, really, really, really hope this graphics card isn't dying.


I get some serious freezing when I play games. I too have only really experienced it with AMD.
I think I had a few BSoDs when I first got the card, but I haven't for a long time now (not since updating the drivers).


----------



## Dagamus NM

So gaming is cool, but any configuration should yield good performance in games. My primary focus is Adobe apps and a few radiation effective dose calculators.

Pros for using the X2s with Intel: PCIe 3.0 and 12 logical cores. Also, I will have 8 PCIe lanes remaining, so I could add Intel 750s. Probably a waste on this build as it is older. Honestly this rig is almost relegated to gaming, as my 4930K build and two 5960X builds should do the work. Well, the 4930K build is whatever.

Pros for using the X2s with the AMD setup: its 32 PCIe lanes split nicely between two physical cards, and it frees the Intel board's 40 lanes for the four 280Xs.

Pros for using the 280Xs with Intel: again the PCIe lanes - I would have x16/x8/x8/x8 vs. x8/x8/x8/x8 (and on PCIe 2.0 to boot) - and the nickel HWBot blocks match the CPU block.

Pros for using the 280Xs with AMD: it is already assembled. The loop hasn't been filled; my air pressure gauge and adapters come Thursday so that I can pressure test it. This frees the 295X2s to be used in the Intel build.

So this is my dilemma.

Sorry you are not happy with your 3930K, Mega; mine has been making me happy for a couple of years now. My 4930K just doesn't have the same traits as the 3930K. I hope my 5960Xs make me not care about any of these older setups, but I need a few more parts before I fire the first one up. Radiator screws are holding up my project; M3-0.5x40mm are hard to find, lol. I have 100 coming Thursday through Fastenal. Aqua Computer Airplex rads are hard to find screws for.


----------



## Mega Man

Don't get me wrong, it is fine; I just prefer AMD for day to day. Both rigs have their strong suits (as both Intel and AMD do), and I use both actively.

But recently (since a Windows update), Windows on this rig has been super sluggish vs. the AMD, which just does it.


----------



## ramos29

I purchased a Samsung 28" 4K monitor.

The box does not contain a DisplayPort cable, and since AMD was smart enough to design a GPU without an HDMI output, I am forced to use a DVI-to-HDMI cable.
The screen is recognized as a Full HD monitor.
Can DVI not transfer 4K resolution?
I am struggling to find a DisplayPort cable here.
For now I am using dynamic super resolution: is this option delivering a true 4K experience on my screen or not?


----------



## kayan

Quote:


> Originally Posted by *ramos29*
> 
> I purchased a Samsung 28" 4K monitor.
> 
> The box does not contain a DisplayPort cable, and since AMD was smart enough to design a GPU without an HDMI output, I am forced to use a DVI-to-HDMI cable.
> The screen is recognized as a Full HD monitor.
> Can DVI not transfer 4K resolution?
> I am struggling to find a DisplayPort cable here.
> For now I am using dynamic super resolution: is this option delivering a true 4K experience on my screen or not?


When I bought a UD590 from Best Buy earlier this year, there was no DP cable in the box; there were HDMI and DVI cables, I believe. DVI can't do 4K at 60Hz. I ended up returning the monitor.


----------



## ramos29

i am thinking the same : return the monitor


----------



## xer0h0ur

DVI can't do 4K at 60Hz (dual-link DVI is borderline for 4K @ 30Hz on paper, but monitors generally don't accept it over DVI). HDMI 1.4 will do 4K @ 30Hz, HDMI 2.0 will do 4K @ 60Hz but no AMD cards have HDMI 2.0, and DisplayPort 1.2a will do 4K @ 60Hz.
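The cutoff comes down to raw link bandwidth. Here's a back-of-the-envelope sketch (blanking overhead flattened to a ~20% factor and link rates rounded, so treat these as ballpark numbers rather than exact CVT timings):

```python
# Rough bandwidth check: which display links can carry 4K at 30/60 Hz?
# Blanking overhead is approximated as a flat 20%; real timings vary.

def required_gbps(width, height, hz, bpp=24, blanking=1.2):
    """Approximate bandwidth a video mode needs, in Gbit/s."""
    return width * height * hz * bpp * blanking / 1e9

# Approximate effective link capacities at 24-bit color, in Gbit/s.
links = {
    "DVI dual-link":   7.92,   # 2 x 165 MHz pixel clock x 24 bpp
    "HDMI 1.4":        8.16,   # 340 MHz TMDS x 24 bpp
    "HDMI 2.0":       14.40,   # 600 MHz TMDS x 24 bpp
    "DisplayPort 1.2": 17.28,  # 4 lanes x 5.4 Gbit/s, 8b/10b encoded
}

for hz in (30, 60):
    need = required_gbps(3840, 2160, hz)
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbit/s")
    for name, cap in links.items():
        verdict = "ok" if cap >= need else "too slow"
        print(f"  {name:15} {cap:5.2f} Gbit/s: {verdict}")
```

By this math 4K @ 60 Hz needs roughly 14 Gbit/s, which is why only HDMI 2.0 and DisplayPort 1.2 clear the bar, while DVI and HDMI 1.4 top out around 4K @ 30 Hz.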


----------



## ramos29

ok thx for your reply


----------



## wermad

You're not gonna get a DP cable. It's rare for a monitor to come with one. Get on eBay and buy one. I have a bunch of them here, even some 10' cables (10' @ 30Hz), though I only have DP to DP. You need a mini DP to DP adapter. The ones MSI included with my Lightning 7970s are the dongle kind and not the adapter, which seems flimsy. Newegg has a StarTech mini DP to DP cable for ~$10-15. I'm using that one and have had no issues at all. Still running 60Hz on my Sammy.


----------



## Alex132

You could just get a mDP to DP cable - that's what I did. It was $7 for a 2m cable









So dumb that monitors don't come with DP cables. Like, IT'S A GREAT STANDARD. USE IT, PEOPLE.


----------



## ramos29

Quote:


> Originally Posted by *Alex132*
> 
> You could just get a mDP to DP cable - that's what I did. It was $7 for a 2m cable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So dumb that monitors don't come with DP cables. Like ITS A GREAT STANDARD. USE IT PEOPLE.


I have an mDP adapter, so I'm looking for a DP-to-DP cable.

Quote:


> Originally Posted by *wermad*
> 
> You're not gonna get a dp cable. Its rare for a monitor to come w/ one. Get on ebay and buy one. I have a bunch of them here. Even some 10' cables (10' @ 30hz). Though, I only have dp to dp. You need a mini dp to dp adapter. The ones MSI included in my Lighting 7970s are the dongle kind and not the adapter which seems flimsy. Newegg has a startech mini dp to dp cable for ~$10-15. I'm using this one and had no issues at all. Still running 60hz on my Sammy.


I arranged to have a DP cable brought over to my house this Saturday. I will keep using my TV as a monitor until I receive my cable. Thanks to you all, bro.


----------



## wermad

Here's the one i bought,

http://www.newegg.com/Product/Product.aspx?Item=N82E16812200602

It's actually cheaper now.









Don't get the 10'. Though it works at 1080/1200 in 60Hz, at 4K it has a tendency to either drop to 30Hz or drop the signal altogether. Stick with 6' (2m) or less.


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> You're not gonna get a dp cable. Its rare for a monitor to come w/ one. Get on ebay and buy one. I have a bunch of them here. Even some 10' cables (10' @ 30hz). Though, I only have dp to dp. You need a mini dp to dp adapter. The ones MSI included in my Lighting 7970s are the dongle kind and not the adapter which seems flimsy. Newegg has a startech mini dp to dp cable for ~$10-15. I'm using this one and had no issues at all. Still running 60hz on my Sammy.


I got one with mine


----------



## wermad

Holy schnitzel, is that a curved 4K monitor? For the amount those go for, I would expect a free DP cable!


----------



## xer0h0ur

My monitor came with DVI to DVI, HDMI to HDMI and Full DP to Full DP cables when I ordered it from Amazon. I had to buy a cablematters miniDP to Full DP cable on Amazon. Then I found out that even while tri-fired I could have just used the Full DP to Full DP cable and connected to the 290X instead of to the 295X2. Even though the 290X is the secondary card. Whoops. At least I learned something out of it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> My monitor came with DVI to DVI, HDMI to HDMI and Full DP to Full DP cables when I ordered it from Amazon. I had to buy a cablematters miniDP to Full DP cable on Amazon. Then I found out that even while tri-fired I could have just used the Full DP to Full DP cable and connected to the 290X instead of to the 295X2. Even though the 290X is the secondary card. Whoops. At least I learned something out of it.


Yeah, the primary card is the one you connect to your monitor.

Only found that one out a while ago myself... pity we can't use multiple inputs from multiple cards though.


----------



## xer0h0ur

Well, that is the thing actually: I am able to connect to either card. It doesn't matter which of the two is the primary. I had just assumed before that the primary was the one I needed to connect the monitor to. As for using multiple outputs on multiple video cards, you can do that, just not for gaming purposes - you would need to disable CrossFire. That is ultimately how we did it for my cousin's 6-monitor trading station. It was originally a 290X and a 270 for just trading; then I gave him the gamer itch and he swapped the 270 for another 290X. So when he is gaming he enables CrossFire and uses only 3 of the 6 1080p screens, and when trading he disables CrossFire and uses the 6 screens as a video wall.


----------



## wermad

Think he means running eyefinity off multiple cards like Nvidia does.


----------



## xer0h0ur

Eyefinity can just be a group of displays that is never used for gaming.


----------



## wermad

Amd:



Nvidia:


----------



## TooManyAlpacas

Quote:


> Originally Posted by *wermad*
> 
> Amd:
> 
> 
> 
> Nvidia:


This is very accurate; my R9 295X2 just came with 4 mini DP to HDMI adapters.


----------



## wermad

Those are actually display ports







That's the mythical 5870 Eyefinity edition. It's from when AMD started touting 3x2 Eyefinity. Eventually, with DisplayPort 1.2 and MST hubs, you no longer needed a ton of DP outputs.

But yes, my cards also came with mini DP to HDMI. I was very surprised, since my 7970s (Lightnings) came with mini DP to DP.


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> Holly schnitzel, is that a curved 4k monitor? For the amount those go, I would expect free dp cable!


It's not 4K, but I love this monitor; it's gorgeous. I suspect Samsung will release a 4K version soon, and of course I'll buy it.


----------



## wermad




----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Think he means running eyefinity off multiple cards like Nvidia does.


Yep
Quote:


> Originally Posted by *wermad*


Wait....I can download a 4k monitor?

Thats better than downloading more RAM!


----------



## wermad

I'm about 50% on my download. It's going over to the 3D printer in a bit.


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> I got one with mine


I have the LG 34" Curved Ultrawide. It's great, except when games don't support it. Mine came with a DP cable too.

If you have issue with resolution SAFX check this out: https://www.flawlesswidescreen.org


----------



## rycust

Does anyone have problems with black screening/driver crashing in The Witcher 2 with a 295X2? My 295X2 is at stock clocks and it does this after 10-30 minutes of gameplay with max settings at 1440p excluding ubersampling.

I also ran Heaven 3 times, played TERA for 2 hours, DiRT RALLY for an hour, Metro LL Redux with 4x SSAA for 40 minutes, and Sniper Elite with 4x SSAA with no problems so I'm not sure what's up. All these games are just as demanding too with up to 100% load on both GPUs. Oh and I'm on 15.6 Beta and all my temperatures in my system are fine.


----------



## xer0h0ur

Quote:


> Originally Posted by *rycust*
> 
> Does anyone have problems with black screening/driver crashing in The Witcher 2 with a 295X2? My 295X2 is at stock clocks and it does this after 10-30 minutes of gameplay with max settings at 1440p excluding ubersampling.
> 
> I also ran Heaven 3 times, played TERA for 2 hours, DiRT RALLY for an hour, Metro LL Redux with 4x SSAA for 40 minutes, and Sniper Elite with 4x SSAA with no problems so I'm not sure what's up. All these games are just as demanding too with up to 100% load on both GPUs. Oh and I'm on 15.6 Beta and all my temperatures in my system are fine.


I don't know if your system is affected like mine has been in the recent past but Afterburner alone has caused more headaches for me than it seems to be worth. The OSD alone causes multiple games to crash. If you use Afterburner I suggest trying to play without it running to see if you get the same result.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rycust*
> 
> Does anyone have problems with black screening/driver crashing in The Witcher 2 with a 295X2? My 295X2 is at stock clocks and it does this after 10-30 minutes of gameplay with max settings at 1440p excluding ubersampling.
> 
> I also ran Heaven 3 times, played TERA for 2 hours, DiRT RALLY for an hour, Metro LL Redux with 4x SSAA for 40 minutes, and Sniper Elite with 4x SSAA with no problems so I'm not sure what's up. All these games are just as demanding too with up to 100% load on both GPUs. Oh and I'm on 15.6 Beta and all my temperatures in my system are fine.
> 
> 
> 
> I don't know if your system is affected like mine has been in the recent past but Afterburner alone has caused more headaches for me than it seems to be worth. The OSD alone causes multiple games to crash. If you use Afterburner I suggest trying to play without it running to see if you get the same result.
Click to expand...

I run Afterburner all the time, but I never install RivaTuner. I use HWiNFO64 on a side monitor, and I have Afterburner's app on my G19's LCD at all times for temps, GPU load, clock speeds, and VRAM, system RAM, and pagefile usage.

Never had many issues running things like that.


----------



## fat4l

Guys... I'm still tempted to go over to the Green Team.








SLI 980 Ti, water-cooled... + Asus Swift + WORKING G-Sync in SLI...


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> guys......im still tempted to go over to the Green Team
> 
> 
> 
> 
> 
> 
> 
> 
> SLI 980 Ti w/cooled... + Asus Swift + WORKING G-SYNC in SLI .......




__
https://www.reddit.com/r/3b47rs/freesync_crossfire_test_r9_295x2/

I think the next driver will have Freesync working with Crossfire


----------



## xer0h0ur

No one is stopping you from going green.


----------



## rycust

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know if your system is affected like mine has been in the recent past but Afterburner alone has caused more headaches for me than it seems to be worth. The OSD alone causes multiple games to crash. If you use Afterburner I suggest trying to play without it running to see if you get the same result.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> I run Afterburner all the time but i never install Rivatuner, i either use HWiNFO64 on a side monitor and i have afterburner's app on my G19's LCD at all times for temps, GPU load, clock speeds and vram, system ram and pagefile usage.
> 
> never had many issues running things like that


Yup, looks like it was related to RTSS. I kept MSI Afterburner on but uninstalled RTSS, and it doesn't crash now. I had the OSD disabled, but I put the Hardware Monitor on my second monitor to watch my stats, since I didn't like having text over my games. I think RTSS is required to monitor FPS in the Hardware Monitor though (I don't see FPS on the list after uninstalling it, but other things such as temps and clocks are still there), so I might just reinstall it. The crash only happens in The Witcher 2, so I can just exit RTSS when I want to play that. Thanks for the help, guys!


----------



## Sgt Bilko

Quote:


> Originally Posted by *rycust*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I don't know if your system is affected like mine has been in the recent past but Afterburner alone has caused more headaches for me than it seems to be worth. The OSD alone causes multiple games to crash. If you use Afterburner I suggest trying to play without it running to see if you get the same result.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I run Afterburner all the time but i never install Rivatuner, i either use HWiNFO64 on a side monitor and i have afterburner's app on my G19's LCD at all times for temps, GPU load, clock speeds and vram, system ram and pagefile usage.
> 
> never had many issues running things like that
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Yup looks like it was related to RTSS, I kept MSI Afterburner on but uninstalled RTSS and it doesn't crash now. I had OSD disabled but I put the Hardware Monitor on my second monitor to monitor my stuff cause I didn't like having text on my games. I think RTSS is required to monitor fps on the Hardware Monitor though (don't see fps on the list after uninstalling it but other things such as temps, clocks, are still there) so I think I might just reinstall it. The crash only happens in Witcher 2 so I can just exit it if I wanna play. Thanks for the help guys!
Click to expand...

Glad it worked for you









For FPS I just use Fraps; HWiNFO picks up on that, and I think HWMonitor does as well... not too sure.


----------



## joeh4384

Any rumors on the unified driver for 200/300 series? I hope one comes out quick as Frame limiter and crossfire freesync would be nice to have.


----------






## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> But yes, my cards also came w/ mini dp to hdmi. I was very surprised since my 7970s (lightnings) came with mini dp to dp.


Yep, I still have many of those mini DP to DP connectors from my 7970/7950s. I never used them and probably never will. StarTech has the 6' mini DP to DP cables, about $10-15 depending on whether you go to Newegg, Tiger, or Amazon. I personally am not a big fan of adapters unless they are active. I say this as I am currently using the DP to HDMI dongle with the X2s, but it is only temporary.


----------



## SAFX

Quote:


> Originally Posted by *kayan*
> 
> I have the LG 34" Curved Ultrawide. It's great, except when games don't support it. Mine came with a DP cable too.
> 
> If you have issue with resolution SAFX check this out: https://www.flawlesswidescreen.org


Did you find yourself using that? I've had pretty good success with my monitor, except for the occasional issue when it fails to wake up; I have to remove and reattach the power cord to get a picture.


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> Did you find yourself using that? I've had pretty good success with my monitor, except for the occasional issue when it fails to wake up, have to remove then attach power cord to get a picture.


I haven't had an issue with it not waking up, but yes, I've used the program for the games stated in my previous post. I have a backlog of... well, let's just say Steam sales are a bane to my wallet, so I've been playing some older games lately. It works well for what it supports so far. Check out the supported list on the website I linked, and then you can decide if you need it. It's made Skyrim and Black Ops, which were completely unplayable due to UI issues, playable in fullscreen at the full resolution of my monitor (same as yours).


----------



## SAFX

Anyone have success changing the fan on the card (not the rad fan) for lower temps?


----------



## joeh4384

Quote:


> Originally Posted by *SAFX*
> 
> Anyone have success changing the fan on the card (not the rad fan) for lower temps?


I lowered temps into the 60s by switching from the stock fan to a pair of Corsair SP120 High Performance fans.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Anyone have success changing the fan on the card (not the rad fan) for lower temps?


Huge success.

I have 2 Cougar Vortex fans in there, and the maximum temperature I have ever seen is 65°C on both GPUs at 1200/1500, +100mV, +50% power limit.
At stock clocks I get around ~52-55°C.

I have the fans attached to a motherboard header, with a fan curve set up in SpeedFan according to the GPU temperatures.

Best part? They don't go beyond 800rpm when gaming with a single GPU, and with dual GPUs they're around 1050rpm.
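For anyone replicating that SpeedFan setup in software, the curve itself is just linear interpolation between temperature/duty breakpoints. A minimal sketch (the breakpoints below are made up for illustration, not anyone's actual settings):

```python
# Minimal GPU fan-curve sketch: map a temperature (°C) to fan duty (%)
# by linear interpolation between breakpoints, clamped at both ends.
# Breakpoints are illustrative only.

CURVE = [(40, 20), (55, 35), (65, 60), (75, 100)]  # (temp_C, duty_%)

def fan_duty(temp_c, curve=CURVE):
    """Return fan duty in percent for the given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: full speed
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linearly interpolate between the surrounding breakpoints.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(35))  # -> 20 (idle floor)
print(fan_duty(60))  # -> 47.5 (halfway between the 55 and 65 points)
print(fan_duty(80))  # -> 100 (maxed)
```

A real loop would poll the GPU sensor every second or two and write the result to the fan header's PWM control, which is essentially what SpeedFan does behind the scenes.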


----------



## SAFX

Quote:


> Originally Posted by *joeh4384*
> 
> I lowered temps to 60s by switching from stock fan to a pair of corsair sp120s high performance.


Umm, what?







How did you fit two SP120s to replace the fan on the card?
Quote:


> Originally Posted by *Alex132*
> 
> Huge success.
> 
> I have 2 Cougar Vortex fans in there, and my maximum temperature I have ever seen is 65'c on both GPUs with 1200/1500 +100mV +50%.
> While stock I get around ~52-55'c.
> 
> I have the fans attached to my motherboard header, and then a fan-curve set up in speedfan acording to the GPU temperatures.
> 
> Best part? They don't go beyond 800rpm when gaming with a single GPU. And dual GPUs they're around 1050rpm.


Got any photos? I was talking about the red LED fan on the card


----------



## NBrock

Quote:


> Originally Posted by *SAFX*
> 
> Anyone have success changing the fan on the card (not the rad fan) for lower temps?


I did the "fan mod": I got an adapter so I could plug the fan into my motherboard and control its speed. It is set at 55% for my daily stuff and gaming; for benching I just crank it up to 100%.
I did that and swapped the TIM on the GPUs, and it made a big difference. Currently running 1100 core and 1300 mem @ 68°C. I can do 1200 for benching with the fan speed all the way up.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Got any photos? I was talking about the red LED fan on the card


It's 4:30am and I can't read









I would love to do this mod too; however, you'd need a very, very specific fan size and type.

You'd end up just buying the same fan again, really...


----------



## SAFX

Quote:


> Originally Posted by *NBrock*
> 
> I did the "fan mod" I got an adapter so I could plug it into my motherboard and control the fan speed. It is set at 55% for my daily stuff and gaming. For benching I just crank it up to 100%.
> I did that and swapped the TIM on the gpus and it made a big difference. currently running 1100 core and 1300 mem @ 68* I can do 1200 for benching with fan speed up all the way.


I used Arctic MX-4 on my GPUs last week; it reduced temps by about 1°C. I was expecting more because I did the same on my CPU last month, and the results were amazing:


----------



## SAFX

Quote:


> Originally Posted by *joeh4384*
> 
> I lowered temps to 60s by switching from stock fan to a pair of corsair sp120s high performance.


I'm assuming that's push/pull? What are you using for fan control?

I tried a single SP120 PE on the rad too, but the noise was too much. I switched to Gentle Typhoon AP-15s; max temp 66°C @ 100% load for 15 minutes, good enough for me.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> It's 4:30am and I can't read
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I would love to do this mod too, however you'd need a very, very specific fan-size and type and etc.
> 
> You'd end up just buying the same fan again really...


All good!







I figured so


----------



## Alex132

The stock goop on the 295X2s is probably Shin-Etsu, good stuff. And it isn't applied horribly either.

I also saw basically no difference applying MX-4.


----------



## NBrock

I used PK-1


----------



## SAFX

Quote:


> Originally Posted by *NBrock*
> 
> I used PK-1


Any improvement over stock paste?


----------



## Alex132

PK-1 is about the same as MX-4.


----------



## fat4l

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> __
> https://www.reddit.com/r/3b47rs/freesync_crossfire_test_r9_295x2/
> 
> I think the next driver will have Freesync working with Crossfire


This is very interesting. Has anyone tried it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 
> __
> https://www.reddit.com/r/3b47rs/freesync_crossfire_test_r9_295x2/
> 
> I think the next driver will have Freesync working with Crossfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this is very interesting. has anyone tried it ?
Click to expand...

If I had a Freesync monitor I would have done it by now, but sadly I don't know of anyone else besides him that has done it.

Either way, I'd wait till the next driver release (it shouldn't be far away) and see what's in it.


----------



## SAFX

Looking for specs for water block and mounting hole dimensions, anyone?


----------



## kayan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If i had a Freesync monitor i would have done it by now but sadly i don't know of anyone else besides him that has done it.
> 
> Either way I'd wait till the next driver release (shouldn't be far away) and see what's in it


I doubt we'll see any new drivers for any cards between now and the launch of W10. With how much we keep hearing about AMD's future success being tied to W10, I doubt they (the driver team, that is) are working on anything else.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> If i had a Freesync monitor i would have done it by now but sadly i don't know of anyone else besides him that has done it.
> 
> Either way I'd wait till the next driver release (shouldn't be far away) and see what's in it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I doubt we'll see any new drivers for any cards between now and the launch of W10. With how much we keep hearing on AMD's future successes being tied to W10, I doubt they're (the driver team that is)working on anything else.
Click to expand...

Just because Win 10 is launching doesn't mean there won't be a driver before it that is certified for it.


----------



## kayan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> just because Win 10 is launching doesn't mean there won't be a driver before it that is certified for it.


True, true, I guess I wasn't clear. I meant that I doubt we will see anything that isn't for W10.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> just because Win 10 is launching doesn't mean there won't be a driver before it that is certified for it.
> 
> 
> 
> True, true, I guess I wasn't clear. I meant that I doubt that we will see anything that isn't for w10.
Click to expand...

I can't remember when (or even if) AMD has released a driver update for only one OS... if a new driver comes out, it'll be for more than Win 10.


----------



## SAFX

Has anyone tried modding with separate AIO loops and rads? For instance, replacing the Asetek system with two Corsair H50s, one for each GPU?

Here's another option: a bigger rad, all AIO, maybe leading to lower temps? (Just not sure if it's compatible.)

http://asetek.com/press-room/news/2010/new-asetek-liquid-cpu-coolers-support-pcs-with-92mm-fans.aspx


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> Has anyone tried modding with separate AIO loops and rads? For instance, replacing the Asetek system with 2 Corsair H50's, one for each gpu?
> 
> Here's another option, bigger rad, AIO, lower temps? (just not sure if it's compatible)


I have done this before, but these AIO blocks' mounting is different from other CPU AIOs'. I guess you could zip-tie them on.

Also, the cold plate has a raised part that fits the shape of the GPU die.

295x2 AIO vs H90


I ended up getting someone else's 295X2 stock cooler to perform this mod (with the 2 pumps hanging outside).



If you really want to put on two H50s, you need to unscrew the coldplate/mounting ring, which will cause some of the coolant to spill out; that may cause air-bubble noises in some rad orientations.


----------



## SAFX

Quote:


> Originally Posted by *cennis*
> 
> I have done this before, but these AIO blocks' mounting is different than the other CPU AIOs. I guess you could zip tie them on.
> 
> Also the cold plate has a raised part that fits the shape of the GPU die.
> 
> 295x2 AIO vs H90
> 
> 
> I ended up getting someone elses 295x2 stock cooler to use perform this mod. (with 2 pumps hanging outside)
> 
> 
> 
> If you really want to put 2 x H50s, you need to unscrew the coldplate/mounting ring which will cause some of the coolant to spill out, which may cause air bubble noises in some rad orientations,


AWESOME!

I was thinking about using the block/rad from another 295x2 as well.

Was it worth the trouble?
How were the temps?
How did you manage the power connectors for the blocks?


----------



## SAFX

Quote:


> Originally Posted by *cennis*
> 
> I have done this before, but these AIO blocks' mounting is different than the other CPU AIOs. I guess you could zip tie them on.
> 
> Also the cold plate has a raised part that fits the shape of the GPU die.
> 
> 295x2 AIO vs H90


Forgot to ask, so the retention ring on the H90 is not the same size? Technically, any low-profile block with a compatible retention ring should work, right?


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> AWESOME!
> 
> I was thinking about using the block/rad from another 295x2 as well.
> 
> Was it worth the trouble?
> How were the temps?
> How did you manage the power connectors for the blocks?


Quote:


> Originally Posted by *SAFX*
> 
> Forgot to ask, so the retention ring on the H90 is not the same size? Technically, any low-profile block with a compatible retention ring should work, right?


If you look at the comparison, the retention ring is screwed to the bottom, instead of the regular ones that "hook" on to the sides. Thus the height of this ring is specifically designed for the copper cold plate with the raised area for the GPU die; I think a regular flat cold plate will not even make contact with the die if you use the 295x2's retention ring.

EDIT: CPU retention rings do not work since the holes will not match the 295x2 holes

And switching the copper cold plate causes loss of coolant, and these things are hard to refill. The kit will likely survive this transition, but there may be extra air in the system.

Using another 295x2 set costs money, and it is not very elegant due to the 2 extra pumps hanging out. The pumps can be powered by any 3-pin fan header; they are not variable speed to my knowledge anyway.

Temps stabilized around 60°C at full load, but the problem with these cards is VRM overheat/throttling. I can't find the photos, but I made a copper piece that let me mod one of the extra pumps onto the VRMs. So if you can imagine, it is 3 pumps in total across the card. I didn't do much testing, but it seemed to work okay.

I have already moved on to a custom block since it's much more elegant.


----------



## SAFX

Quote:


> Originally Posted by *cennis*
> 
> If you look at the comparison , the retention ring is screwed to the bottom, instead of the regular ones that "hook" on to the sides. Thus the height of this ring is specifically designed for the copper cold plate with the raised area for the gpu die, i think the regular flat cold plate will not even make contact with the die if you use 295x2's retention ring.
> 
> EDIT: CPU retention rings do not work since the holes will not match the 295x2 holes
> 
> And switching the copper cold plate causes loss coolant and these things are hard to fill. The kit will likely survive this transition, but there may be extra air in the system.
> 
> Using another 295x2 set costs money and it is not very elegant due to 2 extra pumps hanging out. The pumps can be powered by any 3pin fan header, they are not variable speed to my knowledge anyway.
> 
> Temps stabilized around 60c full load but the problem with these cards is the VRM overheat/throttling. I cant find the photos but I made a copper piece that let me mod one of the extra pumps onto the VRMs. So if you can imagine it is 3 pumps in total across the card. I didn't do much testing but it seemed to work okay.
> 
> I have already moved on to a custom block since its much more elegant.


Jesus Christ, 3 pumps, lol. I'd pay anything to have seen that!

Screw it, you just convinced me to go custom loop.....XSPC, Raystorm, Photon light tube, here I come BABY!


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> Jesus Christ, 3 pumps, lol. I'd pay anything to have seen that!
> 
> Screw it, you just convinced me to go custom loop.....XSPC, Raystorm, Photon light tube, here I come BABY!


I still have all the parts, I can do a mock up later and take some pics.

Good choice, but I'm not too sure XSPC GPU blocks are good for VRM cooling; it seems like Aquacomputer (which I have) is good for VRMs.
If you plan to overclock, pick wisely.


----------



## electro2u

Quote:


> Originally Posted by *SAFX*
> 
> Jesus Christ, 3 pumps, lol. I'd pay anything to have seen that!
> 
> Screw it, you just convinced me to go custom loop.....XSPC, Raystorm, Photon light tube, here I come BABY!


Think about this carefully. I got into watercooling because of the 295x2 and it pretty much put the nails in the coffin of my marriage. It's like getting a dog. Huge commitment.


----------



## SAFX

Quote:


> Originally Posted by *cennis*
> 
> I still have all the parts, I can do a mock up later and take some pics.
> 
> Good choice, but I'm not too sure about XSPC GPU blocks are good for vrm cooling, seem like aquacomputers(i have) is good for VRMs.
> If you plan to overclock, pick wisely


But XSPC has active VRM cooling? You're saying Aquacomputer full-card blocks perform better on the VRM? Damnit, I really like XSPC's LEDs









according to this, the results seem pretty good:


----------



## SAFX

Quote:


> Originally Posted by *electro2u*
> 
> Think about this carefully. I got into wc because of the 295x2 and it pretty much put the nails in the coffin on my marriage. Its like getting a dog. Huge commitment.


lol, I'm not married, so I think I'm ok


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> But XSPC has active vrm cooling? You're saying aquacomputer full card blocks perform better on the vrm? damnit, I really like xspc's leds
> 
> 
> 
> 
> 
> 
> 
> 
> 
> according to this, the results seem pretty good:


Almost all full cover blocks are "active vrm cooling" with few exceptions.



Based on this, it seems like the AC block/backplate is the best combo for VRM.

Sadly there are no reviews like that for 295x2.

Source:
http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


----------



## cennis

Quote:


> Originally Posted by *cennis*
> 
> Almost all full cover blocks are "active vrm cooling" with few exceptions.
> 
> 
> 
> Based on this, it seems like the AC block/backplate is the best combo for VRM.
> 
> Sadly there are no reviews like that for 295x2.
> 
> Source:
> http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


It seems like the AC backplate is the important factor; you could maybe get the XSPC block + AC backplate? (not sure if it will fit)
The backplate is thick and very solid, maybe to ensure enough pressure on the VRMs so they make good contact with the block.

However, the active vs passive AC backplates don't seem to make much of a difference.

I have no way to tell if these results translate to the 295x2 version. I don't have a good way to measure VRM temps, and there is no software reporting for this card.









But I never throttled at 1200MHz and +100mV core, even though my case's airflow is very, very limited.


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> Jesus Christ, 3 pumps, lol. I'd pay anything to have seen that!
> 
> Screw it, you just convinced me to go custom loop.....XSPC, Raystorm, Photon light tube, here I come BABY!


I like your choices. XSPC Raystorm CPU block and a Photon 170. This thing is gorgeous. Make sure your case can fit it though, it's HUGE! I had dual XSPC Razor GPU blocks + backplates for my dual 290X, before swapping to a 295x2. I have not heard good things about the Razor GPU block for the 295, though. Let me find the link for you. Nevermind, I can't find the link, or it seems to not exist anymore, so take that with a grain of salt. That being said, I LOVED the 290X XSPC blocks.

I want to throw this GPU into my custom loop, but mine doesn't overclock at all. Like not even a little, so I'm not sure the cost would be a benefit to me.
Quote:


> Originally Posted by *electro2u*
> 
> Think about this carefully. I got into wc because of the 295x2 and it pretty much put the nails in the coffin on my marriage. Its like getting a dog. Huge commitment.


Marriage is not quite the same as a dog. A dog will love you whether you spend a grand on watercooling parts or not...









That being said, my wife encouraged my foray into custom loop, and even was the deciding factor in my finally doing it.

Edit: Correction.


----------



## NBrock

Quote:


> Originally Posted by *SAFX*
> 
> Any improvement over stock paste?


It's hard to say because I did that at the same time I swapped to dual Corsair SP High Performance fans in push/pull. Together, though, my load temps at stock clocks came down about 18°C.

There was way more stock TIM than necessary on the pumps/heatsinks though. So maybe a nice tidy layer helped as well as the PK-1 being a bit better.


----------



## SAFX

Quote:


> Originally Posted by *cennis*
> 
> Based on this, it seems like the AC block/backplate is the best combo for VRM.
> 
> Sadly there are no reviews like that for 295x2.
> 
> Source:
> http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


This is confusing: does "block only" = "passive backplate"? If so, why does it say "active backplate is not worth it" while the graph shows VRM temps for "block only" and AC as 71°C and 24°C, respectively?

What am I missing?


----------



## electro2u

Quote:


> Originally Posted by *kayan*
> 
> Marriage is not quite the same as a dog. A dog will love you whether you spend a grand on watercooling parts or not...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That being said, my wife encouraged my foray into custom loop, and even was the deciding factor in my finally doing it.











Yes, marriage isn't anything like getting a dog. Watercooling is like getting a dog.
Quote:


> Originally Posted by *SAFX*
> 
> This is confusing: does "block only" = "passive backplate"? If so, why does it say "active backplate is not worth it" while the graph shows VRM temp differences between "block only" and AC as 71 and 24C.


No block only means no backplate.

AC Passive backplate has almost same vrm temps as active.


----------



## ColeriaX

Well, I found some time yesterday to dig into tuning my PC. Here's some potato-phone pics and results with quadfire + X99 5930K. BTW, my VRM temps are better with the backplate... I can give much more voltage now than I ever could on the AIO


----------



## SAFX

Quote:


> Originally Posted by *electro2u*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yes, marriage isnt anything like getting a dog. Watercooling is like getting a dog.
> No block only means no backplate.
> 
> AC Passive backplate has almost same vrm temps as active.


OK, I see my confusion: I thought "AC" meant "Active" backplate. It looks like they're just mixing blocks and backplates in the article, right?
So it appears EK is compatible with the Aquacomputer backplate. I wonder if XSPC works with the AC backplate; I'll email XSPC to find out.


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> OK, I see my confusion, I thought "AC" meant "Active" backplate, it looks like they're just mixing blocks and backplates in the article, right?
> So it appears EK is compatible with Aquacomputer backplate, I wonder if XSPC works with AC backplate, I'll email XSPC to find out.


http://shop.aquacomputer.de/index.php?cPath=7_11_149_150

Look at products 1 and 4; the first is the passive backplate and the fourth is the active backplate.


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> Jesus Christ, 3 pumps, lol. I'd pay anything to have seen that!
> 
> Screw it, you just convinced me to go custom loop.....XSPC, Raystorm, Photon light tube, here I come BABY!


The modded 295x2 looked a little something like this.
I didn't end up making a better mounting solution for the VRM AIO; the shroud + a zip tie held it down for my testing.

The copper piece in the middle, which I made, is basically a replica of the original VRM heatsink minus the fins. That then comes into contact with the middle AIO. The actual VRMs sit much lower than the faceplate and other electronics, so you cannot directly apply the AIO onto the VRMs.


----------



## Dagamus NM

So I find it interesting that the XSPC block in that report did not have the protrusion that brings the block into contact with the GPU. The XSPC solution is thicker TIM to fill the gap in between, whereas EK has used a copper shim for this purpose in the past for the Tahiti GPUs. I wonder if this was corrected for the 295x2.

I wanted Aquacomputer for these but couldn't get them as they were not in stock. The EK blocks in nickel acetal are gorgeous, however. I know that the EK and AC are the more expensive solutions, so if the others can be obtained for considerably cheaper, then a few degrees' difference is worth it.


----------



## SAFX

Effective cooling, I'm sure, but you're right, not so elegant looking with 3 pumps; style is important


----------



## cennis

Quote:


> Originally Posted by *SAFX*
> 
> Effective cooling, I'm sure, but you're right, not so elegant looking with 3 pumps; style is important


Much more elegant : http://www.overclock.net/t/1548834/ft03-mini-mod-with-295x2-fully-water-cooled-cooled-by-2x120mm-2x180mm


----------



## SAFX

Quote:


> Originally Posted by *cennis*
> 
> Much more elegant : http://www.overclock.net/t/1548834/ft03-mini-mod-with-295x2-fully-water-cooled-cooled-by-2x120mm-2x180mm


Super craftsmanship! Nice work, dude, love the colors.









...giving me lots of ideas!


----------



## AeroXbird

I've got this odd issue with my 295x2.
Every time whilst playing GTA 5 (it also happens with other demanding games), my card starts downclocking to between 800-1000MHz, severely impacting performance and causing intense frametime spikes.
The temperatures are fine; they range between 60-65°C, so this should not be a problem.
My initial thoughts were VRM overheating, but as far as I have read, when the VRMs overheat the clock speed drops to 300MHz and does not start fluctuating like in my case.

Any ideas?

specs are in my signature


----------



## xer0h0ur

As I told the last guy, go into Afterburner and select "without powerplay support" under the unofficial overclocking mode. This will force your 3D clocks non-stop and simply won't allow any downclocks whatsoever.


----------



## AeroXbird

Quote:


> Originally Posted by *xer0h0ur*
> 
> As I told the last guy, go into Afterburner and select "without powerplay support" under the unofficial overclocking mode. This will force your 3D clocks non-stop and simply won't allow any downclocks whatsoever.


Thanks, this seems to have solved the problem for me.
Still getting FPS drops in GTA 5 below 50fps though, which puzzles me, but the stable clock speeds have at least stopped the random absurdly high frametimes


----------



## xer0h0ur

That hitching may be due to a specific setting in GTAV. I can't comment further as I don't own the game. I refuse to pay the price they want for a game that is old already.


----------



## AeroXbird

Quote:


> Originally Posted by *xer0h0ur*
> 
> That hitching may be due to a specific setting in GTAV. I can't comment further as I don't own the game. I refuse to pay the price they want for a game that is old already.


Could very well be; CPU usage is also extremely high for me in this game, so it could very well be that I've got myself a CPU bottleneck (first I've gotten in 5 years).
I don't blame you for not paying for it, it sure was not worth the $60 imo, but nonetheless I try to play it to make myself feel like I put that $60 to good use


----------



## xer0h0ur

Well it also doesn't help us one bit that we still haven't received the DX11 CPU overhead improvements that the Windows 10 driver implements. Either way even if AMD never ports over those improvements into the Win7/8.1 driver then you will get a significant boost in CPU bound games if you upgrade to Windows 10.


----------



## SAFX

Are these (more or less) the correct essential steps *after* installing a new driver given these settings tend to reset?

1) Set EnableUlps to "0" in registry
2) (MSI only) Unofficial overclocking mode: *without PowerPlay Support*
3) CCC -> Enable *Show AMD CF Status icon* (not required, but good litmus test)
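Step 1 can be scripted so it doesn't get forgotten after every driver install. A minimal batch sketch, with assumptions: the GUID below is the standard Windows display-adapter device class, but which `00xx` subkeys belong to your 295x2 is not something I can know from here, so verify in regedit before trusting it.

```shell
:: Hedged sketch only - re-disable ULPS after a driver install resets it.
:: Run from an elevated Command Prompt. The class GUID is the standard
:: display-adapter class; the 0000-0009 subkey range is an assumption.
@echo off
set "CLS=HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"
for /l %%i in (0,1,9) do (
    rem Only touch subkeys that already expose an EnableUlps value
    reg query "%CLS%\000%%i" /v EnableUlps >nul 2>&1 && reg add "%CLS%\000%%i" /v EnableUlps /t REG_DWORD /d 0 /f
)
```

Steps 2 and 3 still have to be redone in the Afterburner and CCC UIs; as far as I know there's no command line for those.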


----------



## fat4l

it would be even more cool if any of you know how to add more volts than +100mV in afterburner


----------



## xer0h0ur

As far as I know, the only way you would manage that is flashing the BIOS so it's already using a higher voltage by default, then raising the voltage even more through Afterburner.


----------



## ur4skin

Hi guys... I have a question. I recently came across two R9 295X2s and installed them on my Sabertooth Z97. The problem is GPU-Z says crossfire is disabled, and my CCC doesn't have the option to enable it... how do I get it enabled?

P.S. My Valley results are crazy low, like 33.8 FPS and a score of 1416 at 4K res.


----------



## xer0h0ur

Have you verified that each card works properly by using only one at a time? Did you use DDU to completely wipe out any previous driver installation? Did you disable ULPS through Afterburner?


----------



## SAFX

Quote:


> Originally Posted by *ur4skin*
> 
> Hi guys...I have a question, I recently came across two r9 295x2 and installed them on my sabertooth z97....problem is on gpuz it sais crossfire disabled, my ccc doesnt have the option to enable it....how do I get it enabled.
> 
> P.S my valley results are crazy low like 33.8 FPS and a score off 1416 at 4k res.


Uninstall drivers using DDU.
Before reinstalling drivers, manually delete these folders:

ProgramData/AMD
Program Files/AMD
Program Files (x86)/AMD

Then install drivers and retest.
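The folder cleanup between DDU and the reinstall can be sketched as a small batch file. This assumes the default AMD install locations on the system drive; `rd /s /q` deletes recursively without prompting, so double-check the paths first.

```shell
:: Hedged sketch - wipe leftover AMD driver folders after running DDU
:: and before reinstalling. Assumes default install paths; run elevated.
@echo off
rd /s /q "%ProgramData%\AMD"
rd /s /q "%ProgramFiles%\AMD"
rd /s /q "%ProgramFiles(x86)%\AMD"
```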


----------



## ur4skin

Both GPUs definitely work. I have uninstalled old drivers and tried new ones... beta and others. When you say DDU, what do you mean???


----------



## xer0h0ur

You said 2 295X2's which implies 4 GPUs. The 295X2 by default is already crossfired. If you are only using a single 295X2 then you will not have a crossfire tab to disable/enable crossfire. The 295X2 crossfires by default and you need to create application profiles to disable crossfire. Only if you have dual 295X2's for a total of 4 GPUs should you be seeing a tab in the CCC to enable/disable crossfire (I think). You may need to get one of the few people that have dual 295X2's to clarify this.

DDU is display driver uninstaller. It wipes out driver installations fairly thoroughly. The program needs to be run in safe mode or you can just run it in Windows and allow it to restart you into safe mode when prompted.


----------



## ur4skin

Yes, it's two R9 295X2s, so 4 GPUs... I just followed what SAFX said... which drivers must I install? I have 14.12 and the beta ones.


----------



## xer0h0ur

Just download the latest beta 15.6 driver. The 14.12 Omega is quite far behind on optimizations and crossfire profiles which are very important for you in running a quad GPU setup.


----------



## ur4skin

Ok, I just did all that and no difference. Is it not maybe a BIOS setting or something... or this bridgeless CF?


----------



## xer0h0ur

It's not the XDMA. Like I said before, did you make sure that each card is working on an individual basis? This means disconnecting the power cables from the card you're not testing. By doing this you are both verifying that each PCI-E slot is functioning and also verifying that each card works by itself.

Edit: Are you even sure you have the power supply connected properly to both video cards so you're not overloading any 12V rail?


----------



## ur4skin

ok will try that again in the morn.....thanks for the help.


----------



## xer0h0ur

I am just going to leave this here for you: http://forum.corsair.com/v3/showthread.php?t=131107

That power supply needs you to install the link software and change a setting so you don't wind up with problems running dual 295X2's. I wouldn't even be surprised if right now your cards aren't working properly due to them not being powered properly.


----------



## ur4skin

my psu settings are all on 40A already.

I just noticed in GPU-Z, when I select one of the second R9 295X2's GPUs, the bus interface says x1 @ 2.0, but the first Radeon's GPUs say x16 @ 3.0. Does that mean I have made an idiotic move and inserted the second GPU in the wrong slot? My mobo manual said x16/16/16


----------



## xer0h0ur

There is a button in GPU-Z to test your PCI-E bandwidth under load. Click it so you can see what gen and speed your PCI-E slots are running at under load.


----------



## wermad

Quote:


> Originally Posted by *ur4skin*
> 
> my psu settings are all on 40A already.
> 
> I just noticed in GPU-Z, when I select one of the second R9 295X2's GPUs, the bus interface says x1 @ 2.0, but the first Radeon's GPUs say x16 @ 3.0. Does that mean I have made an idiotic move and inserted the second GPU in the wrong slot? My mobo manual said x16/16/16


It should be the top two PCIe x16 slots. Once you populate both, it should be x8 3.0 on each core (via the PLX chip).

Have you updated your mobo bios?


----------



## ur4skin

The first card is running @ x16 3.0 but the second card is @ x1 2.0


----------



## ur4skin

Mine is the Mark S, but I'm sure it's the same... and my second card is in the 3rd slot... that's prob the error here... sorry for the noob questions, it's my first crossfire attempt.


----------



## xer0h0ur

Even though he linked the wrong motherboard he is correct. The SABERTOOTH Z97 MARK S uses the top two PCI-E slots for Crossfire/SLI. You can't use the third bottom-most slot.

Manual: http://dlcdnet.asus.com/pub/ASUS/mb/LGA1150/SABERTOOTH_Z97_MARK_S/e9704_sabertooth_z97_mark_s_ug_for_web_only.pdf?_ga=1.121002632.1078437766.1436046134


----------



## wermad

Quote:


> Originally Posted by *ur4skin*
> 
> Mine is the mark S but im sure its the same...And my second card is in the 3rd slot...that's prob the error here...sorry for the noob questions here...its my first crossfire attempt.


I know cooling will be a bit hampered for the top card, but those are the two slots to use the cards with. If you wanna get a bit better air cooling, look for an LGA1150 board with three slots in between, making sure they run at least @ x8 3.0.

edit:

You should see something like this in gpuz (although not @ 16x speed).


----------



## F4ze0ne

Happy 4th guys (in America)...


----------



## ur4skin

Yes I saw my mistake.....I understood the manual wrong. Thanks for all the help guys, will start looking at upgrading to x99 system.


----------



## rakesh27

Sorry for the change in subject; I need help.

At present I have a 295x2 and a 290x, and with the current AMD drivers, including beta, it's more headache than it's worth, with constant crashes.

So I have a 980 Ti coming. I'm thinking take out the single 290x and then put in the 980 Ti, then whichever GPU (AMD or Nvidia) I want to use, I power that card only. This all depends on my 295x2 behaving properly without the 290x.

Or should I sell both the 295x2 and 290x and get another 980 Ti for SLI gaming? Will it be better than a single 295x2 and 2-GPU gaming, meaning would SLI be better in gaming than crossfire?

What should I do?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rakesh27*
> 
> Sorry for change in subject, I need help
> 
> At present i have a 295x2 and 290x, with current amd drivers including beta its more headache then its worth constant crashes.
> 
> So i have a 980ti coming, im thinking take out the single 290x and then put in the 980ti, then which ever gpu (amd or nvidea) i want to use i power that card only, this is all depending my 295x2 behaves properly without the 290x.
> 
> Or should i sell both 295x2 and 290x and get another 980ti for sli gaming, will it be better then a single 295x2 and 2 x gpu gaming, meaning would sli be better in gaming then crossfire ?
> 
> Whats should i do ?


Both Crossfire and SLI have their issues but 980Ti SLI will be more powerful than the 295x2 on it's own.

As for running them in the same rig, I can't see an issue with the way you want to do it, but I can see driver conflicts more than anything else.


----------



## rakesh27

Thanks for the response. I've been a gamer for many years and I work in I.T. I wish AMD got their act together; it's a shame really.

I have 3 of the best GPUs you could buy and still they won't play ball.

I remember a time when I had a BFG 8800 GTX. I loved it, what a card; everything I played was awesome, fluid, smooth, fast, etc... Don't get me wrong, I've had many ATI/AMD cards and they've been just good.

What I really wanna know is: is SLI gaming more perfected than crossfire? I'm seriously considering going back to the green mean machine...

I think 2 x 980 Ti is definitely more powerful than a 295x2 and should serve me well into the future. What do you all think?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rakesh27*
> 
> Thanks for the response, I've been a gamer for many years and I work in I.T., I wish AMD got there act togeather, it's a shame really,.
> 
> I have 3 of the best gpus you could buy and still they won't play ball..
> 
> I remember a time when I had a BFG 8800 GTX, I loved it, what a card, everything I played was awesome, fluid, smooth, fast etc... Don't get me wrong I've had many ati/amd cards and they've been just good.
> 
> What I really wanna know is sli gaming more perfected then crossfire, I'm seriously considering going back to the green mean machine...
> 
> I think 2 x 980TI is definitely more powerful than a 295x2 and should serve me well into future, what do you all think...


Well, I can tell you that Crossfire scales better than SLI. As for the usability... well, hopefully someone who has used both will chime in.

I'm using Tri-Fire atm and I'm not having many issues, if I'm honest; the 295x2 by itself was great: no stutter, good scaling, good fps, etc.


----------



## Alex132

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well i can tell you that Crossfire scales better than SLI, as for the usability.......well hopefully someone who has used both will chime in.


Crossfire scales well when it works, SLI works better in most ways.

SLI works in more games, and is easier to enable/disable. SLI also has less micro-stutter and less general issues.
I have been plagued by Crossfire issues with microstutter lately. Hoping Win10 + new drivers fix it...

I would suggest going with the 980 Tis.
Personally I am waiting for Pascal as I have been plagued by software issues with this graphics card / AMD, *much more than others it seems*.

I have a fun little daily graph on my whiteboard =_= (daily crashes, BSODs, freezes, lockups etc. that cause my computer to crash from Drivers)




----------



## Mega Man

Yes, we all know you're an AMD hater.

CFX works fine in most games; in those that don't, you generally just need to make a CFX profile.

Ironically, I NEVER crash, I do mean never. I do crash when running DVDFab rendering a Blu-ray and Netflix together... the solution: don't.

I have not had a BSOD in well over several months.

So I don't crash or BSOD, ever.

You sure your OC is stable? It sounds to me like either your OC is not stable, you have corrupted something, or maybe you're running beta programs (or maybe alpha)?


----------



## blue1512

Quote:


> Originally Posted by *rakesh27*
> 
> Thanks for the response, I've been a gamer for many years and I work in I.T., I wish AMD got there act togeather, it's a shame really,.
> 
> I have 3 of the best gpus you could buy and still they won't play ball..
> 
> I remember a time when I had a BFG 8800 GTX, I loved it, what a card, everything I played was awesome, fluid, smooth, fast etc... Don't get me wrong I've had many ati/amd cards and they've been just good.
> 
> What I really wanna know is sli gaming more perfected then crossfire, I'm seriously considering going back to the green mean machine...
> 
> I think 2 x 980TI is definitely more powerful than a 295x2 and should serve me well into future, what do you all think...


The Fury X2 is the card for you. Nvidia's current offerings are not future-proof for DX12.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> yes we all know your an amd hater


I love how I give genuine complaints about a product and you senselessly defend it with comments like this.

Quote:


> Originally Posted by *Mega Man*
> 
> cfx works fine in most games in those that dont you generally just need to make a cfx profile


Not in the games that I play, and if I force it (when I can), it results in massive graphical errors, or it straight up doesn't work or will not be properly implemented (Ark). GameWorks is partially to blame; awful Nvidia strategy.









Quote:


> Originally Posted by *Mega Man*
> 
> ironically *i NEVER crash*, i do mean never, *i do crash* when running dvdfab rendering a bluray and netflix together... the solution dont
> 
> i have not have a *bsod in well over several months*
> 
> so *i dont crash or bsod ever,*


The contradiction is real. 1 BSOD is bad, no matter what. 1 crash as a result of running something is bad, no matter what. I'm not saying Nvidia is perfect, but I have had far fewer software-related issues with Nvidia. Nvidia tends to just under-build their reference GPUs. I am very impressed with many features of the 295X2, most notably the quality of the hardware, as well as the nice features of CCC which enable you to alter the settings for different games.

However when a program freezes, crashes, causes me to BSOD, etc. because of the drivers freaking out - I don't see why I should excuse it? That's just fanboyism.

I definitely had issues with my GTX 690, but they were much more hardware-related. This is almost the exact opposite of that.









Quote:


> Originally Posted by *Mega Man*
> 
> you sure your oc is stable ? it sounds to me like either your oc is not stable- you have corrupted something or maybe your running beta programs ( or maybe alpha ) ?


I run stock. I get software BSODs that are a result of the driver freezing.

I have stated that I am waiting for Win10 + new drivers to give this graphics card another chance.

Quote:


> Originally Posted by *blue1512*
> 
> FuryX2 is the card for you. Current NVidia's offerings are not the future proof for DX12.


I'm going to wait for Pascal. I'm kinda over dual-GPU issues.


----------



## xer0h0ur

For the past 2 months I have believed AMD's drivers had been BSODing my system badly and it turned out Afterburner/Rivatuner have been the source of my headaches. Ever since I started closing Afterburner/Rivatuner after startup I no longer get BSODs while gaming. You may want to take another hard look at your OS and everything else running in the background.


----------



## wermad

Afterburner tends to crash with IE or Firefox. I use Chrome now as it scales better in 4K anyway. I do get the occasional BSOD, but that's from the CPU and RAM just being finicky with their clocks.


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> SLI is easier to enable/disable. SLI also has less micro-stutter and less general issues.


What on earth are you talking about?

To enable or disable crossfire I right click on my desktop, launch CCC, click enable/disable and then apply. For SLI I right click and go into my nvidia controls, go into the SLI/maximize performance tab and choose enable/disable, then I get the notification that both aqua suite and Adobe creative cloud must be disabled before I can proceed, then I cntrl+alt+delete to get to the task manager, then have to end the process trees for aquasuite and creative cloud, then I can go back to nvidia control and enable/disable.

Granted, if you don't run either Aquasuite or Adobe CC the two would be equivalent, but for me enabling/disabling SLI is obnoxious.

In my experience, they both have the same amount of micro-stutter and general issues.

The only real differences I see are in which games are supported by one side or the other at release.

I also see OpenCL acceleration giving me better times in Adobe than CUDA acceleration.


----------



## xer0h0ur

The simple fact is that, no matter how you slice it, you always end up with more headaches with multi-GPU solutions when playing nearly anything new. So if you go into it expecting sunshine and rainbows, you're only fooling yourself. Unless DX12/Vulkan end up being some savior for the multi-GPU crowd, nothing is likely to change that either.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> For the past 2 months I have believed AMD's drivers had been BSODing my system badly and it turned out Afterburner/Rivatuner have been the source of my headaches. Ever since I started closing Afterburner/Rivatuner after startup I no longer get BSODs while gaming. You may want to take another hard look at your OS and everything else running in the background.


What are you saying, exactly? Avoid MSI Afterburner entirely, or just RivaTuner?
Correct me if I'm wrong, but one can use Afterburner without RivaTuner, assuming a different app is used for the FPS OSD, like Fraps?


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> What on earth are you talking about?
> 
> To enable or disable crossfire I right click on my desktop, launch CCC, click enable/disable and then apply. For SLI I right click and go into my nvidia controls, go into the SLI/maximize performance tab and choose enable/disable, then I get the notification that both aqua suite and Adobe creative cloud must be disabled before I can proceed, then I cntrl+alt+delete to get to the task manager, then have to end the process trees for aquasuite and creative cloud, then I can go back to nvidia control and enable/disable.
> 
> Granted if you don't run either aquasuite or Adobe CC then they would be equivalent but for me enabling/disabling SLI is obnoxious.
> 
> They both have the same amount of micro stutter and general issues from experience.
> 
> The only real differences I see are in games supported by one side or the other upon release.
> 
> I see open CL performance giving me better times in Adobe than Cuda acceleration.


AMD does not give you a Crossfire on/off switch with a single 295X2. If you add a 290/290X or a second 295X2, the Crossfire option comes up. From what people discuss here, if they must disable Crossfire for a game (run one GPU), they end up creating a profile to disable it there. I've been running both my cards since February of this year with no issues. Granted, I don't jump on the newest drivers as they come out (mostly because I won't pay the premium for new games right away).

For what's available now, the 980 Ti with 6GB is the better choice vs. the Fury X. Two of them are always a good choice. Nvidia is known for poor real-world 3- and 4-way SLI, so I'm really interested in what the later numbers look like, as this seems like the logical card to move on to in a year or two.


----------



## xer0h0ur

While most of my crashes have been associated with RivaTuner (RTSS), or in other words the OSD within Afterburner, removing the OSD monitoring in Afterburner only cuts down on the BSODs; it doesn't stop them altogether until I stop running Afterburner entirely. I don't assume every system behaves the same, so you may get a different result than I do by just removing all of the OSD monitoring so that RivaTuner never runs while Afterburner is open.


----------



## Alex132

My BSODs both are and aren't related to MSI AB. MSI AB does cause an increase in them, and 50% of the time launching it results in an instant lock-up. But I still get crashes, etc. when not running it.


----------



## xer0h0ur

Except that, since RivaTuner is the main culprit for me, it's also worth noting that other third-party applications use RivaTuner even if you're not using Afterburner. I am only giving a heads-up here, not trying to solve your mess, Alex.


----------



## rakesh27

Guys.

I'm not an AMD/ATI or Nvidia fan; all I want is good gameplay and graphics.

I've always bought good hardware, as I'm fortunate enough to be able to (basically working and saving), so I don't expect my rig to crash most of the time...

I've even wiped my drive and reloaded my OS, drivers, etc., and still have the same problem.

Here's what I'm going to do: when I get my 980 Ti, take out the 290X and test the 295X2 on its own (I suspect it will be OK), then swap in the 980 Ti, test, and compare.

I've been an ATI/AMD fan for a long time, but I'm leaning towards selling my 290X and 295X2 and getting another 980 Ti for SLI...

That would be better than my 295X2. I'll see how it goes; you never know, I may stay with the red team...

Great reading your responses, thanks for your help, let the fun continue...


----------



## xer0h0ur

You're likely better off just overclocking that single 980 Ti rather than dropping in another one and dealing with SLI issues. I may not like Nvidia, but those damn cards are legit with a waterblock and backplate on 'em.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Amd does not give you a crossfire on/off switch with a single 295x2. If you add a 290/290x or a second 295x2, the crossfire option comes up. From what ppl discuss here, if they must disable crossfire for a game (one card), they end up creating a profile to disable it there.


Sorry, I was referring to Crossfire/SLI in general. For the case of a single dual-GPU card I suppose there is a difference; you'd have to consider the GTX 690 for a straight-up comparison.

For games that don't use Crossfire (some of the Lego games my son likes watching; he's too young to play them yet) I just turn off full-screen. I suppose I could make a specific profile, but that would be more of a pain than viewing with the window border.

In general, for my other AMD CFX setups, turning it on and off is easier than SLI.


----------



## ur4skin

Hi guys, my system seems OK now.
Here's my Valley run.
Is it low or normal? Seems low to me.


----------



## xer0h0ur

Have you used any monitoring program during the benchmark to watch the clock speeds and usage of your vRAM, GPUs and CPU? Have you also disabled ULPS, either within Afterburner or manually in the Windows registry? If you're experiencing GPU clock fluctuations, open Afterburner's settings and, under the unofficial overclocking setting, select "without PowerPlay support" to force 3D clocks non-stop. This will stop the GPUs from down-clocking, but it will directly affect your temperatures, as your GPUs will run at full tilt even while idling in Windows.
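For anyone who'd rather do the manual registry route than use Afterburner's toggle, here's a minimal sketch. This is an assumption-laden illustration, not a tested tool: it presumes the usual `EnableUlps` DWORD under the display-adapter device class key, which is the same value Afterburner's "Disable ULPS" option flips. Windows-only, run from an elevated prompt, and back up the registry first.

```python
# Hypothetical sketch: walk the display-adapter class key and zero out any
# EnableUlps DWORD found (the usual manual ULPS tweak). Windows-only; run elevated.
import winreg

# Device class GUID for display adapters; adapter instances live in subkeys 0000, 0001, ...
CLASS_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as cls:
    index = 0
    while True:
        try:
            sub = winreg.EnumKey(cls, index)
        except OSError:
            break  # ran out of adapter subkeys
        index += 1
        try:
            with winreg.OpenKey(cls, sub, 0,
                                winreg.KEY_QUERY_VALUE | winreg.KEY_SET_VALUE) as key:
                winreg.QueryValueEx(key, "EnableUlps")  # raises if this isn't an AMD subkey
                winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)
                print("Disabled ULPS under subkey", sub)
        except OSError:
            continue  # value absent or no access; skip this subkey
```

Reboot afterwards for the change to take effect; installing a new driver can recreate the value, so re-check after driver updates.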


----------



## ur4skin

Just did that with MSI Afterburner... it didn't go more than 6 FPS? That's weird.


----------



## xer0h0ur

I don't know if you know this or not, but Crossfire will not work if you're not in fullscreen mode. Windowed or borderless windowed will automatically disable Crossfire; you can only run windowed while Crossfired in Mantle.


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> Sorry, I was referring to crossfire/SLI in general. For the case of the single dual-gpu card setup I suppose there is a difference there. Would have to consider the gtx690 for a straight up comparison.
> 
> For games that don't use crossfire (some of the Lego games my son likes watching, too young to play them yet) I just turn off full-screen. I suppose I could make a specific profile but that would be more of a pain in the ass than viewing with the windows border.
> 
> In general, for my other amd CFX setups, turning on and off is easier than SLI.


I'm not 100% sure, but I remember Malta did have the option to disable Crossfire via CCC. I ended up going with 4-way Tahiti, as the mining craze drove the price of Malta insanely high. So, like many, I was really surprised AMD added another handicap to Vesuvius.

AMD has always had quirky Crossfire results. I do credit them for typically scaling better than Nvidia beyond two cards, but at four cards Hawaii left a bit to be desired; most everyone said stick with tri-fire at most. Anyway, browsing the hardware section, the Fury X is doing really well in Crossfire at 4K. I really didn't expect this, but I'll wait for more reviews to gauge the Crossfire powah. I'm sure someone is already concocting a 4-way Fury X rig.


----------



## xer0h0ur

To some it may be quirky, but it's actually better for me that CCC's Crossfire tab only toggles tri-fire on/off. Some games simply don't want to play nice in tri-fire but work fine in Crossfire, so that makes it easy to disable the 290X and leave the 295X2 to do the work. Otherwise I would be left pulling the power cables on the 290X to get the same result.


----------



## Mega Man

I'll just dismantle this piece by piece.
Quote:


> Originally Posted by *Alex132*
> 
> My BSODs are and are not related to MSI AB. MSI AB does cause an increase in them, and 50% of the time results in an instant lock-up when I launch it. But I still get crashes, etc. when not running it.


Please prove this is AMD's fault and NOT a conflict with another program.
Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> yes we all know your an amd hater
> 
> 
> 
> I love how I give genuine complaints about a product and you senselessly defend it with comments like this.
Click to expand...

You have been and will always be an AMD hater. TBH I don't know why you even still have an AMD card. Would you like me to go back and start requoting you?
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> cfx works fine in most games in those that dont you generally just need to make a cfx profile
> 
> 
> 
> Not in the games that I play, and if I force it (if I can) it results in massive graphical errors - or straight up doesn't work or will not be properly implements (Ark). Gameworks is partially to blame, awful Nvidia strategy
Click to expand...

Even in those I usually don't have a problem.
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> ironically *i NEVER crash*, i do mean never, *i do crash* when running dvdfab rendering a bluray and netflix together... the solution dont
> 
> i have not have a *bsod in well over several months*
> 
> so *i dont crash or bsod ever,*
> 
> 
> 
> The contradiction is real. 1 BSOD is bad, no matter what. 1 crash as a result of running something is bad, no matter what. I'm not saying Nvidia is perfect, but I have had severely less software-related issues with Nvidia. Nvidia tends to just under-make their reference GPUs. I am very impressed with many features of the 295X2 - most notably the quality of the hardware as well as the nice features of CCC which enable you to alter the settings for different games.
> 
> However when a program freezes, crashes, causes me to BSOD, etc. because of the drivers freaking out - I don't see why I should excuse it? That's just fanboyism.
> 
> I definitely had issues with my GTX 690 - but it was much more hardware-related. This is almost the exact opposite of that
Click to expand...

There is no contradiction: I had ONE BSOD, which WAS NOT driver related. My driver crashed; when was the last driver update needed for DVDFab or Silverlight? Is it possible that either of those needs an update? Lastly, is it possible that those plus a few other programs can cause issues running at the same time?

One crash in several months, doing something non-standard (something people don't test for), vs. 10 a day.

Oh wait, I forgot your philosophy: "blame it on AMD".
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> you sure your oc is stable ? it sounds to me like either your oc is not stable- you have corrupted something or maybe your running beta programs ( or maybe alpha ) ?
> 
> 
> 
> I run stock. I get software BSODs that are a result of the driver freezing.
> 
> I have stated that I am waiting for Win10 + new drivers to give this graphics card another chance.
Click to expand...

I bet there is more than you are leading us to believe. If you are getting 10 BSODs a day, I call shenanigans; that is not AMD's issue, that is either faulty equipment or PEBKAC.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> please prove this is amds fault and NOT a conflict with another program ?


>Have GTX690
>No driver freezes or BSODs

>Get 295X2
>Tons of driver freezes and BSODs.

And I can pull up Event Viewer for the BSODs and driver crashes.


Spoiler: Warning: Spoiler!








Quote:


> Originally Posted by *Mega Man*
> 
> you have and will always be an amd hater tbh i dont know why you even still have a amd card would you like me to go back and start requoting you ?


So I have to be an AMD fanboy to own an AMD card? Don't be stupid.

Quote:


> Originally Posted by *Mega Man*
> 
> there is no contradiction i had ONE bsod which WAS NOT driver related. my driver crashed, when was the last driver update neded for dvdfab or silverlight, is it possible that either of those needs an update ?? lastly is is possible that those + a few other programs can cause issues running at the same time ?
> 
> one in several months doing something non standard ( something people dont test for )vs 10 a day
> 
> o wait i forgot your philosophy " blame it on amd"



You never listed the number of BSODs in that period of "several months".
You shouldn't be restricted from running programs simultaneously, only to have it chalked up to "oh, you just can't do that". No, you should be able to do that. If you could do it on an Intel or Nvidia GPU, why is it acceptable that you cannot on an AMD card?
My philosophy is not "blame it on AMD". If anything it would be "blame it on the problem". And you seem to *instantly* think I am *illogically* hating on *AMD as a whole* for this. The issues I am having are purely software, and I have never once said I disliked AMD as a whole; my previous posts actually praise the hardware quality of this card. As I stated, Nvidia's reference (690) was poor in comparison. If I dislike a problem, is that not logical? Or do you think it is logical to overlook issues based on a personal preference for a vendor/product? Because that is fanboyism.

Quote:


> Originally Posted by *Mega Man*
> 
> i bet there is more then you wanna are leading us to believe if you are getting 10 bsod a day i call shenanigans that is not amds issue, that is either faulty equip or ebkac


I never said 10 a day; don't make up figures. See my graphs; it's pretty easy to average them out from there. I have had this card for 65 days. In those 65 days I have had 325 Event Viewer-logged AMD display driver crashes and 48 kernel BSODs. 325 / 65 = 5 display driver crashes per day; 48 / 65 = 0.74 BSODs per day; or 5.74 combined BSODs + display driver crashes per day. Of course this fluctuates day-to-day, but that's the average.

And before you ask, the rest of my system is extremely stable: stress-tested with a week of solid folding + P95 + LinX + general gaming, after which I underclocked the CPU by 200MHz and added a bit of voltage to the RAM. I do not run MSI AB anymore (even though I consider this a work-around, not a fix), and I did a clean install of my system when installing the 295X2. Oh, and it's always at stock GPU clocks.
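The per-day averages above are easy to sanity-check; a quick sketch (the counts are the ones reported in this post, not independently measured):

```python
# Sanity-check the reported crash rates: 325 Event Viewer-logged display driver
# crashes and 48 kernel BSODs over the 65 days of ownership.
days = 65
driver_crashes = 325
bsods = 48

print(driver_crashes / days)                       # 5.0 driver crashes/day
print(round(bsods / days, 2))                      # 0.74 BSODs/day
print(round((driver_crashes + bsods) / days, 2))   # 5.74 combined events/day
```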


----------



## Mega Man

OK, my bad: you had 7 in the picture that you update daily. That would mean 7 per day?


----------



## xer0h0ur

Have you ever considered the possibility you have a defective card?


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> ok my bad you had 7 in your picture that you update daily - that would mean 7 per day?


7 on that day in particular; today, luckily, has been 0.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you ever considered the possibility you have a defective card?


Yes, well... kind of. Not all my issues would make sense for a purely defective card. The BSODs I could understand, but everything is driver-related.
For example, launching MSI AB and BSODing: that's not hardware. Nor is micro-stutter in games even in single-GPU mode.

All the signs point towards bad software: every single freeze-up is display-driver related, and every BSOD is the exact same. Strangely enough, they seem to have increased since about a month ago.

Sadly I cannot RMA this card (complex story), so yeah. Better that it's software than hardware, actually.









If only I still had that 7990 to test it out.


----------



## wermad

Probably time for a clean OS install? Even though my old 2700K could hit 5.0, it would randomly BSOD; 4.8 was the most stable of the higher clocks, so that was my daily setup. Maybe he should take his clocks down a notch or two (or to stock) and see what happens? With most of my GPU-related issues, the screen just goes blank (driver or GPU failure).


----------



## XAslanX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you ever considered the possibility you have a defective card?


This is what I'm thinking it is. I've been using ATI/AMD cards for 12 years now, and the only BSODs I've experienced with them happened when the card itself started to have issues, not the drivers.


----------



## xer0h0ur

You can put lipstick on a pig, but it's still going to be a pig. You're the only one having this many crashes, so all you're doing is dancing around what seems obvious to me: hardware problems. AMD's drivers may not be the best, but no one else is crashing as hard as you are, and the only time I ever experienced this much crashing was because of a defective 290X.


----------



## wermad

I believe Alex mentioned this card was RMA'd. Wondering if he got a refurb unit or the same one back?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> probably time for a clean os install? Even though my old 2700k cpu was able to hit 5.0, it would randomly bsod. 4.8 was the most stable of the highest clocks and so this was my daily setup. Maybe he should take is clocks down a notch or two (or stock) and see what happens then? Most of my gpu related issues, the screen goes blank (driver or gpu failure).


It's running at 4.8GHz right now, actually.









And yeah, waiting for Win10 to do a fresh install.

Quote:


> Originally Posted by *XAslanX*
> 
> This is what I'm thinking it is. I've been using ATi/AMD cards for 12 years now and the only BSODs I've experienced with them is when the card starts to have issues itself, not the drivers.


Quote:


> Originally Posted by *xer0h0ur*
> 
> You can put lipstick on a pig but its still going to be a pig. You're the only one having this many crashes so all you're doing is dancing around what seems to be obvious to me. Hardware problems. AMD's drivers may not be the best but no one is crashing as hard as you are and the only time I ever have experienced this much crashing as you're showing was because of a defective 290X.


I really don't want this thing to be dying... We'll have to see how it behaves when Windows 10 rolls out, though.


----------



## xer0h0ur

Do you have another rig you can test it in? If the problems remain in another rig, then you're basically assured it's hardware-related.


----------



## wermad

edit: ^^^'d









Do you have anyone else in your area you can take your card to and test it with their system? Really, that could narrow down whether the card is indeed acting up.

FYI, ppcs.com has the Koolance blocks back in stock (5 in stock at this time), ~$110:

http://www.performance-pcs.com/koolance-vid-ar295x2-water-block-amd-radeon-r9-295x2.html


----------



## SAFX

If there's one thing I hate in Windows, it's *Event Viewer*... the textbook definition of how _not_ to build a user interface.


----------



## DividebyZERO

Quote:


> Originally Posted by *Alex132*
> 
> >Have GTX690
> >No driver freezes or BSODs
> 
> >Get 295X2
> >Tons of driver freezes and BSODs.
> 
> And I can pull up Event Viewer for the BSODs and driver crashes.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> So I have to be an AMD fanboy to own an AMD card? Don't be stupid.
> 
> You never listed the amount of BSODs in the period of "several months".
> You shouldn't be restricted to not running programs simultaneously and just try to chalk it up to "oh you just can't do that". No, you should be able to do that. If you could do that on an Intel or Nvidia GPU - why is it acceptable that you cannot do that on an AMD card?
> My philosophy is not "blame it on AMD". If anything it would be: "Blame it on the problem". And you seem to *instantly* think I am *illogically* hating on *AMD as a whole* for this. The issues I am having are purely software, I have never once said I disliked AMD as a whole - and my previous posts actually praise the hardware quality level of this card. As I stated Nvidia's reference (690) was poor in comparison. I do not see why you think I am illogically hating on AMD. If I dislike a problem is that not logical? Or do you think it is logical to overlook issues based on a personal preference for a vendor/product? Because that is fanboyism.
> I never said 10 a day, don't make up figures. And see my graphs. It's pretty easy to average them out from there. I have had this card for 65 days. In those 65 days I have had 325 Event Viewer-logged AMD Display Driver crashes. And 48 Kernel-based BSODs. 325 / 65 = 5 Display Driver crashes per day. 48 / 65 = 0.74 BSODs per day. Or 5.74 BSODs + Display Driver crashes per day. Of course this fluctuates day-to-day, but that's the average.
> 
> And before you ask, the rest of my system is extremely stable. Stress tested with a week of solid folding + P95 + LinX + general gaming and then I underclocked it by 200mhz on the CPU and added a bit of voltage on the RAM. I do not run MSI AB anymore (even though I think this is not a fix, but a work-around) and I did a clean install of my system when installing my 295X2. Oh and it's always stock GPU clocks.


Good old Sandy Bridge; I used to get a lot of the same stuff on my 2600K when overclocked. I went through two of them (2600K) and three boards, and in the end I gave up on the platform. The other issue is that comparing the GTX 690 to the 295X2 isn't going to do any good: you can't compare them directly, because your CPU will work harder with one than with the other. Swapping hardware on an unstable system doesn't prove the old GPU was at fault; the new one may simply push your system harder and expose overclock instability. Maybe you could run everything stock, including RAM/GPU/CPU, and put it through the mill. I hate to say it, but Prime95/LinX don't do much for stability testing aside from possibly making you think it's stable. Every game and program works your system differently, and you could be a whole week Prime95-stable and crash in 3 minutes in, say, BF4.

If you run the system completely stock and still have issues, then it could be your 295X2 or another system component.


----------



## wermad

Alex, how much voltage are you pumping into the CPU (and RAM)?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> If there's one thing I hate in windows, it's *Event Viewer*.....the text book definition of how _not_ to build a user interface


You haven't experienced the glory of applying GPOs in Windows Server, I see.









Quote:


> Originally Posted by *wermad*
> 
> edit: ^^^
> 
> 
> 
> 
> 
> 
> 
> 'd
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you have anyone else that you can take your card to and test it w/ their system in your area? Really, that could narrow down if indeed the card is acting up.


I do, but I guess we'd have to swap GPUs for a few days or so to see whether the problems are hardware. He has an i7 2600 + P8P67 Pro + R9 280X.

Quote:


> Originally Posted by *DividebyZERO*
> 
> Changing the hardware with a system instability doesn't mean the old gpu was your fault, it could just be that it pushes your system harder exposing overclock instability.
> 
> Maybe you could have everything stock including ram/gpu/cpu and run it through the mill. Hate to say it but, prime 95/linx don't do jack really for stability testing aside from possiblly making you think its stable. Every game and program works your system differently and you could be a whole week prime 95 stable and crash in 3 minutes in say BF4.


My CPU and RAM are stable as stated before.

How I tested my CPU overclock was kinda simple: I got the OC stable at 4.9GHz with offset voltage (only manual seemed good for 5GHz+) using a week of solid folding, 2 days of P95, 2 days of LinX, and a month of on/off gaming in War Thunder, Diablo 3 and whatever else I played at the time, plus rendering/audio production and general use.

Once I found the 4.9GHz stable point, I lowered it to 4.8GHz and kept the voltage the same. I haven't had a single 0x0124-related BSOD since then.


Spoiler: CPU under load







For RAM, it's running slightly relaxed (Command Rate lowered to 2T instead of 1T) with 1.525V instead of 1.5V.

One thing I have noticed is that it tends to be very related to Crossfire plus other programs, most noticeably Flash, MSI AB and Folding@home. But I still get on/off BSODs or freezes for no apparent reason... and then some days, like today, I get glimmers of hope with no problems at all.

EDIT: One thing I haven't considered is my motherboard; doesn't Hawaii not play nicely with PCI-E 2.0?


----------



## PR-Imagery

It's definitely software. I have the same problems minus the BSODs (mine simply locks up constantly and I have to reset the system) with my 7970s, and there's no way all three are defective.


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> You haven't experienced the glory of applying GPOs in Windows Server I see
> 
> 
> 
> 
> 
> 
> 
> 
> I do - but I guess we'd have to swap GPUs for a few days or so to see if the problems are hardware or not. He has a i7 2600 + P8P67 Pro + R9 280X.
> My CPU and RAM are stable as stated before.
> 
> How I tested my CPU overclock was kinda simple, I got my OC stable at 4.9Ghz with offset voltage (only manual seemed to be good for 5Ghz+) using a week of solid folding, 2 days of P95, 2 days of LinX, and a month of gaming on/off in War Thunder, Diablo 3 and any other games I played at that time as well as rendering / audio production and general use.
> 
> Once I found the 4.9Ghz stable point I lowered it to 4.8Ghz and kept the voltage the same. I hadn't had a single BSOD related to 0x0124 since then.
> 
> 
> Spoiler: CPU under load
> 
> 
> 
> 
> 
> 
> 
> For RAM, it's running slightly underclocked (lowered the Command Rate to 2T instead of 1T) clocks with 1.525v instead of 1.5v.
> 
> One thing I have noticed is it tends to be very related to Crossfire and other programs. Most noticeably Flash, MSI AB and Folding@home. But I still get on/off BSODs or freezes for no reason it seems.... and then some days I get glimmers of hope with no problems at all, like today.
> 
> EDIT- One thing I haven't considered in my motherboard, doesn't Hawaii not play nicely with PCI-E 2.0?


I think we had a chat about this a while ago; I think you've got a bit too much voltage on the CPU. Try 4.5 and set the voltage to auto (with SB, ~1.20-1.30V). Give it a day or two and see what happens. Almost 1.5V to hit 4.8 seems a tad high; I was at ~1.35V @ 4.8 and did 1.425V @ 5.0, albeit on a 2700K. 4.5 is pretty healthy, even for SB; see what happens. WQHD doesn't put much burden on the CPU while gaming, more on the GPU. I'm at 4.6, though clock for clock it may not make much difference vs. a 4.5 SB, but I have some games we can run to compare (I'll disable Crossfire).


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> You can put lipstick on a pig but its still going to be a pig. You're the only one having this many crashes so all you're doing is dancing around what seems to be obvious to me. Hardware problems. AMD's drivers may not be the best but no one is crashing as hard as you are and the only time I ever have experienced this much crashing as you're showing was because of a defective 290X.


Or when I OC and bench XD


----------



## gatygun

Holy crap at those BSODs and driver crashes; that's absolutely not normal. I would underclock your 295X2 and see how stable it stays with a bit more voltage.

Something is going very wrong there.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Think we had a chat about this a while ago, think you got a bit too much voltage in the cpu. Try 4.5 and setting the voltage in auto (w/ SB ~1.20-1.30 volts). Give it a day or two and see what happens. Almost 1.5v to hit 4.8, I was ~1.35v @ 4.8 and did 1.425v @ 5.0, albeit it was a 2700k. Seems a tad high. 4.5 is pretty healthy, even for SB, see what happens. Wqhd doesn't put a lot of burden on the cpu and more on the gpu while gaming. I'm @ 4.6, though clock for clock, it may or may not make much difference vs a 4.5 SB, but i have some games I can run so we can compare (i'll disable crossfire).


Hmm, I guess I could. But this CPU is already slow for me even at 4.8, especially in single-threaded games like Diablo 3 or applications like Ableton Live. I haven't had issues with it at this voltage before, but it might be degradation, as this CPU is rather old now.

And oddly enough, since talking about this (yesterday) I have had a stable system with no crashes...

Has anyone here folded on the 295X2 before? Because (I haven't much) folding on GPU0 while trying to do other things will 100% result in a BSOD/driver crash. For example, if I try to fold on GPU0 and watch a video: BSOD. If I try to watch a Flash video and it doesn't load: BSOD, etc.


----------



## NBrock

Quote:


> Originally Posted by *Alex132*
> 
> Hmm, I guess I could. But this CPU is very slow for me even at 4.8. Especially in single-threaded games like Diablo 3 or applications like Ableton Live. I haven't had issues with it at this voltage before, but it might be degradation as this CPU is rather old now.
> 
> And oddly enough since talking about this (yesterday) I have had a stable system with no crashes...
> 
> Have any of you folded on the 295X2 before? Because (I haven't really much) folding on GPU0 and trying to do other things will 100% result in a BSOD/Driver crash. For example, if I try fold on GPU0 and watch a video - BSOD. If I try watch a Flash video and it doesn't load - BSOD, etc.


That's odd; I don't have any of those issues on my 295X2 while folding on one or both of the GPUs.


----------



## Alex132

Quote:


> Originally Posted by *NBrock*
> 
> That's odd I don't have any of those issues on my 295x2 while Folding on one or both of the gpus.


Folding on GPU0 (the main GPU used for other hardware-accelerated stuff) + watching a movie = instant driver lockup. Using anything GPU-accelerated also has a very high chance of a GPU lockup.

At least it's better than my 690, where I couldn't fold on it at all without the system becoming unusable.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Far as I know, the only way you would manage that is flashing the BIOS so it's already using a higher voltage by default, then raising the voltage even more through Afterburner.


Does anyone know how to do it? Because the voltage on my GPU1 and GPU2 varies by 0.05V:
~1.18V
~1.13V

Would love to make them both the same... 1.3V, for example.


----------



## PCModderMike

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Think we had a chat about this a while ago, think you got a bit too much voltage in the cpu. Try 4.5 and setting the voltage in auto (w/ SB ~1.20-1.30 volts). Give it a day or two and see what happens. Almost 1.5v to hit 4.8, I was ~1.35v @ 4.8 and did 1.425v @ 5.0, albeit it was a 2700k. Seems a tad high. 4.5 is pretty healthy, even for SB, see what happens. Wqhd doesn't put a lot of burden on the cpu and more on the gpu while gaming. I'm @ 4.6, though clock for clock, it may or may not make much difference vs a 4.5 SB, but i have some games I can run so we can compare (i'll disable crossfire).
> 
> 
> 
> Hmm, I guess I could. But this CPU is very slow for me even at 4.8. Especially in single-threaded games like Diablo 3 or applications like Ableton Live. I haven't had issues with it at this voltage before, but it might be degradation as this CPU is rather old now.
> 
> And oddly enough since talking about this (yesterday) I have had a stable system with no crashes...
> 
> Have any of you folded on the 295X2 before? Because (I haven't much) folding on GPU0 while trying to do other things will 100% result in a BSOD/driver crash. For example, if I try to fold on GPU0 and watch a video - BSOD. If I try to watch a Flash video and it doesn't load - BSOD, etc.

Wow, over 100 unread posts, and you guys are still going on about your crashes?
I told you I suspect your overclocks are the root of the issue. Either they're too high, or you have hardware damaged by sustained high voltage being pumped through your components over time. Sure, they may have been stable at one time, but how long have you run that 2500K that high? It's generally accepted that overclocking shortens the life of your components, and the time may be up for your stuff.


----------



## DividebyZERO

Quote:


> Originally Posted by *PCModderMike*
> 
> Wow, over 100 unread posts, and you guys are still going on about your crashes?
> I told you I suspect your overclocks are the root of the issue. Either they're too high, or you have hardware damaged by sustained high voltage being pumped through your components over time. Sure, they may have been stable at one time, but how long have you run that 2500K that high? It's generally accepted that overclocking shortens the life of your components, and the time may be up for your stuff.


Not to mention that overclocking isn't supported by any measure. Also, there is no 100% definitive way for us end users to check stability. It's all subjective per user as well: it games fine for five hours at a time, or it runs Prime/LinX/OCCT etc. for 24 hours. Add in the silicon lottery and thermals and it's a guess at best. It's easier to blame a new piece of hardware than to think you might be wrong about stability.

I don't even suggest overclocking to my friends anymore, outside of benching. They already have a hard enough time dealing with all the other crap that's now standard (console ports, early access, new flavors of operating systems, etc.).


----------



## Alex132

Experiencing some more random crashes and BSODs today, with the CPU at stock (3.3GHz, with Turbo off too) and RAM at 1333MHz / 1.525V.

The weirdest error for me is the loss of display signal for ~5 seconds before it comes back. Really does hint at a dying GPU.

My guess is that the 295X2 is faulty, specifically the first GPU. The second GPU seems to be fine with folding for up to 4 days in a row; the first GPU just doesn't like doing anything.


----------



## Sgt Bilko

Catalyst 15.7 is out and among the new additions is Crossfire support for Freesync









Linky: http://support.amd.com/en-us/download


----------



## SAFX

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Catalyst 15.7 is out and among the new additions is Crossfire support for Freesync
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Linky: http://support.amd.com/en-us/download


Does Freesync truly eliminate tearing?


----------



## Mega Man

does your monitor support freesync ?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Catalyst 15.7 is out and among the new additions is Crossfire support for Freesync
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Linky: http://support.amd.com/en-us/download
> 
> 
> 
> Does Freesync truly eliminate tearing?

I don't have a Freesync monitor to test with, but from what I've heard it does, yeah


----------



## SAFX

Quote:


> Originally Posted by *Mega Man*
> 
> does your monitor support freesync ?


I don't believe so, but that's not stopping me from upgrading


----------



## Medusa666

Freesync is the best upgrade I have done since I got an SSD. It made an insane difference to me while gaming, so much so that I sacrificed raw fps to keep the feature enabled on the old drivers, back when it was single-GPU only.


----------



## SAFX

How does it impose limitations on FPS?

On GTA 5 with V-Sync enabled in game (and Framepacing in CCC), I'm locked at 60FPS, but tearing is eliminated, that's my current solution minus a Freesync monitor.


----------



## tagaxxl

Hello my friends... I need your help again. I've got an LG 24"/TN/5ms/1080p monitor and I want something better, so please suggest a monitor with Freesync to pair with my R9 295X2. My budget is 500 to 600 euros.


----------



## PCModderMike

Quote:


> Originally Posted by *tagaxxl*
> 
> Hello my friends... I need your help again. I've got an LG 24"/TN/5ms/1080p monitor and I want something better, so please suggest a monitor with Freesync to pair with my R9 295X2. My budget is 500 to 600 euros.


Well, your list of choices is pretty slim at this time. There aren't very many monitors out there that support Freesync yet.
Quote:


> FreeSync monitors available as of May 2015
> | Monitor | Size / Resolution | Refresh rate | Panel type | Price |
> | --- | --- | --- | --- | --- |
> | Acer XG270HU | 27-inch 2560x1440 | 40 - 144 Hz | TN | $500 |
> | BenQ XL2730Z | 27-inch 2560x1440 | 40 - 144 Hz | TN | $850 |
> | LG 29UM67 | 29-inch 2560x1080 | 48 - 75 Hz | IPS | $450 |
> | LG 34UM67 | 34-inch 2560x1080 | 48 - 75 Hz | IPS | $650 |


http://www.pcgamer.com/the-g-sync-and-freesync-monitors-available-right-now/

I personally would go with either the Acer or the BenQ, because 2560x1440 is a gamer's "sweet spot" and the 144Hz refresh rate is very nice.


----------



## tagaxxl

ASUS MG279Q 144Hz FreeSync 1440p

http://www.computeruniverse.net/en/products/90594323/asus-mg279q.asp

ips panel 4ms

Samsung Monitor U28E590D UHD

http://www.computeruniverse.net/en/products/90606309/samsung-monitor-u28e590d.asp

tn panel 1 ms

BenQ XL2730Z 1440p

http://www.computeruniverse.net/en/products/90583366/benq-xl2730z.asp

tn panel 1ms

which one is better?


----------



## SAFX

What monitor you got?


----------



## ErikV55

Hello all. For those using the 295X2 in the Corsair 780T: will the rad reach one of the two intake fan slots on the front of the case? I ask because I have a Noctua NH-D14, which makes mounting the card's rad as an exhaust impossible.


----------



## tagaxxl

At the moment I have an LG 24"/TN/5ms/1080p monitor


----------



## gatygun

Too bad those 1440p 144Hz screens cost a fortune.


----------



## wermad

Quote:


> Originally Posted by *ErikV55*
> 
> Hello all. For those using the 295X2 in the Corsair 780T: will the rad reach one of the two intake fan slots on the front of the case? I ask because I have a Noctua NH-D14, which makes mounting the card's rad as an exhaust impossible.


The 780T might be a stretch, but if you place the fan in push between the case and the rad, it might work. I would find some 780T owners and ask them the distance. I was looking at tiny-tom's AX1500 review and he has one and two cards with the rads on the front of a 760T (slightly shorter than the 780T). There's a bit of slack, but it's getting tight.

(760t)


----------



## SAFX

NO!









Got played, big time. I thought my monitor had Freesync (it has DP 1.2, but Freesync is never mentioned in the manual).
The article is definitely wrong


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> NO!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got played, big time, thought my monitor had Freesync (DP 1.2, Freesync never mentioned in manual)
> Article is definitely wrong


It was up to the manufacturers whether they wanted to include Freesync on the monitor or not... kinda sucks that they never included it on half the monitors that were demoed, but oh well


----------



## SAFX

Losing confidence in MSI Afterburner.

Last night, for no reason at all, GPU-Z, GTA 5, and Valley all suffered from horrendous FPS, averaging 15-20. GPU 2 never went above 300MHz.
I checked ULPS in the registry, MSI settings, and CCC - all looked good. Uninstalled 15.4, installed the new 15.7 drivers, same deal.
Uninstalled Afterburner, restarted, and now I'm back to normal.

I'm considering Trixx; I just need to find an alternative for the FPS overlay.


----------



## Agent Smith1984

Quote:


> Originally Posted by *SAFX*
> 
> Losing confidence in MSI Afterburner.
> 
> Last night, for no reason at all, GPU-Z, GTA 5, and Valley all suffered from horrendous FPS, averaging 15-20. GPU 2 never went above 300MHz.
> I checked ULPS in the registry, MSI settings, and CCC - all looked good. Uninstalled 15.4, installed the new 15.7 drivers, same deal.
> Uninstalled Afterburner, restarted, and now I'm back to normal.
> 
> I'm considering Trixx; I just need to find an alternative for the FPS overlay.


I've gone through my share of woes with AB before too....

One card working, the other not.... uninstall/reinstall, problem gone....

I still can't explain what it was that ever caused it.

I really like trixx' simplicity, but the monitoring in AB is so nice to have.....


----------



## SAFX

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It was up to the manufacturers whether they wanted to include Freesync on the monitor or not... kinda sucks that they never included it on half the monitors that were demoed, but oh well


Totally sucks, big time, will need a new monitor in the future,


----------



## SAFX

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I really like trixx' simplicity, but the monitoring in AB is so nice to have.....


I use AIDA64 for monitoring - not as nice, but very comprehensive, definitely on par with MSI. No FPS counter though


----------



## Alex132

15.7 has been great for me actually









MSI AB is not good for AMD cards


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> 15.7 has been great for me actually
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI AB is not good for AMD cards


what you using for OC, Trixx?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> 15.7 has been great for me actually
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI AB is not good for AMD cards
> 
> 
> 
> what you using for OC, Trixx?

I don't OC for anything but benchmarks - which I almost never run - and MSI AB had been giving me tons of issues anyway.


----------



## tagaxxl

Guys, if I change the fan on the radiator, do I lose the warranty? If not, what are the best fans to lower the temp?


----------



## elgreco14

Quote:


> Originally Posted by *tagaxxl*
> 
> ASUS MG279Q 144Hz FreeSync 1440p
> 
> http://www.computeruniverse.net/en/products/90594323/asus-mg279q.asp
> 
> ips panel 4ms
> 
> Samsung Monitor U28E590D UHD
> 
> http://www.computeruniverse.net/en/products/90606309/samsung-monitor-u28e590d.asp
> 
> tn panel 1 ms
> 
> BenQ XL2730Z 1440p
> 
> http://www.computeruniverse.net/en/products/90583366/benq-xl2730z.asp
> 
> tn panel 1ms
> 
> which one is better?


The Asus MG279Q is better. IPS has very nice colors, and its Freesync range of 35-90Hz is not a problem because you see almost no tearing at 144Hz. The MG279Q and R9 295X2 are a sweet combo


----------



## SAFX

Quote:


> Originally Posted by *tagaxxl*
> 
> Guys, if I change the fan on the radiator, do I lose the warranty? If not, what are the best fans to lower the temp?


Glad you asked - I tested many fans last month; here are my results.

Gentle Typhoon AP-15 rules, period.
Did not test Noctua fans, though I suspect similar if not identical performance to the AP-15, for a price.
The Cougar Vortex results were a bit surprising; I was expecting performance on par with the SP120 PE, but it didn't come close.

Test conditions: AIDA64 Stability Test, GPUs @100% for 10 minutes, fans @100% power.
I purchased my GT AP-15 from Amazon:


----------



## xer0h0ur

I am actually interested in swapping out the Nidec radiator fan I re-used from my old Asetek CPU AIO on my 120mm Monsta radiator with one of those gentle typhoons.


----------



## Alex132

Yeah, I don't trust those Cougar reviews. I wish I hadn't given my SP120 QEs away so I could prove otherwise - but I currently have 2x Cougar Vortex fans and they kick ass on this radiator.
SP120 QEs push no air, and they are mechanically really annoying and loud for the air they move. Corsair fans have been proven to be over-hyped/over-rated though, so I won't talk much about that.

Also, testing at 100% introduces variances, etc.
I'd still just recommend the same fans as always, in the RPM flavor of your choice:

EK Vardar
GT Typhoon
Noise Blocker / Cougar / bequiet!


----------



## wermad

Cougar CF140s are the schnizzle. Just make sure you've got lots of rad so you can run them ultra quiet (~5V). The 120s aren't that great and were horrendously noisy.

Great fan for low-noise operation, even @12V. Kept all four of my Lightning Tahitis nice and cool (~40°C).


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Yeah I don't trust those Cougar reviews.
> I currently have 2x Cougar Vortex fans and they kick ass on this radiator.
> [/LIST]


Are you doing push/pull with your cougars? Vortex HDB CFV12HB? What temps are you getting on stock/OC?

You don't trust _my_ Cougar reviews, or Cougar reviews in general? As I said, I was surprised by the results, and to be sure, I double checked my raw data, clocks were definitely stock during my test, not OC.

I really like the Gentle Typhoons. I use an aggressive response curve in SpeedFan and run at nearly 100% with no distractions


----------



## BradleyW

Quote:


> Originally Posted by *tagaxxl*
> 
> Guys, if I change the fan on the radiator, do I lose the warranty? If not, what are the best fans to lower the temp?


Will they ever know?


----------



## joeh4384

Quote:


> Originally Posted by *SAFX*
> 
> Glad you asked, I tested many fans last month, here's my results.
> 
> Gentle Typhoon AP-15 rules, period.
> Did not test Noctua fans, though I suspect similar if not identical performance to AP-15, for a price.
> The Cougar Vortex results were a bit surprising; I was expecting performance on par with the SP120 PE, but it didn't come close.
> 
> Test conditions: AIDA64 Stability Test: GPUs @100% for 10m, fan @100% power.
> I purchased my GT AP-15 from Amazon:
> 
> 
> Results: Gentle Typhoon rules, always


How would you rank them in noise? How loud are the GTs?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Cougar Cf140 are the schnizzle. Just make sure you got lots of rad to run them ultra quiet (~5v). 120s arent thay great and were horrendously noisy.
> 
> Great fan for low noise operation even @12v. Kept all four of my lightning tahiti's nice and cool (~40c).


Hmmm, I have both. The Cougar Vortex 1500rpm are great in the ~800-1200rpm range; 1200-1500rpm is really noisy but moves a crap-ton of air, so that's why a fan curve is important. I only got these over the EK/GTs because of price and availability









The Cougar 120s I have now run at 800rpm almost all the time and only ramp up to ~1050rpm when both GPUs are under load. The VRM fan is much louder.

I actually prefer the 800D's stock 140mm case fan to the Cougar 140mm - the stock 800D fan is quieter, which is my only reason for using it, haha. The 140mm Cougar does push a lot more air, and its air:noise ratio is very good... but when I just need something to gently blow over my HDDs, I'll take the quietest fan I've got









Quote:


> Originally Posted by *SAFX*
> 
> Are you doing push/pull with your cougars? Vortex HDB CFV12HB? What temps are you getting on stock/OC?
> 
> You don't trust _my_ Cougar reviews, or Cougar reviews in general? As I said, I was surprised by the results, and to be sure, I double checked my raw data, clocks were definitely stock during my test, not OC.
> 
> I really like the Gentle Typhoons. I use an aggressive response curve in SpeedFan and run at nearly 100% with no distractions


The only other type of 120mm Cougar I have seen (and have 3 of on my RX360) is the Cougar Turbine. Basically a Vortex fan blade in a plain frame, 800-1200rpm, and mostly OEM.

I have 2 of these Cougar fans (in black) on my 295X2's radiator. And like I stated before, I have a fan curve set up in SpeedFan as follows:


My gaming temperatures vary from around ~48-55°C for single-GPU games and 56-62°C for dual-GPU games. The highlighted line is where the temps normally sit, and the red line is 60°C (1100rpm). I've never seen it go above 63°C while gaming.

OC'd temps are a little higher: around 53-58°C for single-GPU gaming and 58-66°C for dual-GPU. But I only played Ark: Survival OC'd for about an hour.

Right now I am folding on my 2nd GPU and using my main GPU for watching movies - and my temperatures are 47/54°C
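For anyone wondering how a curve like that actually maps temperature to fan speed: SpeedFan-style curves are basically linear interpolation between (temperature, RPM) points. A minimal Python sketch (the breakpoints below are illustrative, not my exact settings):

```python
# SpeedFan-style fan curve: linear interpolation between
# (temperature in °C, fan RPM) breakpoints, clamped at both ends.
# Breakpoints here are illustrative only.
BREAKPOINTS = [(40, 800), (60, 1100), (70, 1500)]

def fan_rpm(temp_c):
    """Return the target RPM for a given GPU temperature."""
    if temp_c <= BREAKPOINTS[0][0]:          # below the curve: minimum RPM
        return BREAKPOINTS[0][1]
    if temp_c >= BREAKPOINTS[-1][0]:         # above the curve: maximum RPM
        return BREAKPOINTS[-1][1]
    # Walk adjacent breakpoint pairs and interpolate within the right segment.
    for (t0, r0), (t1, r1) in zip(BREAKPOINTS, BREAKPOINTS[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return round(r0 + frac * (r1 - r0))

print(fan_rpm(50))   # midway between 800 and 1100 -> 950
print(fan_rpm(60))   # exactly on a breakpoint -> 1100
```

Steeper segments near the top of the curve are what make the fans stay quiet at idle but ramp hard once the GPUs load up.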


----------



## SAFX

Quote:


> Originally Posted by *joeh4384*
> 
> How would you rank them in noise? How loud are the GTs?


I tried testing noise levels, but without proper equipment I relied on decibel-meter apps on my phone - and man, are they sensitive! Even after unplugging several appliances it was impossible to eliminate all background noise, so I gave up. Here are my unscientific results:









(at max power, not down-volted)
SP120 PE - jet engine
SP120 QE - not audible
Cougar Vortex - barely audible
Gentle Typhoon - barely/not audible


----------



## Alex132

I prefer to rate fans' noise at similar rpm/airflow.

Saying fan X is louder than fan Y is so... vague.
Fan X could be running at 1500rpm while fan Y is running at 1200rpm.
Or fan X could produce less SP/CFM at 1200rpm than fan Y - even if their noise levels at that RPM were the same.










edit - if you want a true turbine sound, check out the CM Excaliburs. 2000rpm of ear-shattering pain.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> I prefer to rate fans at similar rpm/airfow for their noise.
> 
> Saying X fan is louder than Y fan is so... vague.


Hence *"unscientific"* results









I never go by fan specs - too many variables affect airflow, temps, pressure, etc. That's why I tested several fans to see what worked best for me.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alex132*
> 
> I prefer to rate fans at similar rpm/airfow for their noise.
> 
> Saying X fan is louder than Y fan is so... vague.
> X fan could be running at 1500rpm while Y fan is running at 1200rpm.
> Or X fan could produce less SP/CFM at 1200rpm than Y fan - even if at that RPM their noise levels were the same.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit- if you want true turbine-sounding - check out the CM Excalibers. 2000rpm of ear-shattering pain.


NF-F12 iPPC 3000rpm fans... nothing I've owned comes close to them on noise levels so far


----------



## SAFX

Quote:


> Originally Posted by *Sgt Bilko*
> 
> NF-F12 iPPC 3000rpm fans......nothing i've owned comes close to them on noise levels so far


Does that fan work well for air flow or static pressure?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> NF-F12 iPPC 3000rpm fans......nothing i've owned comes close to them on noise levels so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does that fan work well for air flow or static pressure?

Airflow mainly; it does decently for static pressure, but not as well as other fans that are quieter and cheaper.

They do well on thin rads (like my EX240s), but on thicker ones you'd want better









I'm thinking I might go with the EK Vardars next time round


----------



## Petet1990

What's up guys? I just picked up an R9 295X2 and I'm having issues with GPU usage. GPU 1 goes to 100% as soon as I run a game, and GPU 2 stays at zero. I don't think I'm bottlenecked; I had my CPU (5820K) at 4.5GHz. The weird thing is, at first both GPUs were at 100% but it would stutter. Now only one GPU is being used.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Petet1990*
> 
> What's up guys? I just picked up an R9 295X2 and I'm having issues with GPU usage. GPU 1 goes to 100% as soon as I run a game, and GPU 2 stays at zero. I don't think I'm bottlenecked; I had my CPU (5820K) at 4.5GHz. The weird thing is, at first both GPUs were at 100% but it would stutter. Now only one GPU is being used.


Silly question time....

Are you in Fullscreen?

Do you have both 8 Pins firmly plugged in?

what game and what settings/res?


----------



## Petet1990

I am in fullscreen, yes, both 8-pins are plugged in, and I'm running 4320x2560. I tried it on BFH


----------



## Sgt Bilko

Quote:


> Originally Posted by *Petet1990*
> 
> I am in fullscreen, yes, both 8-pins are plugged in, and I'm running 4320x2560. I tried it on BFH


Very interesting...

No reason why both cores shouldn't be loading up... have you tried upping the power limit in Afterburner or CCC?

Other than that, I'd say reinstall the drivers, especially if this is happening in more than one game


----------



## Petet1990

Yeah, the power limit is up. I guess I'm just going to reinstall the drivers and see what happens. Thanks


----------



## xer0h0ur

You may need to do a thorough manual uninstall to get rid of it. I don't know what method you normally use, but I'm no longer just DDUing and installing the new driver - the manual method has been giving me fewer headaches with driver performance.


----------



## Petet1990

Quote:


> Originally Posted by *xer0h0ur*
> 
> You may need to do a thorough manual uninstall to get rid of it. Don't know what method you normally use but I no longer am just DDUing and installing new driver. The manual method has been giving me less headaches with driver performance.


Which way is the manual way? Thanks


----------



## xer0h0ur

I have been doing it like this now:

1. Uninstall using the AMD Catalyst Install Manager and restart.
2. Follow these instructions: http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers
3. Restart into safe mode and run DDU.
4. Boot back into Windows and uninstall Afterburner without keeping its settings.
5. Restart and install Cat 15.7.
6. Restart and install Afterburner (which of course requires a restart to read the hardware, and another to disable ULPS).

I still use DDU as I am not completely certain what DDU wipes out that I may not be getting to with those manual instructions.
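For reference, the ULPS switch that Afterburner toggles is just a registry value. A minimal `.reg` sketch - note the `0000` subkey number is an example; the actual numbered subkey under the display-adapter class key varies per system, and there is one per GPU:

```reg
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) for the AMD GPU under subkey 0000.
; Repeat for each numbered subkey that contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Letting Afterburner do this for you is the safer route; edit the registry directly only if you know which subkeys belong to the 295X2.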


----------



## Petet1990

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have been doing it like this now:
> 
> I uninstalled using the AMD Catalyst Install Manager, restarted then followed these instructions: http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers, restarted into safe mode to run DDU, booted back into windows to uninstall Afterburner without keeping its settings, restarted and installed the Cat 15.7, restarted and installed Afterburner. Afterburner then of course required restarts to read my hardware and then again to disable ULPS.
> 
> I only still used DDU as I am not completely certain what DDU wipes out that I may not be getting to with those manual instructions.


Im going to try this in the morning thanks for the help


----------



## xer0h0ur

No problem. I had all sorts of odd issues with my driver installs over the past couple of months or so. Particularly BSODing simply trying to load games or having Afterburner go flat out nuts on me by screwing up my clock speed sliders, making my clocks go all over the place on three gpus etc. etc.

Just in case the driver reinstall doesn't do it for you, I have an ace in the hole, assuming you are using Afterburner: open it and go to the settings. Under the setting called "Unofficial overclocking mode," select "without powerplay support." Doing this forces your 3D clocks non-stop, even while just idling in Windows. I warn you though: this setting may force you to reinstall Afterburner to undo it, and it will of course affect your temperatures, since your GPUs will be at constant load.


----------



## joeh4384

Quote:


> Originally Posted by *SAFX*
> 
> I tried testing noise levels, but without proper equipment, I relied on decibel meter apps on my phone, and man, are they sensitive! Even after unplugging several appliances it was impossible to eliminate all background noise, so I gave up. Here's my unscientific results
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (on max power, not down-volting)
> SP120 PE - Jet Engine
> SP120 QE - Not audible
> Cougar Vortex - barely audible
> Gentle Typhoon - Barely/not audible


Thanks. I'm considering replacing my SP120s, as I can't run the second one at full blast without it sounding like a jet engine. Is this the AP-15?

http://www.amazon.com/Nidec-Gentle-Typhoon-D1225C12B5AP-Silent/dp/B001Q6RUVO/ref=sr_1_1?ie=UTF8&qid=1436631189&sr=8-1&keywords=AP+15


----------



## wermad

Quote:


> Originally Posted by *joeh4384*
> 
> Thanks, considering replacing my SP120s as I can't run the 2nd one at full blast without sounding like a jet engine.Is this the AP 15?
> 
> http://www.amazon.com/Nidec-Gentle-Typhoon-D1225C12B5AP-Silent/dp/B001Q6RUVO/ref=sr_1_1?ie=UTF8&qid=1436631189&sr=8-1&keywords=AP+15


Imagine 65 SP120 High Performance fans! The Mix2 controller solves that, and they run very nicely @4.8V


----------



## Dagamus NM

I hope I like the 140mm Cougars - I have 30 of them, enough for five Aquacomputer 420 rads in push/pull.

So what would be the recommendation for two 24" monitors matched to the X2s? I see that Dell makes a 24" 4K monitor that seems reasonably priced. I have two Asus 28" 4Ks right now and they are just too long for where I want them.

I would even go 2K. The ROG Swift seems nice enough, but again it will be too big; 24" would be just right. Under $1K each seems to be my only other requirement.

Unrelated, but I got my four Asus 980 Tis in the day before yesterday, and I really do like the look of the reference cooler, but Jesus, for $650 each you'd think they could have included a backplate. Well, I should have some nickel backplates for them at some point.


----------



## tagaxxl

Did you connect your Cougar fans to a fan controller or to the graphics card's fan cable?

So which should I buy: the Cougar fans http://www.cougargaming.com/products/fans/vortex_pwm.html
or the Gentle Typhoons
http://www.amazon.co.uk/Scythe-Case-Gentle-Typhoon-D1225C12B5AP-15/dp/B001Q6RUVO/ref=sr_1_1?ie=UTF8&qid=1436633921&sr=8-1&keywords=Nidec+Servo+Gentle+Typhoon+D1225C12B5AP
?

I want to keep temps at 65°C max


----------



## ur4skin

I have both my rads in push/pull configuration with Corsair SP fans, and my temps don't go above 68°C


----------



## SAFX

Quote:


> Originally Posted by *joeh4384*
> 
> Thanks, considering replacing my SP120s as I can't run the 2nd one at full blast without sounding like a jet engine.Is this the AP 15?
> 
> http://www.amazon.com/Nidec-Gentle-Typhoon-D1225C12B5AP-Silent/dp/B001Q6RUVO/ref=sr_1_1?ie=UTF8&qid=1436631189&sr=8-1&keywords=AP+15


Yup, that's it. You can also get them at performance-pcs.com, sleeved - much fancier:









http://www.performance-pcs.com/misc-120x25mm-fans/darkside-gentle-typhoon-performance-radiator-fan-2150rpm-68cfm-black-edition.html


----------



## SAFX

Quote:


> Originally Posted by *ur4skin*
> 
> I have both my rads in push/pull configuration with corsair sp fans and my temps dont go more than 68c


I get the same temps, max 68°C, but using a single AP-15 pushing cold air from _outside_ the case. I never thought this would work, since it's counter-intuitive to normal airflow, but it works very well









This is my fan setup (SP120 is in the front, not rear, it's an older photo). I had to remove the entire drive bay in my case for this to work, otherwise, there's simply no room for those rads in the front. Originally, the rads were located where the Cougar 140's are now; Phantom 410 has lots of room up there, but this setup (pictured below) is way more efficient. Cold intake for rads is a must. Yes, it blows a crap-load of hot air _into_ the case, but dual 140's on top exhaust heat very quickly.

Edit - In case anyone's wondering about the Akasa 140 Viper, I got mine off ebay, shipped from England to New York.


----------



## SAFX

Quote:


> Originally Posted by *tagaxxl*
> 
> did you connect your Cougar fans on a fan controller or on the graphic card fan cable?
> 
> so what to buy Cougar fans http://www.cougargaming.com/products/fans/vortex_pwm.html
> or Gentle Typhoon
> http://www.amazon.co.uk/Scythe-Case-Gentle-Typhoon-D1225C12B5AP-15/dp/B001Q6RUVO/ref=sr_1_1?ie=UTF8&qid=1436633921&sr=8-1&keywords=Nidec+Servo+Gentle+Typhoon+D1225C12B5AP
> ?
> i want to keep temp 65c max


I use SpeedFan to control my X2's rad fan, which is connected to my motherboard.


----------



## rakesh27

Off topic... I had a 290X + 295X2 with the latest drivers, including the betas - what a nightmare. So I decided to take my 290X out and put in a 980 Ti.

Great, I'm thinking, because from what I've heard on the web, running AMD and Nvidia cards together is not too good...

It works. I installed drivers for each, and whenever I want to play with a particular brand I just plug the monitor cable into that card.

I've tested a few games and it works, with not too many problems. I've even upgraded my AMD driver to the official one that's just come out, and it went smoothly.

It's great getting the best of both AMD and Nvidia. It works, I tell you, it works


----------



## Thelrior

Hey guys,
Well, I haven't found an answer to my question yet, and if you don't know it, who would?
Is there a way to disable the 2nd GPU on my 295X2 for single-card benchmark scores? I got a monster GPU chip, which is OC'd to 1200/1625 without a custom BIOS, but my 4790K bottlenecks my dual-GPU scores, which leads me to single-GPU scores where the bottleneck is less. Hope you guys can help me.


----------



## Thelrior

Quote:


> Originally Posted by *Thelrior*
> 
> Hey guys,
> Well, I haven't found an answer to my question yet, and if you don't know it, who would?
> Is there a way to disable the 2nd GPU on my 295X2 for single-card benchmark scores? I got a monster GPU chip, which is OC'd to 1200/1625 without a custom BIOS, but my 4790K bottlenecks my dual-GPU scores, which leads me to single-GPU scores where the bottleneck is less. Hope you guys can help me.


PS: my Fire Strike score: http://www.3dmark.com/fs/5374012


----------



## crislevin

I think you can disable crossfire in CCC. If you just want it disabled for this one benchmark, you can add an application profile under the gaming settings in CCC and disable crossfire for it.


----------



## Thelrior

Quote:


> Originally Posted by *crislevin*
> 
> I think you can disable crossfire in the ccc, if you just want it disabled for this benchmark, you can add an application in gaming settings in ccc, and disable crossfire for it.


Oh really? I hadn't found that; I'll take another look at it when I'm at home. Thanks


----------



## Dagamus NM

The guy above is correct about using a custom profile to disable crossfire for Fire Strike or whatever app you are interested in.


----------



## xer0h0ur

Make an application profile for 3DMark - assuming you can find the path to its executable. I vaguely remember some sort of shenanigans like multiple executables for 3DMark.


----------



## joeh4384

Quote:


> Originally Posted by *SAFX*
> 
> Yup, that's it, you can also get them at performance-pcs.com, sleeved, much fancier
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.performance-pcs.com/misc-120x25mm-fans/darkside-gentle-typhoon-performance-radiator-fan-2150rpm-68cfm-black-edition.html


I ordered a pair. It might be worth it to make it quieter.


----------



## SAFX

Quote:


> Originally Posted by *joeh4384*
> 
> I ordered a pair. It might be worth it to make it quieter.


Hope it works, and take a look at the photo I posted yesterday illustrating my fan setup, hope it helps


----------



## Dagamus NM

Ok, so after wasting the weekend I think I have found the monitors I will use with my X2's. I just want to see if anybody has any comments on these, one way or another: the Acer H257HU.

They are 25" panels, which will be tight on my desk but better than the oversized Asus 28" UHD monitors I have. At 4 ms they have the lowest response time of any IPS I have seen; slower than the 1 ms of the Asus, but that is a TN screen.

Looks like it can go to 80Hz, though I will probably just leave it at 60.


----------



## Mega Man

Quote:


> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SAFX*
> 
> I tried testing noise levels, but without proper equipment, I relied on decibel meter apps on my phone, and man, are they sensitive! Even after unplugging several appliances it was impossible to eliminate all background noise, so I gave up. Here's my unscientific results
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (on max power, not down-volting)
> SP120 PE - Jet Engine
> SP120 QE - Not audible
> Corsair Vortex - Barely audible
> Gentle Typhoon - Barely/not audible
> 
> 
> 
> Thanks, considering replacing my SP120s as I can't run the 2nd one at full blast without sounding like a jet engine.Is this the AP 15?
> 
> http://www.amazon.com/Nidec-Gentle-Typhoon-D1225C12B5AP-Silent/dp/B001Q6RUVO/ref=sr_1_1?ie=UTF8&qid=1436631189&sr=8-1&keywords=AP+15
Click to expand...

"AP-15" is really a misnomer, but yes, that is an "AP-15", and yes they are real (the ones you get are just not labeled AP-15; "AP-15" has something to do with customer/order numbers.

The speed is actually a different digit in the model number, but I am too lazy to look it up; feel free to ask in the GT club. IIRC the model-number breakdown is in the OP, but I could be wrong.)

Just an FYI, Coolerguys has great CS! Would definitely order from them again.
PS: just order it directly from Coolerguys.


----------



## al3x360

Hey guys, I have a problem with my R9 295X2: the temps are very high, near 70°C without any overclock, and I get GPU throttling. When gaming, the frequency doesn't stay at 1015MHz all the time; if the temp reaches about 65~70°C, the GPU frequency drops. Do you have an idea about this problem?


----------



## TooManyAlpacas

I can't tell you how to fix the clock drops, but if your temps are getting too high, try to get a second fan on the rad to do push/pull; it lowers the temps by about 5-10 degrees.


----------



## al3x360

Hey, my setup is already push/pull, but I did it with the stock front fans from the case: on the front I have two 140mm fans blowing air into the case, and I swapped the stock fan on the 295X2's radiator for a Corsair SP120 Quiet Edition.

I increased the RPM of the front fans so more air is pulled in from outside. After 15~30 min of GTA V (not maxed out, because then I get under 40 FPS, which I don't understand, and it's unplayable) I was reaching, as you said, about 60°C max.

I switched to BIOS 1 to see if anything changed, but nothing did.


----------



## Alex132

What temperatures are your VRMs?


----------



## SAFX

Quote:


> Originally Posted by *al3x360*
> if the temp reach about 65~70° the GPU Freq drops


That doesn't make sense; what are you using to read temperatures?

Quote:


> Originally Posted by *al3x360*
> 
> then i switched the Watercool stock rad from the R9 295x2, with a Corsair SP 120 Quiet edition


SP120 Quiet Editions are worthless for radiators; switch to Performance Editions, Gentle Typhoons, or Noctuas.


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *al3x360*
> if the temp reach about 65~70° the GPU Freq drops
> 
> 
> 
> That doesn't make sense; what are you using to read temperatures?
> 
> Quote:
> 
> 
> 
> Originally Posted by *al3x360*
> 
> then i switched the Watercool stock rad from the R9 295x2, with a Corsair SP 120 Quiet edition
> 
> Click to expand...
> 
> SP120 Quiet Edition's are worthless (for radiators), switch to Performance Edition, Gentle Typhoons, or Noctua's.
Click to expand...

EK Vardar => GT > NB = Cougar = beQuiet = Noctua > Corsair fans + rest

In terms of noise/performance, and probably overall performance too.


----------



## k18682699

Hello, I currently have this PSU: http://www.solohardware.net/products/19888-thermaltake-toughpower-grand-tpg-0850dpcg-850-w/
I get an error and the PC shuts down when I play.

Which of these would serve me?

http://www.solohardware.net/products/10836-thermaltake-toughpower-grand-tpg-1200m-1200-w/
http://www.solohardware.net/products/12837-evga-supernova-120-g2-1000-xr-1000-w/
http://www.solohardware.net/products/17853-corsair-rm-series-rm1000-1000-w/
http://www.solohardware.net/products/16653-thermaltake-toughpower-1200w-gold-ps-tpd-1200mpcg-1200-w/

Just one Sapphire R9 295X2.

Please help


----------



## kayan

Quote:


> Originally Posted by *k18682699*
> 
> hello I count on Your source to http://www.solohardware.net/products/19888-thermaltake-toughpower-grand-tpg-0850dpcg-850-w/
> I get an error shuts down the PC When you play
> 
> Which Serves me ?
> 
> http://www.solohardware.net/products/10836-thermaltake-toughpower-grand-tpg-1200m-1200-w/
> http://www.solohardware.net/products/12837-evga-supernova-120-g2-1000-xr-1000-w/
> http://www.solohardware.net/products/17853-corsair-rm-series-rm1000-1000-w/
> http://www.solohardware.net/products/16653-thermaltake-toughpower-1200w-gold-ps-tpd-1200mpcg-1200-w/
> 
> only one r9 295x2 sapphire
> 
> please help


Personally I'd go with either the EVGA g2, (I just built a pc for a friend with the 750w version, and it was a really good quality psu) or a cooler master v1000, from personal experience.


----------



## k18682699

The Thermaltake 850W one is nowhere to be found.


----------



## Dagamus NM

I also agree with the eVGA PSU. 1000W P2 is what I would get if I were you. I have one in the PC I built for my mom with sli 670ftws. I love it.

I have a Lepa 1600W and it is not my favorite. I am over the multirail psus as they are not up to the task of running two X2's imho. I have a single rail eVGA 1600W P2 coming to do what I wanted this stupid Lepa to do in the first place.


----------



## D3AD PIX3L

Does anyone on here play CSGO? I have looked all over the web and can't find a solution to my problem. Since I have gotten this card, when playing CSGO I get random micro stutters from time to time. I have tried it with CrossFire both enabled and disabled and am still getting the random stutter. Anyone on here have a solution?


----------



## DividebyZERO

Quote:


> Originally Posted by *Dagamus NM*
> 
> I also agree with the eVGA PSU. 1000W P2 is what I would get if I were you. I have one in the PC I built for my mom with sli 670ftws. I love it.
> 
> I have a Lepa 1600W and it is not my favorite. I am over the multirail psus as they are not up to the task of running two X2's imho. I have a single rail eVGA 1600W P2 coming to do what I wanted this stupid Lepa to do in the first place.


Not sure why the Lepa isn't working for you, but are you aware of the shared rails on the connector side of the PSU? I am not sure where the diagram is, but when using so many rails some of them are shared, and that can cause OCP trips if you are loading both at the same time.

I have 4x 290X on a Lepa G1600 and it does the job; my problem is that I am running dual-socket 1366, so I can't go as high on GPU overclocks as I could before on my X79 quad-core. If you're using a 4670K, I don't see how you're running out of juice.

Edit: the 4670K comment was me viewing the wrong sig rig. I see yours isn't listed, so I can only assume you're running a power-heavy platform?


----------



## Sgt Bilko

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> Does anyone on here play CSGO? I have looked all over the web and can't find a solution to my problem. Since I have gotten this card - When playing CSGO I get these random micro stutters from time to time. I have tried it with and without crossfire disabled and still getting the random stutter. Anyone on here have a solution?


Borderless windowed mode plays pretty well


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> I also agree with the eVGA PSU. 1000W P2 is what I would get if I were you. I have one in the PC I built for my mom with sli 670ftws. I love it.
> 
> I have a Lepa 1600W and it is not my favorite. I am over the multirail psus as they are not up to the task of running two X2's imho. I have a single rail eVGA 1600W P2 coming to do what I wanted this stupid Lepa to do in the first place.


(looks at own rig)


----------



## DividebyZERO

Quote:


> Originally Posted by *wermad*
> 
> (looks at own rig)


I believe this is correct, but I'm not 100% sure; it is from Lepa's own site, though:

http://www.lepatek.eu/g1600/



For the record, my only complaint about this PSU in my use is that it's quite loud under load; way louder than the rest of my system, which is all watercooled.


----------



## electro2u

Good lord what a mess


----------



## wermad

Rails #3 & 4 go to card #1; rails #5 & 6 go to card #2. I just removed one 6+2 cable from each of the four PCIe harnesses to clean things up a bit. Rails 1 and 2 do 20 amps each, while 3-6 do 30 amps each (AMD specifies 28 amps for each 8-pin, or 50 amps combined for dual 8-pin).

Big thanks to Megaman for this info prior to getting my lepa connected


----------



## DividebyZERO

Well, it's just easier to show it:



EDIT: Also, wermad, if you look at the diagram above, it shows exactly what you mentioned. I know it shows 4 GPUs, assuming you're running 4 singles, but it looks correct for the 295X2 too, as it shows rails 3/4 going to the first two GPUs and 5/6 to the other two.


----------



## wermad

That's for 4-way xfire (i.e. 4x 290X).

The 295X2 has two 8-pins, so with the Lepa 1600 it's recommended to use one rail for each plug.

Notice in the pic (Lepa also includes the same diagram) the PCIe plugs are labeled "12V3" etc., where the number after the voltage identifies the rail it uses. This is crucial, as some VGA/PCIe plugs on the PSU share a rail. With the Lepa, just make sure you're only using one rail *per* 8-pin.

The important things to know are how many rails there are and how many amps each one can deliver. On a single-rail unit like the AX1500, any VGA/PCIe cable will work.

I created a list (now in the OP) of some of the units that meet the AMD requirements. But it does not tell you the rail distribution for multi-rail units and their plugs; the manufacturer should have this.

I'll do a diagram in a bit (on cell atm).

edit:



Note how the two left plugs share rail #3, and the bottom-middle and bottom-right plugs share rail #5.
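If anyone wants to sanity-check their own cable layout, the arithmetic above boils down to a couple of lines. This is just a rough sketch using the figures quoted in this thread (28 A per 8-pin per AMD, 50 A combined); the rail map below is illustrative, so check your own PSU's label or manual for the real rail ratings.

```python
# Rough sanity check of a multi-rail PSU layout for an R9 295X2.
# Rail amperages are illustrative (20/20/30/30/30/30 A, as discussed
# for the Lepa G1600); verify against your own unit's documentation.

RAIL_AMPS = {1: 20, 2: 20, 3: 30, 4: 30, 5: 30, 6: 30}

EIGHT_PIN_AMPS = 28  # AMD's stated requirement per 8-pin for the 295X2

def rail_ok(rail, other_load_amps=0):
    """True if one 8-pin (28 A) fits on this rail alongside whatever
    else is already drawing from it."""
    return RAIL_AMPS[rail] - other_load_amps >= EIGHT_PIN_AMPS

# One rail per 8-pin, nothing else sharing it: fine.
print(rail_ok(3))                        # True
# Two 8-pins forced onto one shared 30 A rail: 56 A demand, OCP trips.
print(rail_ok(5, other_load_amps=28))    # False
```

Same conclusion as the diagram: one rail per 8-pin is fine, doubling up on a shared 30 A rail is not.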


----------



## Mega Man

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *al3x360*
> if the temp reach about 65~70° the GPU Freq drops
> 
> 
> 
> That doesn't make sense; what are you using to read temperatures?
> 
> Quote:
> 
> 
> 
> Originally Posted by *al3x360*
> 
> then i switched the Watercool stock rad from the R9 295x2, with a Corsair SP 120 Quiet edition
> 
> Click to expand...
> 
> SP120 Quiet Edition's are worthless (for radiators), switch to Performance Edition, Gentle Typhoons, or Noctua's.
> 
> Click to expand...
> 
> EK Varder => NB = Cougar = beQuiet = Noctua > Corsair fans + rest
> though GT > all !~
> In terms of noise/performance and probably overall performance too.
Click to expand...

fixed for you -- you're welcome








Quote:


> Originally Posted by *Dagamus NM*
> 
> I also agree with the eVGA PSU. 1000W P2 is what I would get if I were you. I have one in the PC I built for my mom with sli 670ftws. I love it.
> 
> I have a Lepa 1600W and it is not my favorite. I am over the multirail psus as they are not up to the task of running two X2's imho. I have a single rail eVGA 1600W P2 coming to do what I wanted this stupid Lepa to do in the first place.


Quote:


> Originally Posted by *DividebyZERO*
> 
> well its just easier to
> 
> 
> 
> EDIT: Also, Wermad if you look at the diagram above it shows exactly what you mentioned? i know it shows 4 gpus assuming your running 4 singles, however it looks correct for 295x2 also as it shows 3/4 rail to first 2 gpus and 5/6 to other two.


Have to add: the Lepa G1600 was the best at its time of release. It is not top dog anymore, which is OK, but they did not have single-rail units (nor the tech at the time to make them); this is an OLD PSU now. There's the Leadex platform, single-rail, up to 2kW in size. Do you know the years of difference between the Lepa and the Leadex?


----------



## al3x360

Quote:


> Originally Posted by *Alex132*
> 
> What temperatures are your VRMs?


I use CCC (Catalyst Control Center) + Open Hardware Monitor + CPUID HWMonitor. When the GPU (BIOS 2) is above 60°C I get GPU frequency drops... I think the motherboard is the main problem here. I have a 1600W Leadex Platinum to fuel the Radeon 295X2 (I plan to buy another soon).


----------



## Dagamus NM

Haha, you guys are too much. Ditching Afterburner helped with some of the stability issues I was seeing, but I still cannot get through a single Heaven run without hitting 75C and throttling. I have my water blocks, so I guess I should install them. If the core is reaching 75C, I can only imagine the lava at the VRMs.

I tried lowering the VDDC to 0.950V in Trixx and it looked like EDC: rainbow starbursts and lines in Heaven. I just tried a few other settings and none were too happy. The cards seem happy to run at 1078MHz at normal voltage until they get too hot and throttle.

I have mine setup the way you have it listed wermad. I am moving the Lepa to my CL case where I will have two matching PSUs. So I wanted another one and decided to give the eVGA a go. I will tear it apart when my PSU comes in sometime this week and I will stick the blocks on.

I think I will run the two X2's off of the Aquacomputer modularity 420mm with built in pump and reservoir. I am not sure I want to add my CPU to the loop quite yet. I think I will get another single loop 420 and butt them side by side with another pump thrown in the mix somewhere.

Anybody here see 8 pack's octofire x99 Pi calculator from across the pond?


----------



## D3AD PIX3L

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Borderless windowed mode plays pretty well


Tried that, still getting the stutter. I may try a fresh install next. It's a total bummer to have such a powerful card and have these issues with a game that isn't very demanding.


----------



## boredmug

Quote:


> Originally Posted by *DividebyZERO*
> 
> well its just easier to
> 
> 
> 
> EDIT: Also, Wermad if you look at the diagram above it shows exactly what you mentioned? i know it shows 4 gpus assuming your running 4 singles, however it looks correct for 295x2 also as it shows 3/4 rail to first 2 gpus and 5/6 to other two.


LMAO! This picture!


----------



## Dagamus NM

My new PSU just arrived. Oh how I want to ditch my responsibilities and go tear it down. Nope, gotta force myself to perform some audits and go to a root cause analysis meeting for a patient that passed away when they shouldn't have. Once I get into the meeting the last thing I will be thinking about is my computer, but I have five hours between now and then.

Anyhow, first impression is that the cables are far better than the Lepa's and even the Platimax I have. I very much like the inclusion of four separate 6+2 PCIe VGA cables; perfect for the needs of two X2's. Yes, it still has several of the 6+2's with the connected six on them, and yes, I am glad that I won't need to use those.

I would not have bought this if I did not have a need for another PSU. I would have stuck it out with the Lepa.

For somebody making the choice of which one to buy, at this point I would say the eVGA. I still need to get it up and running and then decide if there is any difference.


----------



## SAFX

Quote:


> Originally Posted by *DividebyZERO*
> 
> well its just easier to
> 
> 
> 
> EDIT: Also, Wermad if you look at the diagram above it shows exactly what you mentioned? i know it shows 4 gpus assuming your running 4 singles, however it looks correct for 295x2 also as it shows 3/4 rail to first 2 gpus and 5/6 to other two.










funny


----------



## magicase

Would a 390 or 390X be the better choice to CF with a 295X2?


----------



## wermad

It's been a while, but AMD xfire now lets each GPU run at its own speed; it no longer down-clocks the faster GPUs to match the lower-clocked GPU(s). You can mix 290/290X/295X2/390/390X. Tbh, you can save some dough and get a 290/290X, since the 390X's vram will be dropped to match the 4GB of the 295X2 anyway.
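For what it's worth, the vram point boils down to a min(): in AFR CrossFire each GPU keeps its own full copy of the frame data, so the usable pool is the smallest per-GPU capacity, not the sum. A quick sketch (the capacities are just the examples from this discussion):

```python
# In CrossFire (AFR) each GPU holds a full copy of the frame data,
# so the usable pool of a mixed setup is the smallest per-GPU VRAM,
# not the total across cards.

def effective_vram_gb(per_gpu_vram_gb):
    """Usable VRAM per frame in an AFR multi-GPU setup."""
    return min(per_gpu_vram_gb)

# 295X2 (4 GB per GPU) plus an 8 GB 390/390X: still 4 GB usable.
print(effective_vram_gb([4, 4, 8]))  # 4
```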


----------



## magicase

Impossible to get a 290x now unfortunately.


----------



## wermad

Look for used. I've seen a bunch of Oz forums that may be a source for one.

It's a complete waste imho pairing a 390/390X with a 4GB 290X-class setup. If you have the 295X2, sell it and get two 390X's.


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicase*
> 
> Impossible to get a 290x now unfortunately.


XFX DD 290x Here $469.00

Asus DCU II 290x Here $495.00

Gigabyte WF3 R9 290x Here $439.00

That's a couple of options, Scorptec also has a Giga R9 290 in stock as well if that interests you?


----------



## magicase

Quote:


> Originally Posted by *wermad*
> 
> Look for used. I've seen a bunch of Oz forums that may be a source for one.
> 
> It's a complete waste imho pairing a 390/390x with 4gb 29X. If you have the 295x2, sell it and get two 390x.


That would be quite a costly change just to have a minor upgrade in performance
Quote:


> Originally Posted by *Sgt Bilko*
> 
> XFX DD 290x Here $469.00
> 
> Asus DCU II 290x Here $495.00
> 
> Gigabyte WF3 R9 290x Here $439.00
> 
> That's a couple of options, Scorptec also has a Giga R9 290 in stock as well if that interests you?


Prices after shipping would be around the same as 390 unfortunately


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicase*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Look for used. I've seen a bunch of Oz forums that may be a source for one.
> 
> It's a complete waste imho pairing a 390/390x with 4gb 29X. If you have the 295x2, sell it and get two 390x.
> 
> 
> 
> That would be quite a costly change just to have a minor upgrade in performance
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> XFX DD 290x Here $469.00
> 
> Asus DCU II 290x Here $495.00
> 
> Gigabyte WF3 R9 290x Here $439.00
> 
> That's a couple of options, Scorptec also has a Giga R9 290 in stock as well if that interests you?
> 
> Click to expand...
> 
> Prices after shipping would be around the same as 390 unfortunately
Click to expand...

The 390 has 2560 SPs, the 290X has 2816.

That's the difference.


----------



## wermad

Well, if he wants to use a 390x gimped to 4gb to match the 295x2, have at it. We've already suggested plenty enough but his mind seems to have made a decision already. Good luck


----------



## Ironjer

Hi guys, I updated my rig from 2x GTX 980 SC to R9 295X2s, but I have a problem: my board doesn't POST with two 295s. The debug LED shows d4 (PCI resource allocation error, out of resources). The board is an EVGA X99 Micro, so I will change it for another with CrossFireX support; what do you recommend?


----------



## wermad

Does it post with one 295x2? Also, which cpu are you running?

I would definitely get a full atx board to at least give that top card some breathing room.


----------



## Ironjer

Quote:


> Originally Posted by *wermad*
> 
> Does it post with one 295x2? Also, which cpu are you running?
> 
> I would definitely get a full atx board to at least give that top card some breathing room.


Yep, both POST separately. Yeah, I will; maybe an EATX.


----------



## wermad

It's possible your board is assigning 16x to the top card and 4x to the bottom card if you have a 5820K. Check your BIOS to see if you can manually set them to 8x each.
Quote:


> Specifications
> 
> Chipset - Intel X99
> SLI - 2/3way
> SATA - 6 Native SATA 6.0Gb/s Ports
> RAID 0, 1, 5, 10, JBOD
> USB - 6 Native USB 3.0 / 8 USB 2.0
> Memory Support - 4 DIMM Quad-Channel (up to 64GB) DDR4 3200MHz+
> Capacitors - Solid State
> Ethernet - 1x Intel Gigabit NIC
> Audio - 8 Channel High Definition Audio + Optical
> Fan Headers - 5 (2 PWM, 3 DC)
> PCB - 8 Layers
> PCI-E Slot Arrangement - 1x16, 2x16, 3x8
> NVMe Support - Yes (PCI-E)
> Form Factor
> 
> M-ATX Form Factor
> Length: 9.6in - 244mm
> Width: 9.6in - 244mm


http://www.evga.com/Products/Product.aspx?pn=131-HE-E995-KR
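If it helps to see the lane math behind the 16x/16x vs 16x/4x question, here's a rough sketch of why a 40-lane chip like the 5960X can feed two cards at full width while a 28-lane chip cannot. The even-split assumption is mine; real boards can also negotiate uneven widths (e.g. x16/x8) depending on slot wiring.

```python
# Lane-budget sketch: find the widest per-card PCIe link that fits
# within the CPU's lane count, assuming an even split across cards.
# Real slot arrangements (like the 1x16, 2x16, 3x8 quoted above)
# depend on the board's wiring, not just this arithmetic.

def max_even_split(cpu_lanes, num_cards, widths=(16, 8, 4)):
    """Widest per-card PCIe link such that every card fits the budget."""
    for width in widths:
        if width * num_cards <= cpu_lanes:
            return width
    return None

print(max_even_split(40, 2))  # 16 -> x16/x16 on a 40-lane 5960X
print(max_even_split(28, 2))  # 8  -> an even split tops out at x8/x8
```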


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> I would definitely get a full atx board to at least give that top card some breathing room.


Do this, it will be very important actually.


----------



## Ironjer

I used the first and second slots, PCIE_X16_1 and PCIE_X16_2.

Ummm, I will try when I get home.

I remember it was in the PCIe configuration option (AUTO): http://www.evga.com/support/manuals/files/BIOS/131-HE-E995_BIOS_GUIDE.pdf

My CPU has 40 lanes (5960X).


----------



## wermad

Quote:


> Originally Posted by *Ironjer*
> 
> I used the first and second PCIE_X16_1 and PCIE_X16_2
> 
> ummm i will try when get home.
> 
> I remember it was in pcie configuration option (AUTO) http://www.evga.com/support/manuals/files/BIOS/131-HE-E995_BIOS_GUIDE.pdf
> 
> my CPU has 40 lanes (5960x)


Fill out your specs so we can see your components. Click your profile and go to the rig editor at the bottom.

K, yeah your cpu should allow 16x/16x. Try the bottom slot?


----------



## Ironjer

Done.

But the bottom slot is 8x and will disable the USB headers, power, etc.


----------



## Ironjer

Maybe the UEFI BIOS on the VGA?

What do you think? When I tested both cards, it was with the normal BIOS.

I know these cards come with a normal BIOS; I just updated one card with the UEFI BIOS uploaded in this thread, but only that one card.

Maybe flashing the second card with the same UEFI BIOS will work?

PS: I am running UEFI mode on the motherboard too.


----------



## wermad

Each card should have dual bios. 8x 3.0 is OK to run with this card.


----------



## Ironjer

Quote:


> Originally Posted by *wermad*
> 
> Each card should have dual bios. 8x 3.0 is OK to run with this card.


yeah, i know. i flashed the dual bios (MASTER and SLAVE).


----------



## gatygun

Quote:


> Originally Posted by *D3AD PIX3L*
> 
> Tried that, still getting the stutter. I may try a fresh install next. It's a total bummer to have such a powerful card and have these issues with a game that isn't very demanding.


Can you not just disable one core? 140 FPS on a single 290X core will probably do fine.
Quote:


> Originally Posted by *Ironjer*
> 
> Hi guys, I updated my rig from 2x GTX 980 SC to R9 295X2s, but I have a problem: my board doesn't POST with two 295s. The debug LED shows d4 (PCI resource allocation error, out of resources). The board is an EVGA X99 Micro, so I will change it for another with CrossFireX support; what do you recommend?


Beast of a machine, grats on it. Looks awesome.


----------



## D3AD PIX3L

Quote:


> Originally Posted by *gatygun*
> 
> Can you not just disable one core, 140 fps on a single 290x core will do fine probably.


I tried disabling crossfire, but still getting the issue. Is there another way to disable the second core?


----------



## wermad

Quote:


> Originally Posted by *Ironjer*
> 
> yeah, i know. i flashed the dual bios (MASTER and SLAVE).


One card is flashed on both BIOSes and the other is on stock BIOS? Have you tried switching the cards around?

Might be the slot. A lot of mATX boards force you to run the bottom slot for a second card (a la Z77). Even with blocks, you do run into the issue of blocking off your headers. If all this doesn't work, hit up the EVGA forum and ask there; a few EVGA gurus and employees hang out there and can answer this.

Wow, octo-core... I would definitely drop an RVE in there. My old RIVE was a rock and did everything without a hiccup. EVGA boards... ehhh... they have improved, but I won't trust them anymore. I'll drop their GPUs in my rig with no questions, though


----------



## Ironjer

Interesting post here: http://forums.evga.com/Post-Code-D4-On-X99-Classified-m2309631.aspx. I will try forcing all the PCIe slots to Gen3.


----------



## joeh4384

Quote:


> Originally Posted by *Ironjer*
> 
> Hi guys, I updated my rig from 2x GTX 980 SC to R9 295X2s, but I have a problem: my board doesn't POST with two 295s. The debug LED shows d4 (PCI resource allocation error, out of resources). The board is an EVGA X99 Micro, so I will change it for another with CrossFireX support; what do you recommend?


My VRMs throttled on a normal ATX board running an HG10-cooled 290X (below). I am sure you will run into VRM throttling.


----------



## wermad

Quote:


> Originally Posted by *Ironjer*
> 
> interesting post here http://forums.evga.com/Post-Code-D4-On-X99-Classified-m2309631.aspx i will try force to Gen3 all the PCIE


Awesome, let us know what happens







.

Btw, I'm sure you have the latest BIOS, right?



http://www.evga.com/support/manuals/files/BIOS/131-HE-E995_BIOS_GUIDE.pdf

Quote:


> Originally Posted by *joeh4384*
> 
> My vrms throttled on a normal atx board running a hg10 cooled 290x below. I am sure you will run into vrm throttling.


Yeah, a few of us immediately suggested getting a full ATX board for better cooling if he's not gonna use the bottom PCIe slot. He's running a 5960X; might as well drop in an RVE or X99-E WS.


----------



## Ironjer

Yeah, I am on 1.17.


----------



## Ironjer

Bad news: it doesn't work. I tried everything. What do you think of the Gigabyte X99 G1 WIFI?


----------



## wermad

Lots of BIOS issues with the GB X99 series. I would go with Asus to be on the safe side; the Rampage V Extreme is probably the most desired of all the X99 boards. I still don't trust EVGA with mobos, but I've heard the X79 Dark was a good jump from their terrible early X79 boards. Just check out reviews if you decide to stick with EVGA (X99 Classy). Personally, I would pick up this beauty:


----------



## kayan

Quote:


> Originally Posted by *Ironjer*
> 
> Bad news doesn't work. I tried everything. What do you think Gigabyte X99 G1 WIFI?


Speaking from experience, I would stay very, very far away from Giga X99 boards at the moment. The BIOS is really, really bad. I got a Giga UD4 X99 board and had nothing but problems with it. Ended up getting the Asrock X99X Killer 3.0 board; it's been rock solid on the 1.9 BIOS.

I say this as a diehard Gigabyte motherboard fan, too. I've used them a ton over the last many years, but their X99 is not up to par yet.


----------



## wermad

I must say some features of the Z97 "Sniper 7" (aka BLK Wifi) are a bit of a letdown vs my old and lovely Z87 Sniper 5. But the BIOSes have been very sketchy for GB X99. I've hesitated to buy an "open box" from Newegg despite the crazy low prices they were listed at (numerous returns; the retailer refurbs them and sells them as "open box").

How's Asrock X99 in terms of controlling heat? It was such a pain with the X79 Pro and the Extreme11. In the Asrock X79 club, the loud little fan they added to most models was a pain in the arse. I will admit the RIVE was a bit loud, but not as annoying as the Extreme11. The Pro got hot really quickly; nothing dangerous, it was just not my cup of tea, especially with quad GTX 480s on water.


----------



## Ironjer

I have an ASUS Z10PE-D8 WS set aside, but it won't POST with my 5960X and the Corsair 3000MHz RAM. I don't have other RAM for testing.


----------



## Roaches

Quote:


> Originally Posted by *Ironjer*
> 
> I have an ASUS Z10PE-D8 WS saved but won't post with my 5960X and the Corsair 3000. I don't have other ram for testing.


I'm afraid 2P boards are for Xeon chips only.

https://www.asus.com/us/Commercial_Servers_Workstations/Z10PED8_WS/HelpDesk_CPU/


----------



## wermad

Iron, stick around the forums and sooner or later you'll pick up on what's hot, what's not, and what's in between. With such a high-end CPU, I vote for the Rampage V Extreme; call it a day.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> I must say some features of the Z97 "Sniper 7" (aka BLK Wifi) are a bit of a let down vs my old and lovely Sniper 5 Z87. But bios have been very sketchy for Gb X99. I've hesitated buying an "open box" from newegg despite the crazy low prices they were being listed at (numerous returns, retailer refurbs them, sells them for "open box").
> 
> How's Asrock X99 in terms of controlling heat? It was such a pain with the Pro X79 and the Extreme11. In the Asrock x79 club, the loud little fan they added to most models was a pain in the arse. I will admit RIVE was a bit loud but not as annoying as the Extreme11. The Pro get hot really quick though nothing dangerous it was just not my cup of tea especially w/ quad GTX 480s on water.


I'm happy with the Asrock X99, but it doesn't feel as premium as a Giga or Asus board. The temps aren't hot and I definitely can't hear the board. I had an issue with random reboots, but it got worked out after an email to support. Apparently, Asrock boards don't like drives populating the 3_0 and/or 3_1 ports. Once I moved my boot drive to 3_0 and my other drives down to at least 3_2, all my issues went away: my CPU started to overclock, and my GPU stopped misbehaving. It is a solid board, but I do miss the little extras that really set Giga and Asus apart; stability counts for a lot more than some perceived niceties, though.


----------



## Alex132

If I were to get an X99 board, I'd get the MSI X99 SLI PLUS. It's at a really good price point, and it looks pretty damn good IMO.

Or maybe a high-end EVGA X99, simply for the 90° 24-pin and the I/O shroud.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Iron, stick to the forums and sooner or later you'll pick up on what's hot, what's not, and what's in between. With such a highend cpu, I vote for Rampage V Extreme and call it a day


Wermad is right (as usual): you want the RVE or the Asus X99-E WS. I kind of wish I had gone with one of each, but I went with two RVE's for my 5960X's, and I won't complain, other than that part of my motivation behind the EK monoblocks was to get rid of those garish "Extreme" I/O covers they put on them.

Don't get me wrong, I like the idea of covering the I/O area, but I might just have to sand it down and paint it to make it look like it should. I will still toss the rest to make my blocks work; too bad, as the ROG light-up thing in the middle is cute.


----------



## rakesh27

I'll chime in on this. I went from AMD all my life to my first Intel rig; I was set on a Gigabyte board, as I'd had their ultimate AMD board, which was fantastic.

Instead I went with the Asus X99-E WS, as the specs were right for me. It reminded me of my old Gigabyte UD7 mobo, and all I can say is the Asus blows anything away.

Check my specs: this rig is running a 295X2 and a 980 Ti at the same time; I just swap monitor connections when I want to use a particular card for gaming. I've OC'd my Intel 5960X to 3.6GHz @ 1.3V, with 32GB of RAM @ 3000MHz, quad channel.

Don't get me wrong, I had a few hiccups along the way, but now it's solid. It's a beast of a board, like my previous AMD Gigabyte 990FX UD7.


----------



## SAFX

I recommend ASRock X99 Extreme6/ac, great quality, UEFI, smooth OC (RAM 3000, no problem),...and not that it means anything, but it matches my favorite vest







, color match LIKE A BOSS!


----------



## xer0h0ur

Does that vest come with its own track suit?


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Does that vest come with its own track suit?


it does, but it's in the wash, ...who's DA MAN?!?


----------



## xer0h0ur

You da man


----------



## wermad

Let's post some game benchies to compare CPU powah! Really interested in how these Intel/AMD octos and hexas compare to the quadies we lesser beings have.


----------



## xer0h0ur

Let's revisit this when DX 12 games using true multi-threading are in people's hands.


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> If I were to get an X99 board, I'd get the MSI X99 SLI PLUS. It's at a really good price range, and looks pretty damn good IMO
> 
> Or maybe a high-end EVGA X99, simply for the 90' 24pin and I/O shroud.


I do wish that Asus would put 90° connectors on the 24-pin and all of the other connections at the edges. It looks super clean and is functional when there is something like a GPU trying to sit on top of it.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Lets revisit this when DX 12 games using true multi-threading are in people's hands.


Only reason why I am gonna get Skylake-E, whenever the hell it comes out, is because of DX 12.

Quote:


> Originally Posted by *Dagamus NM*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> If I were to get an X99 board, I'd get the MSI X99 SLI PLUS. It's at a really good price range, and looks pretty damn good IMO
> 
> Or maybe a high-end EVGA X99, simply for the 90' 24pin and I/O shroud.
> 
> 
> 
> I do wish that Asus would put 90' on the 24 pin and all of the other connections at the edges. It looks super clean and is functional when there is something like a GPU trying to sit on top of it.
Click to expand...

Asus said that they didn't want to risk incompatibility with cases.

I mean, sure, on lower-end motherboards... but on the X99 stuff? Just come on. Do it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Lets post some game benchies to compare cpu powah!? Really interested in what these intel/amd octo's and hexa's compare to the quadies we lesser beings have (
> 
> 
> 
> 
> 
> 
> 
> ).


Quote:


> Originally Posted by *xer0h0ur*
> 
> Lets revisit this when DX 12 games using true multi-threading are in people's hands.


I'd be down for that, should have a 4k monitor coming in the next couple of months so that'll be nice too


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Lets revisit this when DX 12 games using true multi-threading are in people's hands.


Now is always a good time. No excuses, run what you brung!

I'm also interested in seeing how quads fare against tri-fire guys (295x2 + 290x). Quad Hawaii didn't get much praise like quad Tahiti did.
Quote:


> Originally Posted by *Dagamus NM*
> 
> I do wish that Asus would put 90' on the 24 pin and all of the other connections at the edges. It looks super clean and is functional when there is something like a GPU trying to sit on top of it.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Only reason why I am gonna get Skylake-E when the hell ever it comes is because of DX 12
> 
> 
> 
> 
> 
> 
> 
> 
> Asus said that they didn't want to risk incompatibility with cases.
> 
> I mean, sure on lower-end motherboards... but on the X99 stuff just come on. Do it.
> *snip*
Click to expand...

EVGA has been doing it for a while. I have seen some connectors go 90°, such as USB 3.0 headers and GPU auxiliary power (molex and 6-pin), but not the primary ones. Besides the usual answer of compatibility with cases, I'm sure pro benchers won't like this when they get review samples. It's a good idea imho.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd be down for that, should have a 4k monitor coming in the next couple of months so that'll be nice too


Cool

I'm really hesitant about benching @ 2k since AMD has a tendency to stumble at lower res, but I know not everyone has a WQHD or UHD monitor yet. Last year's 3dfanboy competition had some benches at 720p, and the 4/3-way scores were lower than single or dual. Crappy bench? Yeah, but AMD suffered more, and again it's down to these high-end GPUs AMD has focused on running at higher and extreme resolutions. We'll skip 3x2 WQHD Eyefinity fo shur.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Lets revisit this when DX 12 games using true multi-threading are in people's hands.
> 
> 
> 
> Now is always a good time. No excuses, run what you brung!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also interested in seeing how quads fair against tri-fire guys (295x2 + 290x). Quad Hawaii didn't get much praise like quad Tahiti did.
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> I do wish that Asus would put 90' on the 24 pin and all of the other connections at the edges. It looks super clean and is functional when there is something like a GPU trying to sit on top of it.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Only reason why I am gonna get Skylake-E when the hell ever it comes is because of DX 12
> 
> 
> 
> 
> 
> 
> 
> 
> Asus said that they didn't want to risk incompatibility with cases.
> 
> I mean, sure on lower-end motherboards... but on the X99 stuff just come on. Do it.
> *snip*
> 
> Click to expand...
> 
> 
> 
> Click to expand...
> 
> EVGA has been doing it for a while. I have seen some items go 90 such as usb 3.0 headers and gpu auxiliary power (molex and 6-pin), but not the primary ones. Besides the usual answer of compatibility with cases, I'm sure pro benchers won't like this when they get review samples. Its a good idea imho.
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'd be down for that, should have a 4k monitor coming in the next couple of months so that'll be nice too
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm really hesistent of benching @ 2k since Amd has a tendency of stumbling at lower res but I know not everyone has a wqhd or uhd monitor yet. Last year's 3dfanboy competition had some benches in 720p and scores of 4/3way were lower then single or dual (
> 
> 
> 
> 
> 
> 
> 
> ). Crappy bench? yeah, but amd suffered more and again its down to these highend gpu's amd has focused on running higher and extreme resolutions. We'll skip 3x2 wqhd eyefinity fo shur
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

Yeah, on the lower-res side of things my FX can't keep up with Trifire, but at 1440p for the 295x2 alone it's fine. I always had plans to go to 4k, so getting a 290x to throw in there was a no-brainer imo.


----------



## Thelrior

Hello Com,
I have a problem with Catzilla 720p. My results are about 11000 points, which is pretty low in my opinion.

My Rig:
i7 4790k @ 4.9GHz
GPU: R9 295x2 @ 1200/1625
RAM: Corsair Vengeance 1600 CL9
PSU: Leadex 1200W
OS: Win7 Pro

I only have that problem with Catzilla; my Firestrike score is 19100 with the same clocks.
Hope you guys can help me.
Regards, Thelrior


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Lets post some game benchies to compare cpu powah!? Really interested in what these intel/amd octo's and hexa's compare to the quadies we lesser beings have (
> 
> 
> 
> 
> 
> 
> 
> ).


Is this what we are doing this weekend? What games?

Gives me a reason to get my blocks on.


----------



## Alex132

I guess I could participate too, but we should stick to stock.

Ol' 2500k still has some grunt in her.... I hope


----------



## Sgt Bilko

Quote:


> Originally Posted by *Thelrior*
> 
> Hello Com,
> I've a problem with Catzilla 720p. My Results are about 11000 points which is pretty low in my opinion.
> 
> My Rig:
> I7 4790k @4.9ghz
> Gpu: R9 295x2 @1200/1625
> Ram: Corsair Vengeance 1600 cl9
> Psu: Leadex 1200w
> Os: win7 pro
> I only have that problem with Catzilla my firestrike score is 19100 with the same clocks.
> Hope u guys can help me.
> Regards Thelrior


Run Catzilla, hit Esc when the benchmark starts, then hit Run again.

For some reason Crossfire doesn't kick in the first time around. If you want better scores, set the Crossfire profile to AFR-friendly and disable tessellation.

That should fix it for you.


----------



## Alex132

Quote:


> Originally Posted by *Thelrior*
> 
> Hello Com,
> I've a problem with Catzilla 720p. My Results are about 11000 points which is pretty low in my opinion.
> 
> My Rig:
> I7 4790k @4.9ghz
> Gpu: R9 295x2 @1200/1625
> Ram: Corsair Vengeance 1600 cl9
> Psu: Leadex 1200w
> Os: win7 pro
> I only have that problem with Catzilla my firestrike score is 19100 with the same clocks.
> Hope u guys can help me.
> Regards Thelrior


Are you 100% sure your memory / core are stable? Overclock only one at a time.

And GDDR5 will correct any errors it can before crashing / failing. So you will think it's stable, and it looks stable, but you will get a lower FPS / score.

For example, I can run my memory at 1550MHz, but I get a lower score in 3DMark than at 1500MHz.
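The tuning loop implied above can be sketched in a few lines. This is purely illustrative: `find_best_mem_clock`, `set_mem_clock`, and `run_benchmark` are hypothetical stand-ins for whatever tools you actually use (e.g. MSI Afterburner for clocks, 3DMark for scoring). The point is that you select the highest-scoring clock, not the highest clock that merely survives, because error correction can mask instability as a silent score drop.

```python
# Hypothetical sketch of the tuning loop described above. GDDR5 error
# correction can make an unstable clock look "stable" while quietly
# costing performance, so we pick the best-scoring clock instead.
def find_best_mem_clock(clocks_mhz, set_mem_clock, run_benchmark):
    """Return (clock, score) for the memory clock with the best score."""
    best_clock, best_score = None, float("-inf")
    for clock in clocks_mhz:
        set_mem_clock(clock)             # stand-in for your OC tool
        score = run_benchmark()          # stand-in for a 3DMark run
        if score > best_score:           # score dropping past the sweet
            best_clock, best_score = clock, score  # spot = errors creeping in
    return best_clock, best_score
```

With Alex132's numbers, a sweep over 1400-1550MHz would land on 1500MHz even though 1550MHz "runs fine", because the 1550MHz score comes back lower.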


----------



## wermad

Catzilla 720 was a bad bench overall. That was the bench I got some weird results in during last year's 3dfanboy competition. My single 7970 score was higher than my 3- and 4-way scores, and this was the case for most AMD owners. It wasn't that bad for Nvidia owners, but I feel it might have contributed to their overall come-from-behind victory. A few of us voiced our concern that it should be excluded next time (even the nvidia crowd).


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> I guess I could participate too, but we should stick to stock.
> 
> Ol' 2500k still has some grunt in her.... I hope


So defaults on both CPU and GPUs?


----------



## wermad

Maybe we should do stock, medium oc, and max oc. SB 2500k @ 3.7 vs a 5.0 9590....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Thelrior*
> 
> Hello Com,
> I've a problem with Catzilla 720p. My Results are about 11000 points which is pretty low in my opinion.
> 
> My Rig:
> I7 4790k @4.9ghz
> Gpu: R9 295x2 @1200/1625
> Ram: Corsair Vengeance 1600 cl9
> Psu: Leadex 1200w
> Os: win7 pro
> I only have that problem with Catzilla my firestrike score is 19100 with the same clocks.
> Hope u guys can help me.
> Regards Thelrior


At 1100/1400 on two 290s I was scoring 22k graphics...

You probably have something clocked too high and it is throttling, or you are cranking your memory past its comfort zone...

My Hynix-based Tri-X 290 would do 1650MHz on the VRAM "stable"; however, it would score lower and lower the further past 1500MHz I went...

Also, 1200 core on a 295x2 is pretty unbelievable. What voltage are you using for that?

At those clocks you should be seeing 23-25k in FireStrike; to be so shy of that mark tells me something is wrong for sure.


----------



## Dagamus NM

Medium is somewhat difficult to quantify and probably offers little for comparison unless it is what we run 24/7 and just have for the heck of it. Stock and maximum should be useful though.

So what game benchmarks? What games have benchmarks optimized for AMD?


----------



## Agent Smith1984

Ooooohhhhh

OC'd 295x2's battle it out on different CPU platforms.... LOVE IT


----------



## Alex132

Quote:


> Originally Posted by *Agent Smith1984*
> 
> At those clocks you should be seeing 23-25k in FireStrike, to be so shy of that mark tells me something is wrong for sure.


My firestrike at 1200/1500 for reference http://www.3dmark.com/fs/4954418


----------



## ViRuS2k

BIOS tweaking, you gotta love it.
Check out my insane 2D idle voltages.


----------



## ViRuS2k

Quote:


> Originally Posted by *Alex132*
> 
> My firestrike at 1200/1500 for reference http://www.3dmark.com/fs/4954418


Something is wrong with your score there; I can get 18xxxx with just 1150/1600 speeds.

My results @ 1150/1600:

http://www.3dmark.com/fs/5408030

Graphics 25496
Score 18xxxx


----------



## Alex132

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> My firestrike at 1200/1500 for reference http://www.3dmark.com/fs/4954418
> 
> 
> 
> Something wrong with your score there, i can get 18xxxx with just 1150/1600 speeds.
> 
> my results @1150/1600
> 
> http://www.3dmark.com/fs/5408030
> 
> Graphics 25496
> Score 18xxxx
Click to expand...

2500k


----------



## ViRuS2k

Also, our BIOS tweaking lets you have UEFI.

I also added UEFI compatibility to the R9 295X2; you guys should check it out in the Guru3D forums.


----------



## Alex132

Quote:


> Originally Posted by *ViRuS2k*
> 
> Also our BIOS tweaking lets you have UEFI
> 
> 
> 
> 
> 
> 
> 
> 
> i also added UEFI compatibilty to R9295x2, you guys should check it out in the GURU forums


You're totally gonna have to tell me how to do this.

Wait, is UEFI vBIOS even important?


----------



## Dagamus NM

Quote:


> Originally Posted by *ViRuS2k*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bios tweaking, you gotta love it
> check out my insane 2d idle voltages


For some reason that image just makes me think of


----------



## ViRuS2k

Quote:


> Originally Posted by *Dagamus NM*
> 
> For some reason that image just makes me think of


It should have been resized by the forum, so dunno lol

Anyway, for BIOS tweaks guys, just check out the Guru3D forums and look for the "390x bios leaked" thread.


----------



## ViRuS2k

Quote:


> Originally Posted by *Alex132*
> 
> You're totally gonna have to tell me how to do this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wait, is EUFI vBIOS even important?


It's very, very important if you want to be able to use Secure Boot with UEFI for super fast Windows bootup.
It also matters for BIOS tweaking: MSI Afterburner does not work with Secure Boot alone, but it does when you BIOS tweak in combination with the UEFI addition ;D


----------



## Ironjer

The beast is alive new mobo asus x99 pro 3.1 testing


----------



## Alex132

Quote:


> Originally Posted by *Ironjer*
> 
> The beast is alive new mobo asus x99 pro 3.1 testing


Are you sucking hot air right onto the 295X2s at the front?


----------



## Ironjer

Yeah, 200 mm in front (cosmos ii)


----------



## Alex132

I meant more that the heat from the one 295X2 radiator is going straight into the 295X2's VRM fans.


----------



## Ironjer

Not much; I will install two fans in the side panel.


----------



## Dagamus NM

These VRMs get super hot. I had a 200mm side fan blowing right onto the sides and both radiators in push/pull exhausting out the front and rear of the case. With more spacing than you have between the cards, I could only get about halfway through a Heaven run before the cards started throttling due to heat. While your layout looks nice and clean, I suspect that you will have the same heat issues I had, if not worse.

I just finished installing my ek blocks.


----------



## xer0h0ur

You can retain the original cooler if you drop in some Fujipoly Ultra Extreme pads on the VRMs, and I am sure that if you do the manual fan control mod you will have an even better result.


----------



## Thelrior

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Run Catzilla, hit Esc when the benchmark starts then hit run again.
> 
> for some reason Crossfire doesn't kick in the first time around, if you want better scores then set the crossfire profile to AFR friendly and disable Tessellation
> 
> That should fix it for you


That Esc thing worked for me; my score is now 36381 points, which isn't that bad.
Thank you very much.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Thelrior*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Run Catzilla, hit Esc when the benchmark starts then hit run again.
> 
> for some reason Crossfire doesn't kick in the first time around, if you want better scores then set the crossfire profile to AFR friendly and disable Tessellation
> 
> That should fix it for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That Esc thing worked for me my score is now 36381 points which isnt that bad
> thank you very much
Click to expand...

You're welcome

It's a funny thing with Crossfire and Catzilla


----------



## Ironjer

Umm, I don't wanna spend more money on waterblocks etc. I hope two 120mm exhaust fans in the side panel help.


----------



## Ironjer

What do you think?


----------



## Dagamus NM

I think you should flip the 200mm at the front to exhaust and flip that rad too. You will still be heat-limited, but it won't be as bad.

On air I had the 200mm as intake, as you do, with the X2 rad up above it exhausting out of the 5.25" bays.


----------



## Ironjer

I did flip the radiator fan, but the heat was worse.


----------



## xer0h0ur

Unfortunately that is the flaw in the AIO cooler used for these cards. It was a half measure by only cooling the GPUs with it. At least AMD learned that lesson and applied it to the Fury X's cooling solution.


----------



## L1N3B3CK

Hi,

I'm planning to make an Mini ITX based build.

Is the 295x2 compatible with the Corsair RMi PSUs? I'll buy the RM1000i, and I didn't see any mention of it in the first post.

Thanks


----------



## electro2u

Quote:


> Originally Posted by *L1N3B3CK*
> 
> Hi,
> 
> I'm planning to make an Mini ITX based build.
> 
> Is the 295x2 compatible with RMi corsair psu ? I'll buy the RM1000i one and i didn't see any mention of it in first post.
> 
> Thanks


It's new and hasn't been verified yet, but it will work fine. You'll want to set it to single-rail mode using the Corsair Link software before putting the card under a heavy load.


----------



## L1N3B3CK

Thanks for the tip!

Here are the other parts of my future build. Any advice or other tips?

i7 4790k (with max oc)
Asus z97i-plus
2x8 ddr3 ram
Enermax liqmax ii 240
500gigs ssd and 1tb hdd
Corsair RM1000i
Phanteks enthoo evolv itx

I'm planning to play on a samsung u28d590d 4k monitor


----------



## tagaxxl

Guys, I got 2 Noctua NF-F12 PWM for push/pull, connected with extension cables to the motherboard, and using SpeedFan at 100% they only run at 1050-1100rpm.

Motherboard: MSI X79A-GD45


----------



## blue1512

Quote:


> Originally Posted by *L1N3B3CK*
> 
> Hi,
> 
> I'm planning to make an Mini ITX based build.
> 
> Is the 295x2 compatible with RMi corsair psu ? I'll buy the RM1000i one and i didn't see any mention of it in first post.
> 
> Thanks


For a new build I would suggest the Fury X or Fury X2 if your budget allows. Those cards are made for ITX.


----------



## Ragsters

There is a guy on my local craigslist who is selling 2 x 295x2 for $500 obo. Should I jump on this?


----------



## L1N3B3CK

Quote:


> Originally Posted by *blue1512*
> 
> For new build I would suggest FuryX or FuryX2 if your budget allows. Those cards are made for ITX


Well, I'd like to go with the Fury X2, but there is no more info about it, and the price will be overkill (certainly around 1400€). I can spend 800-900 max for the VGA(s). In this range, the 295x2 is the best for 4k.


----------



## crislevin

Quote:


> Originally Posted by *Ironjer*
> 
> What do you think?


The official recommendation is to flip the top fans.


----------



## Ironjer

If I flip the top fans, where is radiator 1 supposed to take fresh air from?


----------



## Dagamus NM

Leave the top as intake, flip the front 200 to exhaust, put the bottom 120 as exhaust from the case, and roll with it.

My waterblocks are in: parallel triple bridge between cards with the middle slot blocked out. The EK Supremacy Evo Elite Intel 2011 version is in. The EVGA 1600 P2 is in; running cables now. Then I just need to run tubing between these and my airplex modularity system and install drivers.

What games are we benchmarking? Does everybody have Bioshock Infinite?


----------



## BradleyW

A warning for those with Witcher 3 and the 1.07 patch: it reduced CFX scaling from 2x to 1.3x on my end.


----------



## xer0h0ur

But are the flashing textures gone and did this patch actually deliver better gameplay?


----------



## BradleyW

Quote:


> Originally Posted by *xer0h0ur*
> 
> But are the flashing textures gone and did this patch actually deliver better gameplay?


I had no flickering on 1.06. Can't see any difference on 1.07 other than less fps.


----------



## xer0h0ur

Well you seemed to be one of the people reporting no flashing textures. More people were saying they still had them in crossfire.


----------



## BradleyW

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well you seemed to be one of the people reporting no flashing textures. More people were saying they still had them in crossfire.


Well, in the end it did not work out for me. Patch 1.07 has broken the experience on my end.


----------



## xer0h0ur

You're not the only one. Most are reporting worse performance and it also extends to consoles.


----------



## BradleyW

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're not the only one. Most are reporting worse performance and it also extends to consoles.


I just tested 1.06 vs 1.07 in Oxenfurt (or whatever that town is called)... I got a 30fps reduction with enormous stuttering.


----------



## Mega Man

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> You're totally gonna have to tell me how to do this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wait, is EUFI vBIOS even important?
> 
> 
> 
> Its very very important, if you want to be able to use SECUREBOOT with UEFI for super fast windows bootup.
> and also bios tweaking as msi afterburner does not work with secureboot but it does when you bios tweak in combination with UEFI addition ;D
Click to expand...

here is one

http://www.club-3d.com/insights/thread/r9-295x-not-working-with-uefisecure-boot/


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> Medium is somewhat difficult to quantify and probably offers little for comparison unless it is what we run 24/7 and just have for the heck of it. Stock and maximum should be useful though.
> 
> So what game benchmarks? What games have benchmarks optimized for AMD?


Synthetic benches like 3DMark are optimized for, and favor, multi-GPU setups. Since there are already tons of places you can drop your 3DMark scores (as well as Valley/Heaven), I think it's more appropriate to keep it to game benches. I know most games don't have built-in benchmarks, but a few are out there.

As for a medium OC: take your stock (turbo) clock and your highest "sustained"/everyday clock (not talking about 5 minutes in a stripped Windows install to run a bench), add them, and divide by two. Mine does 4.6 with just some basic OC'ing and my stock turbo is 3.9, so my "medium" clock would be ~4.25GHz.
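The midpoint rule above is simple enough to express directly. A minimal sketch; `medium_oc` is a made-up helper name, and the clocks are just wermad's example numbers in MHz:

```python
# "Medium OC" midpoint rule described above: the average of the stock
# turbo clock and the highest sustained everyday clock.
def medium_oc(stock_turbo_mhz, max_sustained_mhz):
    """Return the midpoint between stock turbo and max everyday clock."""
    return (stock_turbo_mhz + max_sustained_mhz) / 2

print(medium_oc(3900, 4600))  # wermad's example: -> 4250.0
```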


----------



## TooManyAlpacas

Just a quick question: has anyone updated to the 15.7 drivers and run a Firestrike benchmark? As of the update my score is now 11xxx something. I don't know if it is a result of the driver or what is going on.


----------



## crislevin

Quote:


> Originally Posted by *Ironjer*
> 
> if i flip the top fans where supposed the radiator 1 fresh air take?


The points to be considered are 1) push and pull, and 2) hot air rises.

If your tower is standing up, the top fans blow air downward, which is against the rising hot air; it doesn't work well that way.

Did you test temperatures under load?


----------



## crislevin

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well you seemed to be one of the people reporting no flashing textures. More people were saying they still had them in crossfire.


For me, in-game flashing was gone if XF is in default mode and TXAA is manually turned off in the ini file. The flickering of static UI (inventory, map, etc.) is still there.

1.07 seems to be a mild disaster in motion; I wouldn't bother with it.


----------



## Alex132

So, it seems like DDU + 15.7 has fixed almost all of my non-MSI-AB-related crashes and BSODs!

That's good, but the question is when another driver will break it again.

Also, does anyone else play Project Cars here? I have been testing it out at 3200x1800 + 16x supersampling AA... and I am getting ~60fps? Is the supersampling AA even applying correctly? That seems like way too many FPS for that amount of AA + res...


----------



## xer0h0ur

Quote:


> Originally Posted by *TooManyAlpacas*
> 
> Just a quick question has anyone updated to the 15.7 drivers and ran a Firestrike benchmark. As of the update my score is now 11xxx something. I don't know if it is a result of the driver or what is going on


15.6 Firestrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
15.7 Firestrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score


----------



## BradleyW

Witcher 3 1.07 Patch

AMD CFX ruined.

http://www.overclock.net/t/1565638/witcher-3-1-07-patch-amd-crossfire-performance-warning


----------



## xer0h0ur

It also affects consoles, so whatever they did killed GCN's performance.


----------



## BradleyW

Quote:


> Originally Posted by *xer0h0ur*
> 
> Also affects consoles so whatever they did killed GCN's performance.


That would make sense.

Also, the fps shoots up to what it should be, for a split second now and again. It's like something is blocking it from scaling properly all the time.


----------



## tagaxxl

Guys, I got 2 Noctua NF-F12 PWM for push/pull.

Playing Witcher 3 on ultra it reaches 74c.

Any ideas to improve temps?


----------



## Dagamus NM

Thermal pads and TIM are your best bet at this point. Your setup looks optimal for the design, and yet you are struggling with heat issues.

I passed leak testing overnight with an external PSU on the pump. I hooked all my cables up and the system won't POST. It starts the process and then just stops, either at code 24 or 60 on the RIVE. It starts the POST process then just shuts down. I guess I will pull and reseat all PSU cables. Maybe reseat the GPUs; I installed them connected via the EK bridge. I really don't want to drain my loop and start at square one.


----------



## xer0h0ur

I like to live dangerously. I didn't leak test before installing. Turned out that a rotary fitting directly above my PSU leading to the PCI pass-thru to the external radiator was leaking pretty badly. I was lucky there is a metal shroud around the PSU or else I would have fried that sucker.


----------



## Ironjer

looks good.


----------



## Mega Man

Quote:


> Originally Posted by *crislevin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ironjer*
> 
> if i flip the top fans where supposed the radiator 1 fresh air take?
> 
> 
> 
> The points to be considered is 1) push and pull, 2) hot air rises.
> 
> If your tower is standing up, the top fans blow air downward, which is against the uprising hot air, it doesn't work well that way.
> 
> Did you test temperature under load?
Click to expand...

Please never say 2 again.

Hot air does not rise IN A PC CASE unless one of 2 things is true:

1) you run a COMPLETELY PASSIVE system

2) you have POOR airflow and need to adjust your fans (*or you want poor airflow?)

As you are talking about an air-cooled rad and the highest GPU available from AMD, I will assume you do NOT fall into 1 nor the "*" of 2.

If heat ever rises in your case I would recommend either better quality fans or better fan placement; air goes where you want it to, and the fans are what make that happen.

Also, stating that fans blowing heat down is a bad idea is false; I do it every day (I work in commercial HVAC).
@Ironjer

As to fan placement, intake (outside, fresh air) is always preferred on rads.

Always.

However you can exhaust just as easily; you won't lose that much performance in most cases, a few degrees.

So what do you want to do? It just comes down to personal preference.


----------



## Dagamus NM

Quote:


> Originally Posted by *xer0h0ur*
> 
> I like to live dangerously. I didn't leak test before installing. Turned out that a rotary fitting directly above my PSU leading to the PCI pass-thru to the external radiator was leaking pretty badly. I was lucky there is a metal shroud around the PSU or else I would have fried that sucker.


Well, I'll admit that there have been times that I have let it roll but with several rotaries I didn't want to risk anything. Looks like I still messed up somewhere.


----------



## Ironjer

What fans configuration do you recommend?


----------



## Mega Man

Personally, I always configure all fans in and one out.

I like positive pressure, and my cases have enough vents that the extra air will just escape.

I have never used that case though, and I am sorry I can't be more help :/
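The positive-pressure idea can be made concrete by comparing total rated intake airflow against total exhaust. A rough sketch, with made-up CFM ratings rather than measured values (real airflow drops once fans fight filters and radiators), and `net_case_airflow` is just an illustrative helper name:

```python
# Net case airflow from rated fan CFMs. Positive means excess air leaks
# OUT through case vents (helping keep dust out of unfiltered gaps);
# negative means air gets drawn IN through every crack. CFM numbers
# below are hypothetical manufacturer ratings.
def net_case_airflow(intake_cfm, exhaust_cfm):
    """Sum of intake ratings minus sum of exhaust ratings, in CFM."""
    return sum(intake_cfm) - sum(exhaust_cfm)

net = net_case_airflow(intake_cfm=[110, 110, 75], exhaust_cfm=[75])
print("positive" if net > 0 else "negative", "pressure:", net, "CFM")
```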


----------



## SAFX

Quote:


> Originally Posted by *Ironjer*
> 
> What do you think?


Your flow is not optimal and is likely creating turbulence.

General guidelines I follow:

1) Cool air, external to the case, for radiators
2) INFLOW through the front or bottom of the case
3) OUTFLOW through the rear and top of the case

Here's my setup.

Max GPU temp 64c, 100% GPU load in AIDA for 10 minutes.


----------



## Orivaa

Wouldn't top-inflow, bottom-exhaust make much more sense? As far as dust and such is concerned, I mean.


----------



## Mega Man

For dust it is really personal. The "best" is in at the bottom and out at the top, not due to heat rising, but because most people keep their PC case surrounded (i.e. stuff on the sides of the case).

By pushing the air up and out you are not recirculating air that just left your case back into it, but really it depends on your setup.

As for dust, it depends on the environment and where you have your PC: on carpet vs. on top of a desk, etc.


----------



## Dagamus NM

I am so frustrated right now. I pulled apart my loop and tried booting with only one card, and no luck. These cards heat up the blocks very quickly when the loop is empty, btw. So no go; the computer just shuts off during POST, often on different codes. I popped in an air-cooled (for now) reference 980 Ti and got further into the process. I shut it off before going into the OS because I could at least see the BIOS display. I tried one of the 295x2's after reseating all of the power cables and was able to get into the BIOS. I was updating settings, as I had cleared the CMOS, and it just shut off. Go to power back on and it shuts off within seconds.

I let it sit for about half an hour and it powers back on fine and tries to boot into the OS. I get the Windows logo, everything moves extremely slowly and freezes in bits, then it powers off.

This is where I am stuck now. After work today I will try using a different power supply. It seems like something is overheating or tripping some kind of protective shutoff mechanism. I wonder if this is something associated with this EVGA P2 1600W. Maybe the Lepa will work, as it worked before.

The only other thing I can think of is that the NZXT USB adapter fried that circuit. There seem to have been some weird issues regarding the RIVE and USB peripherals leading to RMAs.

Or perhaps in changing from the Corsair H110 to the EK Supremacy Evo Elite I killed my CPU; not likely, but if so then I will move on to my 4930k RIVE board that is currently collecting dust.

Or maybe I will walk away from X79 entirely and finish my X99 builds. This sucks.


----------



## Mega Man

That really stinks


----------



## ur4skin

Ok, so here's a pic of my system... I absolutely needed it to be white, so... blaaaa

Sorry for the cellphone-quality pics.


----------



## ur4skin

I have two 120mm AF intakes at the bottom, and the rest are all exhaust fans... my temps never go above 65c.


----------



## xer0h0ur

Quote:


> Originally Posted by *Dagamus NM*
> 
> I am so frustrated right now. I pulled apart my loop and tried booting with only one card and no luck. These cards heat up the blocks very quickly when the loop is empty btw. So no go, the computer just shuts off during post and often on different codes. I popped in an air cooled (for now) reference 980ti and got further into the process. I shut it off before going into the ps because I could at least see the bios display. I tried one of the 295x2's after reseating all of the power cables and was able to get into the bios. I was updating settings as I had cleared the cmos and it just shut off. Go to power back on and it shuts off within seconds.
> 
> I let it sit for about half an hour and it powers back on fine and tries to boot into os. I get the windows logo and everything is moving extremely slow and frozen for bits then it powers off.
> 
> This is where I am stuck at now. After work today I will try using a different power supply. It seems like something is over heating or tripping some kind of protective shutoff mechanism. I wonder if this is something associated with this eVGA P2 1600W. Maybe the Lepa will work as it worked before.
> 
> The only other thing I can think of is that the NZXT USB adapter fried that circuit. There seem to have been some weird issues regarding the RIVE and USB peripherals leading to RMAs.
> 
> Or perhaps in changing from the corsair h110 to the EK supremacy evo elite that I killed my cpu, not likely but if so then I will move on to my 4930k rive board that is currently collecting dust.
> 
> Or maybe I will walk away from x79 entirely and finish my x99 builds. This sucks.


Dang man, I feel your pain. I had some nasty pit-of-my-stomach feelings when my system wouldn't POST after I installed my EK block. I ended up finding a solder point with a big glob of solder making direct contact with the nickel plating on my block; I have pictures of it in my sig rig. I had to file it down. I know it's a PITA diagnosing stuff like this, but I hope you get it figured out.


----------



## Dagamus NM

Wow, how long did it take you to realize that was the problem? Sadly, it seems that either my motherboard or my CPU is dead. I cannot get past the Windows loading screen; the system just shuts down. I have been pushing this system for a couple of years, so I guess it is possible that the CPU failed from the forces of releasing one cooler and strapping the other on. I have both another RIVE and another X79 CPU, so I should be able to determine for sure what is wrong.

I think I will start with the motherboard swap. Dumb question, but if the motherboard is the same model and swapped will I have to reinstall windows or will I just pick up and move along?


----------



## Mega Man

You will have to reactivate


----------



## xer0h0ur

I actually lost count, but I think it was either the 3rd or 4th time taking the EK block off my 295X2 that I was going over the PCB with a fine-tooth comb and noted that glob of excess solder. At first I was sure I had static-shocked the card dead, or something to that effect. So I reinstalled the original Asetek AIO and the card booted and benched perfectly fine, so I went for another block mount, only to get the same no-POST crap. At that point I was scratching my head for a while. When I finally found it, I was a combination of mad at whoever assembled the PCB and glad I had identified a possible culprit. All in all it was an emotional roller coaster: from believing I had just wasted $1500+ to realizing it was all good.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> You're totally gonna have to tell me how to do this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wait, is EUFI vBIOS even important?
> 
> 
> 
> Its very very important, if you want to be able to use SECUREBOOT with UEFI for super fast windows bootup.
> and also bios tweaking as msi afterburner does not work with secureboot but it does when you bios tweak in combination with UEFI addition ;D
> 
> Click to expand...
> 
> here is one
> 
> http://www.club-3d.com/insights/thread/r9-295x-not-working-with-uefisecure-boot/
Click to expand...

Sorry for the late response, but do I simply download the BIOS that the rep linked and flash it to my GPU? I've never done this before... so yeah I have no idea what to use or how


----------



## joeh4384

Use this tool: ATIFlash.

http://www.techpowerup.com/downloads/2306/atiflash-4-17/

Guide: https://www.techpowerup.com/articles//overclocking/vidcard/34/1

Remember the card has a master and a slave BIOS: it runs a separate BIOS for each GPU, in addition to having the dual-BIOS switch. You need to flash both, and you can't flash the same BIOS image to both GPUs.


----------



## Alex132

Quote:


> Originally Posted by *joeh4384*
> 
> Use this tool ATI Flash.
> 
> http://www.techpowerup.com/downloads/2306/atiflash-4-17/
> 
> Guide: https://www.techpowerup.com/articles//overclocking/vidcard/34/1
> 
> Remember the card has a master and slave bios. It runs a bios for each GPU in addition to having the bios switch. You need both, you can't flash the same bios for both GPUs.


Thanks


----------



## wermad

Fired up an old classic, and my rig keeps shutting down after a short while. Annoying as heck, but I think it's loading up the rails pretty good on the G1600. Might go back to dual PSUs (V1000s) like when I had my quad 7970 Lightnings.


----------



## Medusa666

Quote:


> Originally Posted by *wermad*
> 
> Fired up an old classic and my rig keeps shutting down every after a short while. Annoying as heck but I think its loading up the rails pretty good on the g1600. Might go back to dual psu's (V1000s) like when I had my quad 7970 lightnings.


I had the same problem; I even reinstalled Windows because it was the driver forcing the BSOD / reboot.

The solution was to disable ULPS through the registry. Worked like a charm!
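For reference, the usual registry approach (what most third-party "disable ULPS" tools automate) is to set every `EnableUlps` DWORD to 0 under the display-adapter class keys and reboot. A sketch of one such entry; the `0000` subkey index varies per system, so check each numbered subkey that actually contains an `EnableUlps` value:

```
Windows Registry Editor Version 5.00

; Repeat for each numbered subkey (0000, 0001, ...) that contains EnableUlps
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like MSI Afterburner can also toggle ULPS without editing the registry by hand.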


----------



## wermad

It's the original Metro, also known to heavily load a power supply. Every other game runs perfectly fine. No BSOD, just the power cutting off; possibly over-current protection is kicking in. I need to find my Kill-A-Watt to get some readings. Reducing the settings does help, but everything at max shuts it down more frequently. I'm a little perplexed since I have 1600W, but I have to remember each rail is 30 amps and two rails are going to each card. I'll find out tonight or tomorrow. ULPS has been disabled for a while. I'm still on 14.5 and CCC bugs me to upgrade to 15.1 (I'll take care of that later). Gonna install Crysis 3 and see how much juice that one uses up.


----------



## Dagamus NM

Man oh man oh man. I am excited and bummed at the same time.

So I had come to assume that either my CPU or my motherboard was bad, and as the 3930Ks are pretty robust, I assumed it was my board. My PCH fan had stopped working a while back (which was actually a good thing, as it is quite annoying (RIVE)), so I told Asus about my problems and they issued an RMA. As I was going through all of my boxes tearing the last parts down, I was about to pluck out the processor. I unbolt the waterblock and see that the thermal paste is in the exact same configuration as when I applied it. ***?? I look back through the EK instructions and see that they say to thread only the M4 standoffs into the motherboard, place the waterblock, and tighten nuts over the threaded standoffs with springs underneath.

The box had two sets of standoffs in it, both m4. One long and one short. Without thinking about it I grabbed the ones that looked like the picture and everything seemed to bolt up just fine. Not like I could see the space between the block and CPU inside this POS case I am using for this old setup.

Needless to say, I swapped everything and hooked up a power supply, lazily hanging out of the case (not the one I plan on using, just what was closest), as well as the 980 Ti on the desk.

I am typing this message while using it so everything is just fine. Now to drain the CPU only loop that is connected and reconfigure my two 295X2's back into the loop. Well, not today.

So I am happy yet feel like a total DA.


----------



## wermad

Probably used the LGA115x hardware; hey, it's an honest mistake. Still, run through some CPU stability testing (stock) to make sure you don't need the RMA. Also, with no fan working, you can still RMA. Or just find a RIVE block; I just missed a Koolance RIVE block for $50. I was debating picking up another RIVE since I'm not too happy with the Z97, tbh.


----------



## xer0h0ur

Quote:


> Originally Posted by *Dagamus NM*
> 
> Man oh man oh man. I am excited and bummed at the same time.
> 
> So I had come to assume that either my cpu was bad or my motherboard, as the 3930k's are pretty robust I assumed that it was my board. My PCH fan had stopped working a while back (which was actually a good thing as it is quite annoying (RIVE)) so I told Asus about my problems and they issued an RMA. As I was going through all of my boxes and stuff tearing the last parts down I was about to pluck out the processor. I unbolt the waterblock and see that the thermal paste is in the exact same configuration as it was when I applied it. ***??I look back through the EK instructions and see that they say only to use the m4 standoffs into the motherboard, place waterblock, tighten nuts over threaded standoff with spring underneath.
> 
> The box had two sets of standoffs in it, both m4. One long and one short. Without thinking about it I grabbed the ones that looked like the picture and everything seemed to bolt up just fine. Not like I could see the space between the block and CPU inside this POS case I am using for this old setup.
> 
> Needless to say, I swapped everything and hooked up a power supply lazy and hanging out of the case (not the one I plan on using, just what was closest as well as the 980Ti on the desk.
> 
> I am typing this message while using it so everything is just fine. Now to drain the CPU only loop that is connected and reconfigure my two 295X2's back into the loop. Well, not today.
> 
> So I am happy yet feel like a total DA.


Well, glad to hear that ultimately it wasn't a serious issue after all and that you managed to figure it out. Those a-ha moments are great, even when the cause is something of your own doing.


----------



## Dagamus NM

That is exactly what happened. Still I am pretty pissed about it, especially after hearing my wife go on and on about me breaking a computer that to her was working just fine.

I have Seross's waterblocks coming at some point soon for the motherboard. An EK LE or full kit, I don't remember, probably a full kit; it was $100 shipped. So yes, I could RMA, but I would rather not deal with the hassle.

I will test further, but everything seems fine at the moment. It was shutting down because the CPU was overheating. Doesn't Intel cut it off at 90°C? This explains why I could almost get through the Windows screen if I left it be for a while.

Not all bad though: I made considerable progress on one of my Caselabs SM8 X99 builds, and my Dimastech EasyXL FX-9590 build is ready to wire up. Extensive leak testing is done. When I get around to posting pictures, the number of fittings in it is crazy, but surprisingly none of the rotaries leaked.

I am still trying to figure out what monitors I want to run. I would like 1440p, 144Hz, 1ms, 23-25". Know of anything that fits that bill? 27" is too big for what I am doing. I have two 28" Asus monitors and they are just stupidly big. I think 24" would be perfect; then I would run three.


----------



## gatygun

Quote:


> Originally Posted by *wermad*
> 
> its the original metro, also known to heavily load a powersupply. Every other game runs perfectly fine. No bsod, just power cuts off, possibly over-voltage protection is kicking in. I need to find my Kill-A-Watt to get some readings. Reducing the settings does help, but everything to max shuts it down more frequently. I'm a little perplexed since i have 1600w but I have to remember each rail is 30amps and two rails are going to each card. i'll find out tonight or tomorrow. Ulps has been disabled for a while. I'm still on 14.5 and the ccc bugs me to upgrade to 15.1 (ill take care of that later). Gonna install Crysis 3 and see how much juice that one uses up.


Original Metro is probably still the most taxing game there is to this day, especially if you max it out. Far more than its remaster.

So if you're going to get power supply issues, that's surely the game where you'll notice them.

The 295X2 needs 2x 28 A with a combined 50 A, so 2x 30 A should work fine.

No clue how the OC is going to affect this, though, as I find 28 A for each 290 GPU kinda lowish. My 290, for example, has a 28 A + 18 A solution, so it has far more juice to work with.
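The rail math above can be written down as a quick sanity check. A minimal sketch: the 28 A per connector / 50 A combined figures are the commonly quoted 295X2 requirements rather than official per-SKU specs, and the function is purely illustrative.

```python
# Toy sanity check for feeding a 295X2 from a multi-rail PSU.
# Commonly quoted card requirements (illustrative, not official specs):
PER_CONNECTOR_A = 28.0   # minimum amps on each 12 V rail feeding an 8-pin
COMBINED_A = 50.0        # minimum combined amps across both connectors

def rails_cover_card(rail_a: float, rail_b: float) -> bool:
    """True if two 12 V rails meet both the per-rail and combined minimums."""
    per_rail_ok = rail_a >= PER_CONNECTOR_A and rail_b >= PER_CONNECTOR_A
    combined_ok = (rail_a + rail_b) >= COMBINED_A
    return per_rail_ok and combined_ok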


----------



## OzGoD

My dual XFX 295X2 rig has been running without a skip for almost a year now. Decided to do a few runs on 3DMark 11 Performance; ended up at rank 51 in the Hall of Fame (user InFl1cT!0N).
Will buy Fire Strike, maybe tweak a little something, and see what I can come up with. In a lot of the pics I see, the cards are so close together; I, however, have nice spacing, with a PCIe drive between the cards even. No crashes, performance issues, or heat problems with the current 15.x drivers. My cards run stock coolers, no waterblock mods.

http://www.3dmark.com/3dm11/10065149


----------



## OzGoD

I can run 1600 MHz all day with no other tweaks in Catalyst; I pushed it up a little, to 1615, for that test run. It can do 1625 but drops a little score, so it doesn't like 1625. Airflow is good and all seems to run fine otherwise. I run 4.5 to 4.6 GHz daily; 4.8 GHz was pushed up for the 3DMark 11 runs.


----------



## OzGoD

I still have the beautiful 16GB Corsair GT Hyper 1600MHz @ 6-6-6-20 kit lying around on my desk, waiting to enter a build. SmokinWaffle helped hook me up to complete the set; much appreciated.


----------



## xarot

Sorry if asked before. Recently I asked about this but got zero replies.

How's Witcher 3 with AMD's latest drivers? Does it work yet in Crossfire mode? With 15.5 people still had a lot of issues, and some said that with the latest drivers Crossfire is disabled for that game. Lol.


----------



## xer0h0ur

Quote:


> Originally Posted by *xarot*
> 
> Sorry if asked before. Recently I asked about this but got zero replies.
> 
> How's Witcher 3 with AMD's latest drivers? Does it work yet in Crossfire mode? With 15.5 people still had a lot of issues and some said that with latest drvers Crossfire is disabled for that game. Lol.


Wrong place to be asking these questions. Go to the driver threads.


----------



## xarot

Quote:


> Originally Posted by *xer0h0ur*
> 
> Wrong place to be asking these questions. Go to the driver threads.


Not exactly. I am debating whether or not I should make a deal on another Ares III. But BradleyW has already posted a link to his new thread in this thread: http://www.overclock.net/t/1565638/witcher-3-1-07-patch-amd-crossfire-performance-warning

It seems it's not worth it and probably never will.


----------



## xer0h0ur

Crossfire performance was working perfectly fine before that patch, but by all means blame AMD's drivers instead of CD Projekt Red's 1.07 patch ruining it. Par for the course.


----------



## OzGoD

I will say this as far as drivers go: I had Crossfire running on my Asus WS on 16x lanes at Gen3. I reinstalled Win 7 x64 and used the new 15.x drivers, and Crossfire would no longer activate or show up. I had to change the setting to Gen2, and Crossfire showed up on reboot and activated. It's showing 16x Crossfire, so I'm not digging into it any further. My 3DMark 11 score is on Gen2 16x; if I change back to Gen3 and reboot, it drops off and will not show Crossfire as an option in Catalyst. Only Gen2 is working on the 15.x drivers?

@xarot I do not have Witcher 3 to do any tests sry.


----------



## xer0h0ur

Quote:


> Originally Posted by *OzGoD*
> 
> I will say this as far as drivers go> I had crossfire running on my ASUS WS on 16x lanes , I had them on GEN3 , I reinstalled win 7 x64 and used the new 15 drivers. Crossfire would no longer activate or show up. I had to change settings to GEN2 and crossfire showed up on reboot and activated. Its showing running at 16x crossfire so Im not digging into it any further. My 3Dmark 11 score is on Gen2 16x , IF I change back to gen3 and reboot it drops off and will not show crossfire as a option in catalyst. Only gen2 is working on 15 drivers?
> 
> @xarot I do not have Witcher 3 to do any tests sry.


That is a problem on your system. I have never had to change from PCIe 3.0 to 2.0 for any driver change. I suggest trying a driver reinstall following BradleyW's manual driver-removal guide.


----------



## xarot

Quote:


> Originally Posted by *xer0h0ur*
> 
> Crossfire performance was working perfectly fine before that patch but by all means you can blame AMD's drivers instead of CD Projekt Red's 1.07 patch ruining it. Par for the course


I am having a hard time trying to read you. I don't know why you think I am blaming the drivers. It is a problem regardless of drivers if developers are not interested in keeping games playable on AMD hardware utilizing Crossfire. I couldn't find any up-to-date videos with the latest drivers, and that's why I am asking here, since that is the only game I play. I have only found reports of constant flickering since the game launched, including some where it was still happening with the 15.5 drivers and patches older than 1.07. I'm quite sure the best feedback could come from actual 295X2 owners, but I was wrong. This question is now


----------



## Alex132

Does anyone have a UEFI BIOS to share? And ways to edit it?

The only one I can find is this, but its extension is ".zi_"


----------



## xer0h0ur

Quote:


> Originally Posted by *xarot*
> 
> I am having a hard time trying to read you. I don't know why you think I am blaming the drivers. It is a problem regardless of drivers if developers are not too interested in keeping games playable with AMD's hardware utilizing Crossfire. I couldn't find any up to date videos with latest drivers and that's why I am asking here since that is the only game I play. I have only found information about constant flickering since the game was launched and some reports where it was still happening with 15.5 drivers and older than 1.07 patch. I'm quite sure the best feedback could come from actual 295X2 owners, but I was wrong. This question is now


What is there to read into? You basically blamed AMD's drivers for the drop in Crossfire performance, even though in his thread he explicitly said it was caused by the 1.07 patch.

Some people were still reporting flashing textures, and a few people said they didn't have it anymore. Either way, that isn't the issue reported by the person you were referencing to begin with. They reported that Crossfire scaling went down the toilet with the 1.07 patch, and for the record, performance also dropped on consoles with this patch, so whatever CD Projekt Red did seemingly managed to affect the whole GCN family (crossfired or not). I don't follow the Nvidia forums, so I don't know whether the patch also adversely affected their performance or SLI scaling.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xarot*
> 
> Sorry if asked before. Recently I asked about this but got zero replies.
> 
> How's Witcher 3 with AMD's latest drivers? Does it work yet in Crossfire mode? With 15.5 people still had a lot of issues and some said that with latest drvers Crossfire is disabled for that game. Lol.


15.7 drivers and patch 1.06: Crossfire and Trifire work great.

I haven't downloaded the newest patch yet because of all the negative reports; people running AMD hardware are having a lot of issues.


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 15.7 drivers and patch 1.0.6 Crossfire and Trifire work great.
> 
> I havent downloaded the newest patch yet because of all the negative reports for people running AMD hardware are having alot of issues.


I get more fps and smoother gaming with CCC 15.5 + Witcher v1.06. v1.07 + CCC 15.7 is a nightmare.


----------



## casier

Hi everybody,

I received my 295X2 today and I'm pretty happy with it. However, I noticed that when gaming its temperature reaches 75°C, which, as you know, has the consequence of dropping the GPU frequencies to 800 MHz... a big loss of performance.

I'm pretty angry about that, because my case is well ventilated (it's a be quiet! Silent Base 800), and also because 290 GPUs can easily reach 90°C without any risk, so why the hell did they put this limit so low?

I have two questions:

1) Has anybody here tried to find (or make?) a BIOS for the 295X2 that just raises this 75°C limit to, say, 80°C? That would be enough to be OK in most cases, I guess...

2) Do you guys have any solution to this problem? I thought about push-pull, but my motherboard (Asus X99 Deluxe) has stupid plastic covers at its top that make push-pull impossible, for want of only one centimeter; pretty angry about that too... But I'm not sure a push-pull configuration would be enough to fix this 75°C issue anyway...?

I'm still happy with this card, but I'm a bit sad to have to use a limited 295X2 just because of this 75°C limit... Open to any advice.

Thanks in advance.


----------



## Alex132

Quote:


> Originally Posted by *casier*
> 
> Hi everybody,
> 
> I received my 295x2 today, and I'm pretty happy about it. However, I noticed that when gaming, its temperature reach 75°, which as the consequece, as you know, to decrease GPU frequencies to 800Mhz... big loss of performance then.
> 
> I'm pretty angry about that, because my case is well ventilated (its a Be Quiet Silent Base 800), and also because 290 GPUs can easily reach 90° without any risk, so why the hell did they put this limit so low ?


GPUs can, pumps can't. Even 75°C is pretty risky for pumps.

Quote:


> Originally Posted by *casier*
> 
> 1) Did anybody here tried to fin (make ?) a BIOS for the 295x2 which just change this 75° limit for say 80° ? Would be enougth to be ok most of the case I guess...


Bad idea. See above.

Quote:


> Originally Posted by *casier*
> 
> 2) Do you guys have any solution about this problem ? I thought about a push-pull, but my motherboard (asus x99 deluxe) has stupid plastic covers at its top that make the push-pull impossible, for only one centimeter ; pretty angry about that too... But not sure the push-pull configuration would be enough to fix this 750 issue however...?


Try push/pull with good fans, and open the side panel to check whether you are actually getting enough air intake.


----------



## casier

Oh, right, I didn't think about the pump, in fact. All right, bad idea then; too bad, that would have been the best way.

For push-pull I'm pretty stuck for the moment, and I really don't know what to do, as the pull fan cannot fit due to the plastic covers at the top of the X99 Deluxe (genius idea, really). The only solution I have is to unmount everything and try to remove those aesthetic covers :-/

The only thing I know is that when I put the rad outside, the temp stabilizes at 65°C, and when I put it at the top of the Silent Base 800, it reaches 75°C (which seems pretty normal: the rad is inside the case, which is pretty hot...)


----------



## Alex132

Quote:


> Originally Posted by *casier*
> 
> Oh, right, I didn't tought about the pump in fact. All right, bad idea then, too bad, that would have been the best way.
> 
> For push-pull I'm pretty embarassed for the moment, really don't know how to do, as the pull fan cannot pass due to plastic covers of the X99 deluxe, at its top (genius idea, really). The only solution I have is to unmount everything :-/
> 
> The only thing I know is that when I put the rad outside, the temp stabilizes at 65 degrees, and when I put the WC at the top of the Silent base 800, it raches 75 degrees (that seems pretty normal : the rad is inside the case, which is pretty hot...)


The case shouldn't be hot; that means you are dumping heat inside the case faster than you can exhaust it.


----------



## casier

Well, the warm air is due to the CPU WC and the 295X2 WC at the top of the case, with air going from outside (to get fresh air for the WC) to inside. Then, to evacuate, there are 2x 120mm at the front in push and one 120mm at the back in pull; all three fans run at 100% when gaming.


----------



## Mega Man

But are they just "fans", or are they good-quality fans with a good P-Q (pressure vs. airflow) curve?
Quote:


> Originally Posted by *xarot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Crossfire performance was working perfectly fine before that patch but by all means you can blame AMD's drivers instead of CD Projekt Red's 1.07 patch ruining it. Par for the course
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am having a hard time trying to read you. I don't know why you think I am blaming the drivers. It is a problem regardless of drivers if developers are not too interested in keeping games playable with AMD's hardware utilizing Crossfire. I couldn't find any up to date videos with latest drivers and that's why I am asking here since that is the only game I play. I have only found information about constant flickering since the game was launched and some reports where it was still happening with 15.5 drivers and older than 1.07 patch. I'm quite sure the best feedback could come from actual 295X2 owners, but I was wrong. This question is now
Click to expand...

I can try to test for you tonight
Quote:


> Originally Posted by *Alex132*
> 
> Does anyone have a UEFI BIOS to share? And ways to edit it?
> 
> The only one I can find is this, but its extension is ".zi_"


I don't use it, but IIRC they don't allow zip files; change the underscore to a 'p' and use 7-Zip (or whatever) to extract it.

I have been meaning to use it, but ATM I'm transferring 0.5TB of movies and my Wi-Fi is pretty crappy in that room.

After I can power cycle I will
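The rename trick above (forum attachments often ban `.zip`, so the file arrives as `.zi_`) is just: put the 'p' back, then extract with any zip tool. A minimal Python sketch of the same steps; the filename is hypothetical:

```python
import shutil
import zipfile

def recover_renamed_zip(path: str, dest: str) -> list:
    """Rename a '.zi_' download back to '.zip', extract it, list its contents."""
    if path.endswith(".zi_"):
        fixed = path[:-1] + "p"   # ".zi_" -> ".zip"
        shutil.move(path, fixed)
        path = fixed
    with zipfile.ZipFile(path) as zf:   # 7-Zip or any zip tool works too
        zf.extractall(dest)
        return zf.namelist()
```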


----------



## Mega Man

Delete


----------



## casier

The fans are be quiet! Pure Wings 120mm; rather good fans, I think.


----------



## BradleyW

For those wondering about AMD Crossfire and Witcher III v1.07.
http://www.overclock.net/t/1565638/witcher-3-1-07-patch-amd-crossfire-performance-warning


----------



## gatygun

Quote:


> Originally Posted by *OzGoD*
> 
> My dual XFX 295x2 rig has been running without a skip for almost a year now. Decided to do a few runs on 3DMark 11 performance. Ended up at Rank 51 on the hall of fame User InFl1cT!0N
> Will buy firestrike and maybe tweak a little something and see what I can come up with? A lot of the pics i see the cards are so close? I however have nice spacing with a pci-e drive in-between cards even. No crashes or preference issues with current 15 drivers nor heat problems. My cards run stock coolers no water block mods.
> 
> http://www.3dmark.com/3dm11/10065149


50k graphical points, your system should be illegal.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> 50k graphical points, your system should be illegal.


Honestly, if you've got the power, running two 295X2s in quadfire at roughly $600 a piece right now is hands down the best value in PC graphics.

Not to mention you get water cooling and only need two PCIe slots.


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, if you've got the power, running 2) 295x2's in quadfire at roughly $600 a piece right now is hands down the best value in PC graphics.....
> 
> Not to mention you get water cooling and only need two PCI-X slots.....


Yeah, so true: four GPUs with 4x water cooling. That's like $300 a GPU; not bad at all, indeed.


----------



## Dagamus NM

It was a shame to waste my stock coolers, but considering that I would throttle halfway through any benchmark, I just couldn't do it anymore.

I can't work on my computers this weekend (wife's orders, it is her birthday wish), but Monday it is on. I will get my system reconfigured now that it is working. Actually, three of them are very close to working, but only the one has X2s, so that is the one I will discuss in this thread.

So yeah, I tossed the coolers in the box along with the stock fans and the Corsair SP120s I used for two weeks. They just weren't enough.

So between a stable power source, waterblocks, and complementary cooled components, I should be able to post some decent scores.


----------



## wermad

Still shutting down. Even if I disable Crossfire. OK, so I took off the C19 extension; no difference. Will check the power strip. The only other things I can think of are the extensions I'm using (I bought some pins to make some custom 16AWG ones) and/or that I'm overloading the rails. I need to dig out my Kill-A-Watt.


----------



## SAFX

Who's upgrading to Windows 10?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Who's upgrading to Windows 10?


Gonna wait till the end of next month, just to avoid any zero-day issues and the like.


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Gonna wait till end of next month. Just to avoid any zero-day issues and the like.


Wimp. I'm upgrading my desktop to the Windows 10 alpha release, no updates, and virus scan disabled!


----------



## xer0h0ur

I too like to live dangerously


----------



## Medusa666

Noobish question guys, but I do not have any PWM fans at home to test with;

Going to replace the stock radiator fans on my two R9 295X2 cards; will a fan with a PWM connection work? I want PWM so I can swap them around in the case if needed, and I think PWM is better value for the money than a 3-pin fan.

Thanks!


----------



## xer0h0ur

If you mean direct replacement while leaving them connected to the cards so they control the fan speeds then no you can't just plug in a PWM fan into it. If you meant replacing the fans with PWM fans and controlling them yourself then that is always fair game.

3-pin fans are controlled by varying the supply voltage, while PWM fans run at a fixed voltage but modulate the pulse width (duty cycle), switching the fan on and off at different intervals to ultimately speed the fan up or slow it down.
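The distinction can be sketched as a toy model: voltage control scales speed with the supply voltage (and stalls below a threshold), while PWM control holds a constant 12 V and scales speed with the control signal's duty cycle. All the numbers here (4 V stall point, 400-2000 RPM range) are illustrative assumptions, not specs of any real fan.

```python
def rpm_voltage_control(v_supply: float, v_max: float = 12.0,
                        v_stall: float = 4.0, rpm_max: float = 2000.0) -> float:
    """3-pin style: speed scales roughly linearly with supply voltage,
    and the motor stalls below some minimum voltage (assumed 4 V here)."""
    if v_supply < v_stall:
        return 0.0
    return rpm_max * min(v_supply, v_max) / v_max

def rpm_pwm_control(duty: float, rpm_min: float = 400.0,
                    rpm_max: float = 2000.0) -> float:
    """4-pin style: constant 12 V supply; the duty cycle (0..1) of the
    control signal sets speed between an idle floor and full speed."""
    duty = max(0.0, min(1.0, duty))
    return rpm_min + (rpm_max - rpm_min) * duty
```

This is also why plugging a PWM fan into a voltage-controlled header is iffy: the fan expects a steady 12 V plus a separate control signal, not a varying supply.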


----------



## Medusa666

Quote:


> Originally Posted by *xer0h0ur*
> 
> If you mean direct replacement while leaving them connected to the cards so they control the fan speeds then no you can't just plug in a PWM fan into it. If you meant replacing the fans with PWM fans and controlling them yourself then that is always fair game.
> 
> 3 pin fans use a constant voltage modulation to control fan speed while PWM fans use a set voltage but modulate the pulse width so its turning the fan on and off at different intervals to ultimately speed up or slow down the fan speed.


So I assume that means that if I connect PWM fans to the connector on the card, they will run at full speed constantly?


----------



## xer0h0ur

Honestly, offhand I can't say what would happen, since I have never connected a 4-pin PWM fan to a 3-pin header. I just know it's not a good idea to vary the voltage on a PWM fan, as that can screw it up.


----------



## Medusa666

So ka
Quote:


> Originally Posted by *xer0h0ur*
> 
> Honestly off hand I can't say that I know what would happen since I never have connected a 4-pin PWM fan to a 3-pin header. I just know its not a good idea to change the voltage on a PWM fan as that can screw it up.


So maybe the best solution would be not to use the card's fan connector at all, and instead run all the fans for the graphics cards on a fan controller / fan hub?

Going to do push pull with Noctua fans so I do not mind if they run at full speed constantly.


----------



## xer0h0ur

If you want to use PWM fan you really don't have a choice. It has to be connected to either a fan controller or use a PWM header on your motherboard.

For the record Noctua does make 3-pin variants. For instance I am using six 3-pin 2000RPM Noctua iPPC NF-F12's connected to a Phanteks fan hub which in turn is connected to a PWM header so that when I adjust the PWM signal then my hub modulates the voltage to the 3-pin fans. In your case however that hub would be entirely unnecessary as if you connected 3-pin Noctuas to your 295X2's then it would control the fan speed equally the same as the original fans. Presumably by temperature.

Honestly though I wouldn't take that chance. If you can use PWM fans and control them manually then just go that route. I always prefer manual control of my fans rather than leaving it up to some piece of hardware to choose when to kick up the fan speed.


----------



## Medusa666

Quote:


> Originally Posted by *xer0h0ur*
> 
> If you want to use PWM fan you really don't have a choice. It has to be connected to either a fan controller or use a PWM header on your motherboard.
> 
> For the record Noctua does make 3-pin variants. For instance I am using six 3-pin 2000RPM Noctua iPPC NF-F12's connected to a Phanteks fan hub which in turn is connected to a PWM header so that when I adjust the PWM signal then my hub modulates the voltage to the 3-pin fans. In your case however that hub would be entirely unnecessary as if you connected 3-pin Noctuas to your 295X2's then it would control the fan speed equally the same as the original fans. Presumably by temperature.


Thanks for a great and informative reply sir.

Thing is I just bought these fans so getting new 3 pins is not a viable option.

It won't hurt the cards if I don't use their fan connector then, other than the obvious fact that the fan speed will be regulated by the PWM hub instead of by the gpu temps?

Do you yourself use the cards connector?


----------



## xer0h0ur

Sorry, I edited my post after the fact so you didn't see that I wouldn't recommend you using those 3-pins fans anyways as I can't guarantee how well they would perform with the 295X2's controlling the fan speed on fans that are significantly slower RPM-wise than the original fans. So in other words you're already golden as is in my book with those PWM fans.

As for your question, no I don't use the cards for fan control at all. I have EK waterblocks on my 290X and 295X2. All 7 of my radiator fans are manually being controlled by one PWM header. One PWM header is connected to the Phanteks hub that in turn controls my six 3-pin fans and the 7th fan is connected by a Noctua Y-adapter that came with one of my older Noctua NF-F12s (the original 1200 RPM flavor).


----------



## wermad

Found my rig power-less this morning, sigh. Finally fixed my issue (I think). I remember having this issue with my previous case. I swapped the cpu and mb harness from the Lepa unit with Cooler Master V series cables. One set in particular has two thick cables joined together. The Lepa had a splice a few inches before it hit the psu plug. The CM doesn't have this, so I ended up crimping two wires with one pin. No biggie, I've done this a bunch of times and some cables come like this.

Well, these wires came undone once and I had to redo them. Unfortunately, everything was in place and extremely tight, and after a couple of attempts, I got the pin in place. This pin had slowly been working its way loose from the wires. I guess I may have nudged it enough to loosen both wires when I was working on the second system in my TX10-D. It was a pain to get to as the psu's are located in the middle to clear the two rads next to them. I got lucky that I only had to undo a rad bracket from one side to access the cable. Got it out and noticed one wire was completely out and the other sitting barely touching the pin inside the plug. The pin was crushed from the bottom, probably from when I jammed it in during the previous repair. This time, I got the whole thing out and properly "combed" and twisted the wire cores together. I was uber lucky I had one eps pin w/ long crimp wings. I usually snip those shorter since they're not practical with single wires, but they do the job with two wires. Crimped them on and used the fancy crimp tool to finish it off. It went into the plug smoothly this time and everything fired up after that.

I did double check my pcie cables were good but it dawned on me it didn't matter as each one had its own rail. I just downloaded 15.7 and I'll launch a few games to see how it copes.

Oh, I got a free upgrade to win10, but I'll wait for the early adopters on how it does for gamers.

edit: never mind, still shuts down when I play games. Gonna disable crossfire and one of the cards and I'm hoping that will narrow things down. At first, it was only Metro 2033, now it's any game I launch. Power gets cut and it restarts. Gonna check out Fry's if they have any deals on a lga1150 board.

edit #2:

Googled the symptoms and the psu is coming up as the prime suspect. I'll see if I can pick up a cheap unit from frys.

edit #3: tested each card on its own (unplugged second card). Same thing running one card at a time, which brings a huge sigh of relief. I'm down to the psu or the mb. Found a 1000w @ frys I may get if they still have one (~$50).


----------



## wermad

Sorry to double post. I missed out on the Zalman 1000w unit, but it wasn't a total loss as it was a 2-rail unit @ 25 amps each (though $50 is a bargain!). Anyways, I picked up a Lepa 1000w bronze unit (83amp single rail) and temporarily connected my system and one card to it. Doing a cpu test right now (stock clocks) and so far after 10 minutes it's held. Ran 3DMark11 and Firestrike (Extreme & 4k) with the first card and it went fine. One thing I wanted to point out is that ppl say to stay away from the pig-tail pcie connectors. Just for the sake of testing, I'm using one harness per plug. So, it looks like it's the psu. The good news is that I can rma it, the bad news is Lepa (/Enermax) is very slow in processing rma's. Not sure if it was due to the mining craze last year, but they took over a month and they're also slow in responding to emails. Well, I have 30 days to return the Lepa; I may keep it for a week or two to make sure everything runs fine. Honestly, I'm seriously thinking of just getting a V1000 + V850 and calling it a day.

I'll be pulling the g1600 and I'll bypass the extensions to see if that makes a difference.


----------



## Dagamus NM

Bummer on your luck. Do you think it is the core unit or the cables?

I am not a fan of the cables in general, but especially for the builds with two 295X2's. I know you cleaned yours up by removing excess, I didn't feel like going that route.

So I bought three of these Lepas thinking they would be a good option. I must say that I like their footprint, the eVGAs that I got to replace the Lepas have a much larger footprint. So I will use one of the Lepas for my quad 6GB 280x build, maybe one for my quad 7970 build, and probably two of four eVGAs for my twin x99 builds. This leaves one Lepa, a platimax 1350, a rosewill 1300, and two eVGAs to collect dust in the closet.

I am a shameless hoarder.


----------



## wermad

Some had issues and it turned out to be the extensions. I'm using some to extend the cables as the TX10D makes it a pita for the already short Lepa gpu cables to reach. I have some pins and connectors coming in from ppcs.com. If indeed its the extensions, I'll be making some custom cables instead (using black 16 awg wire).

But it ran fine for a month. I'm leaning toward the unit being the issue. It's an ex-miner so I'm sure it's a bit tired. I had to keep bugging the seller for the receipt as I knew ex-mining gear would give out pretty soon. We'll see, some more testing is needed. I need to pull out the Lepa but it's a very tricky maneuver if I want to avoid draining the entire loop.


----------



## bobbavet

Gday Guys

Have just joined the club. Picked up a 2-month-old used 295x2 for $582US here in Australia.

Fury or 980ti were not floating my boat and FuryX2 is going to cost a mint, so I thought this to be a good buy.

Need some advice on min PSU.

I currently have a Silverstone Strider 750W 80+silver.

It has been a rock solid unit running my GTX690 and an 3770k OC on Mitx platform.

Cheers Bob


----------



## wermad

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Have just joined the club. Picked up a 2month used 295x2 for $582US here in Australia.
> 
> Fury or 980ti were not floating my boat and FuryX2 is going to cost a mint, so I thought this to be a good buy.
> 
> Need some advice on min PSU.
> 
> I currently have a Silverstone Strider 750W 80+silver.
> 
> It has been a rock solid unit running my GTX690 and an 3770k OC on Mitx platform.
> 
> Cheers Bob


Check the OP, I made a list for this very question. First-timers, it helps to read the first post in threads as there may be useful info. On your current psu, it does *not* meet the amd specs. You want at least 50amps (combined single rail, or 28+28 for multi rail) for the card and 20amps for the rest of your system. This psu has 60amps and may not be sufficient. You can run it if you don't max load or oc the card for now. There's a ton of great psu's out there; check the list. If you plan to push the card, I would recommend starting @ 850w w/ at least 70amps (or greater) for single rail units. For multi rail units, amd recommends 28amps per 8-pin connector. Check the specs if you're going with a multi rail unit to ensure the vga cables have enough amps. To keep it less complicated, I would recommend a single rail unit for new owners as there's no worry about rail distribution. I love the Cooler Master V series (Seasonic guts) and it comes w/ all modular cables. Highly recommend the V1000 (83 amps) or the EVGA G/P1000s (both 83amps). Welcome, and post pics btw!
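The amperage advice above converts to 12 V wattage straightforwardly. A quick back-of-envelope check, using only the figures from the post (50 A card + 20 A system, versus the 60 A Strider):

```python
# Back-of-envelope check of the 12 V rail advice above.
# Figures (50 A card, 20 A system, 60 A Strider) come from the post.

RAIL_V = 12.0  # ATX 12 V rail

def rail_watts(amps):
    """Convert a 12 V rail's amp rating to watts."""
    return amps * RAIL_V

card_amps = 50          # AMD's guidance for the 295X2
system_amps = 20        # rough budget for the rest of the system
needed = card_amps + system_amps   # 70 A combined

print(rail_watts(card_amps))   # 600.0 W for the card alone
print(rail_watts(needed))      # 840.0 W total 12 V budget
print(60 >= needed)            # False -> the 60 A Strider is short of 70 A
```

Which is why an 850 W unit with 70+ amps on the 12 V rail is the sensible floor for this card.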


----------



## bobbavet

Gday Wermad

Thanks for the quick reply. Yeh soz I didn't look @ OP. I just had this buy come up quick and literally jumped straight on here. lol

I do like my Cpu OC and haven't investigated any benefit of OC the 295x2 as yet.

Was eyeing off the Silverstone 1000 but yeah, that Coolermaster is only $10 more.

Am debating what screen to use now: a 34" WS 3440x1440 or a 40" 4K@60Hz.

I have really enjoyed my 27" and am pretty torn up on the choice. lols.

Can take pics. It's going to be rather plain affair, am over the case mod game.

Probably in a TT V21. Should have a great view of the card in the side window though.

cheers


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Have just joined the club. Picked up a 2month used 295x2 for $582US here in Australia.
> 
> Fury or 980ti were not floating my boat and FuryX2 is going to cost a mint, so I thought this to be a good buy.
> 
> Need some advice on min PSU.
> 
> I currently have a Silverstone Strider 750W 80+silver.
> 
> It has been a rock solid unit running my GTX690 and an 3770k OC on Mitx platform.
> 
> Cheers Bob
> 
> 
> 
> Check the op, i made a list for this very question. First timers, it helps to read the first post in threads as there maybe useful info. On your current psu, it does *not* meet the amd specs. You want at least 50amps (combined single rail, or 28+28 for multi rail) for the card and 20amps for the rest of your system. This psu has 60amps and may not be sufficient. You can run it if you don't max load or oc the card for now. There's a ton of great psu's out there and check the list. If you plan to push the card, I would recommend start @ 850w w/ at least 70amps (or greater) for single rail units. For multi rail units, Amd recommends 28amps per 8-pin connector. Check the specs if you're going with a multi rail unit to ensure the vga cables have enough amps. To keep it less complicated, I would recommend a single rail unit for new owners as there's no worry about rail distribution. I love the cooler master V series (seasonic guts) and it comes w/ all modular cables. Highly recommend the V1000 (83 amps) or the EVGA G/P1000s (both 83amps). Welcome and post pics btw!
Click to expand...

$250 For the 1200w Silverstone Strider Gold Evo: Link

$299 for the 1200w Cooler Master V1200 Link

$329 for the 1300w EVGA SuperNova G2 Link

Corsair are silly expensive here and so is Seasonic, so I'd rule them out. I have the Silverstone PSU I linked (5 years old now) and it can still run the 295x2 with no issues, even overclocked.


----------



## bobbavet

Gday Sgt Bilko

Yeh thats where I get all me gear from.

Eyeing of the Phillips 4k atm. lol What screen you got?

I picked up me SS 750W 3yrs ago, it was about 6mths old when I got it. Has really been a solid performer and a perfect match rails wise for the GTX 690.

I think this 295x2 should be a good upgrade. It's roughly double the performance of me 690 and $200 cheaper than the $1000 I paid for the 690 2nd hand.

I think the price of top end single GPU is out of hand, so I can't fathom what a Fury X2 is goin to cost over here.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Sgt Bilko
> 
> Yeh thats where I get all me gear from.
> 
> Eyeing of the Phillips 4k atm. lol What screen you got?


Currently running a Qnix QX2710 1440p Korean monitor at 110hz but i'm eyeing off the new 4k IPS Freesync Monitor from LG coming in soon.

Wife wants a Fury X first so that takes priority


----------



## bobbavet

Yeh I want to go bigger than my current 27". What size is that LG, is there a model number or info on it yet?

Have they sorted out freesync with crossfire yet?


----------



## Sgt Bilko

Quote:


> Originally Posted by *bobbavet*
> 
> Yeh I want to go bigger than my current 27". What size is that LG, is there a model number or info on it yet?
> 
> Have they sorted out freesync with crossfire yet?


It's the LG 27MU67, a 27" monitor, and PCCG have it for $800. That works out OK for me as I'll be going to Win 10 early (better desktop scaling), and even if it isn't great I'll run 1440p desktop and 4k in games.

Yeah, newest driver gave us Crossfire support for Freesync


----------



## bobbavet

Good news about the Freesync. But I am pretty sure 27" will be too small for me. I suppose I could try it.

I am pretty sure the 295x2 will deliver a 60fps experience @ 4k without Freesync. I only game BF4 and War Thunder.

I was looking at the Korean 4k 40" but it's hard to go past the 3 yr local warranty on them Phillips 40".

My only concern is going back to 60Hz. Though I don't think there's much choice; 4k @ 120hz is quite a while off, both with GPU's and monitors I think.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bobbavet*
> 
> Good news about the freesync. But I am pretty sure 27" will be to small for me. I suppose I could try it.
> 
> I am pretty sure the 295x2 will deliver 60fps experience @ 4k with out Freesync. I only game BF4 and War Thunder.
> 
> I was looking at the Korean 4k 40" but it's hard to go passed the 3 yr local warranty on them Phillips 40".
> 
> My only concern is going back to 60Hz.. Though I don't think there's much choice, [email protected] 120hz is quite a while off both with GPU's and Monitors I think.


I've been running my Qnix at 60hz over the past couple of weeks so I can use VSR, and the trade-off has been worth it outside of shooters for me. The extra refresh rate really helps there, but I'm mainly an RPG player so I'm happy with the extra resolution over refresh rate.


----------



## wermad

No go bypassing the extensions. RMA time for the psu.

edit: it seems like it's just shutting off w/ the gpu's plugged in. Even with one card it will cut power after 10 minutes. I can't find my dp cable, so I'm using igpu hdmi, which sucks in 4k, so I'm down to wqhd. Man, it's very weird switching from 4k to 1440. At least solitaire works. Waiting on the Lepa rma. If they're too slow, I'll be picking up a couple of decent units and sell the Lepa once it returns (probably a while).


----------



## Dagamus NM

Do you want to borrow my extra Lepa in the interim?


----------



## wermad

I'm good, I'm using a spare unit i have. There's a few more fixes I need to do so the rig will be down anyways (noisy fan, fit filters to front rads, and redo some of the wiring).

I'm more inclined to get two solid single rail units tbh, not worry about rail load (something I've advised on before), and get it done probably faster than Lepa support can get me a replacement. Was gonna pick up a used evga 1000p but was dead asleep when the auction ended ($100) after spending the entire night troubleshooting. Adding to this disarray, my router gave out, and that's not a good thing w/ kids and wifey needing wifi. I ended up on my cell and I hope Lepa got the support request form I sent them from their site.

Still, I'm not sold on the evga's due to their long length, so CM V series are on the top of my list. A V1000 for the system and 1st gpu and a V850 or V750 for the second card. I already have the jumper cable I used in my dual psu quad 7970 setup.


----------



## Medusa666

So erhm, I got another luxury problem. I went to the store and got a Noctua NH-D15, only to realize after the 1.5-hour drive home that it would hang over my main PCI-E slot. Long story short, currently I'm running my two R9 295X2s in;

16X PCI-E 3.0
and
8X PCI-E 3.0

If I install the Noctua, I got to move the cards so they run at;

8X PCI-E 3.0
and
4X PCI-E 3.0

Considering XDMA, and the fact that the 295X2 is a dual GPU card, will this significantly hurt performance?

My other option is to go back to the store, and pick up a Corsair H100i GTX.

Let me know what you think guys.

Thanks!


----------



## wermad

Which mb model? If it's the g1 gaming, put card #1 in pcie #2 and card #2 in pcie #4. It should trigger 8x for both. 295x2 will run fine @ 8x/8x. Hard ocp did their quad review on a maximus z87 8x/8x.


----------



## Medusa666

Quote:


> Originally Posted by *wermad*
> 
> Which mb model? If it's the g1 gaming, put card #1 in pcie #2 and card #2 in pcie #4. It should trigger 8x for both. 295x2 will run fine @ 8x/8x. Hard ocp did their quad review on a maximus z87 8x/8x.


It is the Gigabyte X99 Gaming 5, and the CPU is a 5820K.

Manual says that 2x PCI Express 16x slots runs at x16, PCIE1 and PCIE2 - Currently using.

Furthermore, two PCI Express x16 slots running at x8 PCIE3 and PCIE4.

Says with a small asterisk: *When an i7-5820K CPU is installed, the PCIE2 slot operates at up to x8 mode and the PCIE3 operates at up to x4 mode.

This is so confusing to me, if I read it correctly, if I use the PCIE2 and PCIE4 i will still get 16x and 8x?

Edit:; Goddamn I feel so ******ed, must be that if I use PCIE4 and PCIE2 I should get 8x / 8x?

The manual recommends installing in PCIE1 and PCIE2, but I can't do that either way, is it still going to work?

Edit#2: Just tried mounting the cards that way (PCIE2 and PCIE4), and there was absolutely 0, ZERO, zip, none spacing between them, so the top card's small VRM fan gets rekt bigtime. I guess I have to pick up a new AIO cooler, and I dislike Corsair so much, goddamnit. The other option is swapping motherboards, but that feels very radical lol


----------



## wermad

Ok, according to the gb manual:
Quote:


> 2 x PCI Express x16 slots, running at x16 (PCIE_1/PCIE_2)
> * For optimum performance, if only one PCI Express graphics card is to be installed, be sure to install it in the PCIE_1 slot; if you are installing two PCI Express graphics cards, it is recommended that you install them in the PCIE_1 and PCIE_2 slots.
> 
> ]2 x PCI Express x16 slots, running at x8 (PCIE_3/PCIE_4)
> * The PCIE_4 slot shares bandwidth with the PCIE_1 slot. When the PCIE_4 slot is populated, the PCIE_1 slot will operate at up to x8 mode.
> * When an i7-5820K CPU is installed, the PCIE_2 slot operates at up to x8 mode and the PCIE_3 operates at up to x4 mode.
> (All PCI Express x16 slots conform to PCI Express 3.0 standard.)
> 
> 3 x PCI Express x1 slots
> (The PCI Express x1 slots conform to PCI Express 2.0 standard.)
> 
> 1 x M.2 Socket 1 connector for the wireless communication module (M2_WIFI)


You wanna use #2 and #4 slot, it should work w/ 8x.
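For context on why 8x/8x is rarely a bottleneck in games, the raw link numbers work out roughly as follows (PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding; this ignores protocol overhead, so treat it as an upper-bound sketch):

```python
# Rough usable bandwidth per PCIe 3.0 slot width, before protocol overhead.

RAW_GT_S = 8.0            # PCIe 3.0 raw transfer rate per lane (GT/s)
ENCODING = 128 / 130      # 128b/130b line-encoding efficiency

def link_gb_s(lanes):
    """Approximate one-direction bandwidth in GB/s for a given lane count."""
    return lanes * RAW_GT_S * ENCODING / 8   # /8 converts bits to bytes

for lanes in (16, 8, 4):
    print(f"x{lanes}: {link_gb_s(lanes):.2f} GB/s")
# x16 ~15.75, x8 ~7.88, x4 ~3.94 GB/s
```

Even at x8 there is nearly 8 GB/s each way per slot, which matches HardOCP running quadfire on an 8x/8x board without issue.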



You can always go w/ a different cooler. I know the phanteks did fit w/ the gb Sniper 5 Z87 as long as I insulated the fan clips. The old SilverArrow fits but they're harder to find these days. There are slightly smaller versions working w/ 120mm fans to help w/ clearance. They should perform close to these guys. I think beQuiet! has one.

Other than this, just get an aio cooler for the cpu to match the two cards.


----------



## Medusa666

Quote:


> Originally Posted by *wermad*
> 
> Ok, according to the gb manual:
> You wanna use #2 and #4 slot, it should work w/ 8x.
> 
> 
> 
> You can always go w/ a different cooler. I know the phanteks did fit w/ the gb Sniper 5 Z87 as long as I insulated the fan clips. The old SilverArrow fits but they're harder to find these days. There are slightly smaller versions working w/ 120mm fans to help w/ clearance. They should perform close to these guys. I think beQuiet! has one.
> 
> Other then this, just get an aio cooler for the cpu to match the two cards.


Thanks for looking into this, it is confusing.

I put the cards in the PCIE 2 and 4 slots, but I don't think it's going to be good for them to run that way; there was 0 spacing, so it will get extremely hot. It's out of the question I believe.

So the AIO is the next logical step, what do you think of the Corsair H100i GTX? Good? Bad?


----------



## wermad

If you use slot #2 and #4, you have two slots of spacing for the top card to breathe from *but* that bottom card will be basically blocked out. That's one thing w/ the huge tower coolers is the clearances you have to keep in mind.

For your hexa, I'm sure the h100i should work fine. Corsair aio coolers are pretty good and reliable from what I hear, and I know first hand their support is awesome. The old H100 has been used since the SB hexas, so you can always look for one of those too. Even the older h100, I believe, will edge out your Noctua. The 280 aio's are pretty good too if you're open to sticking a 280 on top (ie h110i/110).


----------



## Medusa666

Quote:


> Originally Posted by *wermad*
> 
> If you use slot #2 and #4, you have two slots of spacing for the top card to breathe from *but* that bottom card will be basically blocked out. That's one thing w/ the huge tower coolers is the clearances you have to keep in mind.
> 
> For your hexa, I'm sure the h100i should work fine. Corsair aio's cooler are pretty good and reliable from what I hear. I know first hand their support is awesome. The old H100 has been used since SB hexas, so you can always look for one of these guys too. Even the older h100, I belive the corsair will edge out your Noctua. The 280 aio's are pretty good too if you're open to sticking a 280 on top (ie h110i/110).


This is what it looks like with slot 2 and 4,

Is this acceptable?

The Corsair seems to be a decent option, but I love Noctua and it really bothers me that I can't use this cooler.


----------



## wermad

#4 is the bottom most pcie. Check back with the pic I posted, the yellow box is #4 (blue box is the gap between #2 and #4). Your last pic shows #2 and #3 btw. Keep in mind #4 will interfere with your i/o headers (hence the aio is the best route).


----------



## bobbavet

It's been sometime since I have had a AMD GPU (HD6990).

What are some "must have" software and utilities for the card?

Is there 3rd party custom drivers or registry fixes?

Particularly for any xfire issues.

Really looking forward to having it up and running. Just waiting on the Skylake and Win 10 releases for the new build.


----------



## Alex132

Was testing how memory clocks affect performance, some early results:

Does anyone know the safe voltage limit for these cards? Sapphire Trixx lets you pretty much go crazy. I am at 1080mV for a stable 1200/1500 - seems safe, right?


----------



## kayan

Hey guys, long time no post. I'm having an odd issue with my 295x2, again. A few nights ago I loaded up BF4 and it ran fine, same settings as below. Today I load it up and I got 2 Direct X errors back to back on the same server. Game is still running but the whole display turned white. And I get the following error:



So, I try a different server; it loads in this time, but my screen looks like all smoke and one blocky cloud. Then the server changes maps almost immediately and it loads in fine. FRAPS reports my FPS as 30 instead of my typical 55-60 (everything maxed @ 3440x1440, resolution scale set to 150%). I lowered the resolution scale back to 100% and got the same DirectX error and the same solid white background the minute I hit apply. This all happened over the course of probably one and a half minutes.


----------



## wermad

I got a reply from Lepa support, asking for more info and pics. I'm hoping they get this approved soon to ship it off. I'll be using a backup psu for now to at least power on my rig. I'm waiting on pay day to pick up a couple of used units (my tx10 can hold up to four psus and more w/ the psu bracket).
Quote:


> Originally Posted by *bobbavet*
> 
> It's been sometime since I have had a AMD GPU (HD6990).
> 
> What are some "must have" software and utilities for the card?
> 
> Is there 3rd party custom drivers or registry fixes?
> 
> Particularly for any xfire issues.
> 
> Really looking forward to having it up and running. Just waiting on the Skylake and Win 10 releases for the new build.


Welcome to our little club! First thing first, check out the recommended psu list in the op (or in my sig) as the 295x2 has unique power requirements from amd.

MSI Afterburner and Sapphire Trixx. Most go w/ msi but I know it can cause some instability. Trixx is pretty solid too. I prefer the monitoring setup of the msi. Both have the option to turn off ulps if you're having issues with load on each core.
Quote:


> Originally Posted by *Alex132*
> 
> Was testing how memory clocks affect performance, some early results:
> 
> *snip*
> 
> Does anyone know the safe voltage limit for these cards? Sapphire Trixx lets you pretty much go crazy. I am at 1080mV for a stable 1200/1500 - seems safe, right?


You're not using the stock bios? I'm planning on oc'ing after I sort out the psu issues I'm having.
Quote:


> Originally Posted by *kayan*
> 
> Hey guys, long time no post. I'm having an odd issue with my 295x2, again. A few nights ago I loaded up BF4 and it ran fine, same settings as below. Today I load it up and I got 2 Direct X errors back to back on the same server. Game is still running but the whole display turned white. And I get the following error:
> 
> 
> 
> So, I try a different server, it loads in this time but my screen looks like all smoke and one blocky cloud. Then the server changes maps, almost immediately and it loads in fine. FRAPS is reporting my FPS as 30, instead of my typical 55-60 (everything maxed @ 3440x1440, resolution scale was set to 150%). I lowered the resolution scale back to 100% and I got the same Direct X error and same white solid background the minute I hit apply. This happened over the course of probably one and a half minutes.


It only happens w/ BF4? Try the single player campaign. You may have to install DirectX again; usually there's an installer w/in the folder of the game (origin folder). I have DirectX issues with older games that don't like the new version installed or say the version is out of date. Just launching the installer that comes with the game fixes it.


----------



## xer0h0ur

For what its worth, Afterburner is only problematic due to the OSD which in reality means that RTSS (RivaTuner) is the culprit for me at least. If I don't use the OSD at all and never have RTSS running due to Afterburner's OSD then I don't have gaming related BSODing.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> You're not using the stock bios? I'm planning on oc'ing after I sort out the psu issues I'm having.


I'm actually using the stock BIOS and just leaving Sapphire Trixx permanently open for a 24/7 OC of 1050/1500 + stock voltage.

For the benchmark's OC I used these simple settings, and it seemed way, way more stable than MSI AB:


I've basically given up on the BIOS thing, although I'd love a ~1050/1500 + stock voltage + UEFI BIOS


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> It only happens w/ BF4? Try the single player campaign. You may have to install directx again and usually there's a launcher w/in the folder of the game (origin folder). I have DirectX issues with older games that don't like the new version installed or say the version is out of date. Just launching the installer that comes with the game fixes it.


Correct, I've only seen it in BF4 thus far. I'll try again tomorrow with this fix, and report back. Thanks.


----------



## casier

Hello guys,

I have an XFX 295x2, and today I tried to connect my PC to my 4K TV (hdmi only). What a surprise: only 1080p resolution!... After hours of research, I figured out (I read it on the internet) that it might be the miniDP -> HDMI adapter supplied by XFX, which is so bad that it does not support higher resolutions... oh great, wonderful...

Can you confirm that, and if yes, can someone give me a link to a truly working DP -> HDMI cable or adapter for 4k, at a reasonable price? I don't know which seller to trust now...

Thx !


----------



## xer0h0ur

Well, you have bad news coming. None of AMD's cards use HDMI 2.0 which is the only common TV connection that would give you 60Hz @ 4K. Displayport 1.2a would also do it but I doubt your TV has that connection. I don't believe the DisplayPort to HDMI 2.0 adapters have come out yet. So really all you can do is use HDMI 1.4 for 4K @ 30Hz.


----------



## casier

No bad news there, because I already knew I cannot achieve 4k @ 60Hz with HDMI 1.4; that's a shame.

But my actual problem is that I cannot display 4k at all; it seems the resolution is stuck at 1080p with this bad adapter, so I obviously need to buy another cable/adapter?... Dunno where and which one I should buy :/

Will any mini DP -> HDMI cable be OK (which would mean this adapter really is junk), or should I be careful about what I buy? There are so many of them, and the descriptions are often poor; it's a mess to choose...


----------



## xer0h0ur

Well, since I haven't done it myself I don't want to lead you in the wrong direction. I don't know if simply any DisplayPort to HDMI adapter would work for you, or if you would specifically require an active adapter.

For the record, the reason the DVI to HDMI adapter is not giving you a 4K signal is because DVI was never meant to carry a 4K signal at all. It simply doesn't have the bandwidth for it.


----------



## wermad

Both my xfx cards came with a mini dp to hdmi 1.4 adapter. Once you plug it in using an hdmi cable, it should give you 4k @ 30hz.


----------



## xer0h0ur

Well, for what it's worth, my Diamond 295X2 came with a DVI to HDMI adapter. I assumed that was what he had and why it wasn't giving him a 4K signal. If he is using a DisplayPort to HDMI adapter already then something else is wrong. Perhaps something within the TV's settings needs to be toggled.


----------



## Dagamus NM

Yep, the dongle from the xfx should do 4k 30Hz. There is one 4k TV to my knowledge that runs at 60Hz through DP but it has problems.


----------



## rakesh27

Guys,

Sorry for the subject change.. I'm using the latest MSI Afterburner.

OK, before, I definitely knew my crossfire was working; I had a 295x2 and a 290X... in some games all 3 gpus would work, but in most it would be 2.

Anyways, I wanted to try Nvidia since I haven't in a long time, to see what it would be like.

So at present in my rig I'm running a 295x2 + 980Ti SC. Obviously not at the same time; they are both powered, and whenever I want to play a game with a particular card I plug the monitor cable (hdmi) into that card and it gets detected.

Did notice when playing games with the 295x2 that MSI would only show one gpu's usage; however, the temps/gpu/mem clocks for both GPUs will be running at full pelt.

Is it correct to assume msi does not like mixing cards from 2 different groups....

I noticed that whenever I switched to a particular card, everything was detected as normal in MSI...

For your reference, the top card is the Nvidia 980ti and the second card is the AMD 295x2. Both have the latest drivers installed; I installed AMD first, then Nvidia....

I'm assuming that if, when playing games with the 295x2, I can see the clocks and temps for both gpus and memory go up, it's running in crossfire mode. In GPU-Z it says it's detected.

I'm not having any problems when switching between games or booting up with either of the cards; all running fine and stable.

I'd be grateful for your advice and thoughts

Thanks


----------



## Alex132

Disable ULPS.


----------



## SAFX

When using a single monitor, does it matter which DP port is used on the card?

I'm currently using the far right port, but having intermittent issue with sleep/wake on Windows 7; sometimes the monitor does not "wake" up.


----------



## xer0h0ur

Naw, you shouldn't have any issue using any port whatsoever. The issue you describe is driver related, as far as I know.


----------



## SAFX

Where did you read it was driver related?


----------



## xer0h0ur

Because the monitor not wanting to wake from sleep is a notorious complaint from tons of AMD card users. It's why I disabled sleep mode altogether and only use the "turn off monitor after x amount of time" setting in Windows.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Because the monitor not wanting to wake from sleep is a notorious complaint made by tons of AMD card users. Its why I disabled sleep mode altogether and only use the turn off monitor after x amount of time setting in windows.


Yeah, I'm thinking of doing the same at this point, very annoying


----------



## xer0h0ur

Yeah it really is a PITA.


----------



## SAFX

*GTA V Issue*

...getting really bad artifacts, huge jaggies covering the sky, basically Attack of the Polygons








Disable FXAA, enable MSAA? ...or disable AA altogether?


----------



## xer0h0ur

Are you using the latest 15.7.1 driver? Or which one?


----------



## NBAasDOGG

Hey guys,

I recently bought myself a Samsung U28E590 FreeSync monitor, but somehow FreeSync does not work. The R9 295X2 and Windows 10 are supported, right (I'm on CCC 15.7.1)?
So I enabled FreeSync in the monitor's OSD and in CCC, but I cannot enable FreeSync in the Windmill demo (the option is greyed out). I tried BF4 and still get screen tearing when Vsync is off. FreeSync doesn't even work when I lock the FPS within the FreeSync range (which is 40-64 for this Samsung).
Does anyone have solutions?


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you using the latest 15.7.1 driver? Or which one?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> *GTA V Issue*
> 
> ...getting really bad artifacts, huge jaggies covering the sky, basically Attack of the Polygons
> 
> 
> 
> 
> 
> 
> 
> 
> Disable FXAA, enable MSAA? ...or disable AA altogether?


Eh, sounds like drivers funking up.

Try a DDU clean install of the latest 15.7.1 drivers. That actually fixed a bunch of errors for me, as well as making sure to never touch MSI AB with a 10ft pole again


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> Eh, sounds like drivers funking up.
> 
> Try a DDU clean install of the latest 15.7.1 drivers. That actually fixed a bunch of errors for me, as well as making sure to never touch MSI AB with a 10ft pole again


Did not know there was a newer driver, thanks for that.
And regarding MSI, I switched to Trixx last month


----------



## wermad

Anyone on win10? I have the upgrade ready but I don't wanna pull the trigger yet.


----------



## Dagamus NM

Yep. I like it so far, but I haven't stuck my 'X2s back in yet. So far I have it on my RVE with quad 780 Tis and on the RIVE with a single air-cooled 980 Ti, where the 'X2s live. I will put it on my quad 6GB 280X build shortly. I literally just fired it up for the first time five minutes ago and will see how many Intel 730s it lets me RAID 0. I have six connected right now; hopefully the chipset supports it, as I couldn't really find anything out one way or another.

But yes, windows 10 is going to be a winner. I disliked 8's UI for a desktop, 10 is very much like 7 but better features and functionality and man is it quick.

Also, you can upgrade at any time. Just use the tool to create media for original installs and pick the download option.


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> Anyone on win10? I have the upgrade ready but I don't wanna pull the trigger yet.


I've got the ISO and key, just waiting for others to experience problems I'd rather avoid


----------



## wermad

Quote:


> Originally Posted by *SAFX*
> 
> I've got the ISO and key, just waiting for others to experience problems I'd rather avoid


Same here. I'm in no rush since I'm waiting for an RMA.
Quote:


> Originally Posted by *Dagamus NM*
> 
> Yep. I like it so far but I haven't stuck my 'X2s back in yet so far I have it on my RVE with quad 780Tis and the RIVE with a single air cooled 980Ti where the 'X2s live. I will put it on my quad 6GB 280X build shortly. I literally just fired it up for the first time five minutes ago and will see how many Intel 730s it lets me raid 0. I have six connected right now, hopefully the chipset supports it as I couldn't really find anything out one way or another.
> 
> But yes, windows 10 is going to be a winner. I disliked 8's UI for a desktop, 10 is very much like 7 but better features and functionality and man is it quick.
> 
> Also, you can upgrade at any time. Just use the tool to create media for original installs and pick the download option.


How's the whole spyware junk? Can it be completely disabled or removed?


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> ....never touch MSI AB with a 10ft pole again


what are you using for statistics and graphing? MSI was great for that


----------



## wermad

Asus GPU Tweak has a graph. Don't recall if Trixx has it. Has anyone gotten EVGA Precision to work? Heard somewhere it does read AMD cards.


----------



## SAFX

Anyone notice lower performance on 15.7.1? My numbers are 3.4% and 2.4% lower for stock/OC, respectively, compared to 15.7 drivers:


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> ....never touch MSI AB with a 10ft pole again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what are you using for statistics and graphing? MSI was great for that

Actually just FRAPS for FPS / benchmarking. For on-screen, nothing.


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> Anyone notice lower performance on 15.7.1? My numbers are 3.4% and 2.4% lower for stock/OC, respectively, compared to 15.7 drivers:


I have to benchmark 15.7.1 to give you a comparison between 15.6, 15.7 and 15.7.1.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> How's the whole spyware junk? Can it be completely disabled or removed?


Overhyped. The only thing I had to do was disable the auto updates to get nvidia to play nice. I have yet to test interaction with an amd card or CPU. For devices whose win 10 drivers are not released I downloaded a mix of 7 and 8.1 drivers. Everything has worked fine. In fact for the first time ever, when installing a Logitech game pad it just worked without having to download Xbox controller drivers and update through device manager. It actually worked out of the box. Amazing.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have to benchmark 15.7.1 to give you a comparison between 15.6, 15.7 and 15.7.1.


I wanna see this as well. If you prove that the 15.7.1 are "bad" drivers, then I might try using 15.7 again...

BTW guys, I remember some of you had tri-fire: 295X2 + 290X. Is it worth it? Any problems you encountered? What is your recommendation for PSU wattage? I have a 1200W Super Flower Leadex Platinum, but I'm not sure if it's enough...
I would be water-cooling.


----------



## xer0h0ur

Well I play a lot of CS:GO and this driver includes an enhancement for this game that reduces the FlipQueueSize so even if it performs worse than the 15.7 I am still keeping it. I should finally have a chance to run the various FireStrike tests tonight when I get out of work. As for running 295X2 + 290X with your power supply, I don't see any problem. I barely pull a shade over 1000W and I haven't even spiked once past 1060W. Be it any of the FS benches, gaming, AIDA64 Extreme stress test and that is overclocked too. 1090/1500MHz 295X2 (+50 power limit, +100mV), 1150/1500MHz 290X (+50 power limit, +100mV), 4.2GHz 4930K.
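As a rough sanity check on fat4l's 1200W question, here is a sketch using the spike figure quoted above. Assumptions are mine, not xer0h0ur's: I treat the 1060W spike as a wall-plug reading and assume roughly 92% efficiency for a Platinum-class unit at that load, so the DC-side load the PSU actually supplies is lower than the wall number.

```python
# Rough PSU headroom estimate for a 1200 W unit feeding a 295X2 + 290X tri-fire.
# Assumed: 1060 W measured at the wall; ~92% Platinum-class efficiency at this
# load. Both are estimates for illustration, not measured specs.
psu_rating_w = 1200
wall_spike_w = 1060
efficiency = 0.92

dc_load_w = wall_spike_w * efficiency   # what the PSU actually delivers
headroom_w = psu_rating_w - dc_load_w

print(round(dc_load_w))    # 975
print(round(headroom_w))   # 225
```

Under those assumptions, even the worst observed spike leaves a couple hundred watts of DC-side margin on a 1200W unit, which lines up with "I don't see any problem."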


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I play a lot of CS:GO and this driver includes an enhancement for this game that reduces the FlipQueueSize so even if it performs worse than the 15.7 I am still keeping it. I should finally have a chance to run the various FireStrike tests tonight when I get out of work. As for running 295X2 + 290X with your power supply, I don't see any problem. I barely pull a shade over 1000W and I haven't even spiked once past 1060W. Be it any of the FS benches, gaming, AIDA64 Extreme stress test and that is overclocked too. 1090/1500MHz 295X2 (+50 power limit, +100mV), 1150/1500MHz 290X (+50 power limit, +100mV), 4.2GHz 4930K.


Good to hear that! Thanks.

If I remember correctly, there's a problem with installing a 295X2 + 290X with Aquacomputer blocks + active backplate. Does the backplate make it harder to install the cards on the motherboard, or something else?


----------



## xer0h0ur

I used EK blocks and backplates so I can't comment on that.


----------



## MIGhunter

Quote:


> Originally Posted by *fat4l*
> 
> Good to hear that ! Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I remember correctly, there's a problem with installing 295X2 + 290X with Aquacomputer blocks + ACTIVE BACKPLATE. Is the backplate making it worse/harder to install the cards on the motherboard or ?


Are you getting any benefit from a 295x and a 290x or is it just for an extra monitor?


----------



## xer0h0ur

I am guessing you aren't familiar with how multi-monitor setups work on AMD's cards. You can only use the monitor connections of a single card when crossfired. If you disable crossfire you can of course use every monitor connection available, provided the card supports using all of them. Obviously adding more cards in crossfire does give a performance bump when it's operating properly in games. 2 GPUs crossfired give godly scaling, but it drops off after that, and imo the 4th GPU doesn't add enough to warrant the money, since scaling at that point is bad. Scaling on the 3rd GPU is halfway decent but still not worth the money for most people. I got my 290X cheap, so I said screw it, why not.


----------



## MIGhunter

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am guessing you aren't familiar with how multi-monitor setups work on AMD's cards. You can only use the monitor connections of a single card when crossfired. If you disable crossfire you can of course use every monitor connection available provided the card supports using all of them. Obviously adding more cards in crossfire does give a performance bump when its operating properly in games. 2 GPUs crossfired give godly scaling but it drops off after that and imo the 4th GPU doesn't add enough to warrant the money since scaling at that point is bad. Scaling on the 3rd GPU is halfway decent but still not worth the money for most people. I got my 290X cheap so I said screw it, why not.


I'm building a new computer and moving my current one into my son's room. So until I finish it, I have a 290x and a 290x. My 290x is in the box in the closet right now ....


----------



## xer0h0ur

Overclocking-wise, dual 290Xs will nearly always outperform a 295X2, because each card on an individual basis has access to more power than a 295X2 can draw. Really, the only advantages the 295X2 has are reduced power draw for dual GPUs and using a single PCI-E slot. In my case, for instance, if I wanted tri-fire I had to go this route, since I only have two 16X PCI-E 3.0 slots.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Overclocking-wise, dual 290X's will nearly always out-perform a 295X2 because each card on an individual basis has access to more power than a 295X2 can draw. Really, the only advantage you have with the 295X2 is reduced power draw for dual GPUs and its a single PCI-E slot used. In my case for instance if I wanted tri-fire I had to go this route since I only have two 16X PCI-E 3.0 slots.


Another advantage is the DisplayPorts. They come in handy with Eyefinity, especially WQHD Eyefinity and 5x1/3x2 Eyefinity. 1080/1200 Eyefinity runs fine off the 290/290x plugs.


----------



## casier

After a week of testing, I still don't have any clue about several issues I'm experiencing with my XFX 295x2. It's quite strange, more or less random, and I never managed to figure out what's going on, why, or what to conclude. Here are some facts; I hope you guys will have a more precise idea than me.

I have some laggy issues with GTA 5 (the only big, demanding game I actually have on my machine). Basically, the game runs almost perfectly, except for two issues; I don't know if they are linked or not:

- The first issue is visible very often: even when the game is pretty smooth @ 60 fps, say every 3-4 seconds I can see some frame jumps; I don't know what to call them. This is not tearing, because vsync is on and also because it does not look like a cut image. I don't think it is CPU stuttering, because I have a 5960X @ 4.4 GHz, because overclocking the processor didn't change anything, and also because it does not look like a freeze of the whole image. It happens most of the time at the bottom of the screen when moving: it's like a part of the screen (but not all of it!) was briefly "jumping", a mix between tearing and freezing; I really don't know how to describe it. Imagine something like a graphic glitch (but I don't think it is), most of the time at the bottom of the screen only. Some rare times I don't see this issue at all; I don't know when or why... Note that the game is still running at 60 fps when it happens, and not the whole frame has the issue, only a part of it...

- The second issue is even stranger: it happens almost every time (but not always!) I start my computer and launch the game. When the game begins, it just incredibly LAGS; when I say lags, I mean I get like 5 fps! It is more or less critical (5 fps, 30 fps, 40 fps) depending on the session. The lag does not really look like classical lag; it's really strange and changes from session to session (I know, this is ridiculous...): sometimes it seems like classical micro-lag, and sometimes it's kind of like the frame was very briefly going "backward", rewinding to the previous frame (the whole frame, here). Sometimes it's just a HUGE lag of 2 seconds, every XX seconds. Stranger still: after 2-10 minutes of playing, the issue vanishes and the game stays at 60 fps until the end of the session... Also, sometimes nothing happens at all and everything is OK (except the first issue) from the very beginning of the game... When it happens, all GPUs are at 100% and the CPU is at its normal utilization (20%); this does not change from the good sessions. I NEVER managed to reproduce it reliably; it seems ridiculously random. It's just like the game had to warm up, to load, and when it's done, it stops; that's the only thing I'm pretty sure of. To finish: I experience the same thing in 3DMark Fire Strike (normal mode). Sometimes it runs well or almost well (16800 points, which seems low for this card, doesn't it?...), and sometimes it runs awfully, lags and so on, and I get 10000 points or lower. Here again, it depends on the run.

After bunches of restarts and monitoring with MSI AB, I don't have many clues. I can say that it does not seem linked to the temperature of the GPU or CPU (because sometimes it continues even after the GPUs reach their stabilized temperature). The resolution has some effect (both issues seem more critical at high resolutions than low), but the problems happen even at 800x600 (!). It's probably not the PSU either, because I have a Platinum Corsair 860W and over the whole week I measured a maximum of 680W (moreover, the game runs perfectly once the issue vanishes). I tested with only 8 GB of RAM instead of the 32 GB: same thing. I installed Windows 10 instead of Windows 7, full of hope this would solve the issue: same thing. I also tried underclocking and/or undervolting the card, to see if it could be the memory or the PSU: same thing.

I'm pretty frustrated because I don't understand what's going on. If you guys have encountered these issues, I'd appreciate any clue!

Thanks.

PS: using the latest Catalyst drivers, X99 Deluxe MB with 32GB DDR4.


----------



## xer0h0ur

I think you're overlooking the power supply without being properly informed. The 295X2's wattage requirements aren't bad; what it does have are high amperage requirements. It needs 50A, with each 8-pin cable getting 30A apiece. I have a feeling that PSU doesn't meet those needs. Have you verified that you're running the card @ 16X PCI-E 3.0? Use the tool in GPU-Z to verify this. Have you been checking Afterburner's logging to see if you're getting GPU clock throttling? Did you disable ULPS through Afterburner?


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> Anyone notice lower performance on 15.7.1? My numbers are 3.4% and 2.4% lower for stock/OC, respectively, compared to 15.7 drivers:


15.6 FireStrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
15.7 FireStrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score
15.7.1 FireStrike Ultra: http://www.3dmark.com/fs/5597532 7643 overall score 8209 graphics score
15.7 FireStrike Extreme: http://www.3dmark.com/fs/5358452 13290 overall score 16370 graphics score
15.7.1 FireStrike Extreme: http://www.3dmark.com/fs/5597674 13264 overall score 16389 graphics score
15.7 FireStrike: http://www.3dmark.com/fs/5358521 22159 overall score 34095 graphics score
15.7.1 FireStrike: http://www.3dmark.com/fs/5597577 21869 overall score 34325 graphics score

4930K @ 4.2GHz, 295X2 1090/1500MHz +50 power limit +100mV, 290X 1150/1500MHz +50 power limit +100mV
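For anyone wanting to turn score lists like this into percentages, a small illustrative helper (the two calls below use the 15.7 vs 15.7.1 plain FireStrike numbers from the list above; the function itself is just a sketch, not part of any benchmarking tool):

```python
# Percent change between two benchmark scores: negative means a regression.
def pct_change(old, new):
    return (new - old) / old * 100

# 15.7 -> 15.7.1 plain FireStrike, from the list above:
print(round(pct_change(22159, 21869), 2))  # overall score: -1.31
print(round(pct_change(34095, 34325), 2))  # graphics score: 0.67
```

So on this run, the overall score dipped a bit while the graphics score actually improved slightly, which is smaller than the 3.4%/2.4% drops SAFX reported.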


----------



## casier

Quote:


> Originally Posted by *xer0h0ur*
> 
> I think you're overlooking the power supply without being properly informed. The 295X2 doesn't have bad wattage requirements. What it does have are high amperage requirements. It needs 50A and each 8-pin cable getting 30A a piece. I have a feeling that PSU doesn't meet the needs.


Well, the thing is that even when downclocking and undervolting the 295x2, nothing changes. Also, how could we explain that I have the second issue only at the beginning of the game, and that once it vanishes, it works perfectly for the rest of the session? That would be strange. And this PSU is rather good: it is a Platinum unit able to deliver 71.6A @ 12V.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you verified that you're running the card @ 16X PCI-E 3.0? Use the tool in GPU-Z to verify this.


Yes, GPU-Z says it is 3.0.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you been checking Afterburner's logging to see if you're getting GPU clock throttling?


Yes I always have MSI AB launched, the frequencies are stable @ stock values.
Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you disable ULPS through Afterburner?


No, I didn't; now I have. I'll see after a few tests if it changes anything, but in this first test I had issue 1 (I didn't have issue 2; but, you know, the PC was "hot" since I played games tonight, and who knows if that changes anything, everything is possible here...)


----------



## Dagamus NM

MSI AB is sometimes the cause of problems with these cards, sometimes not. You will hear varying statements on this.


----------



## xer0h0ur

Quote:


> Originally Posted by *casier*
> 
> Well, the thing is that even when downlocking and downvoltaging the 295x2, this does not change anything. Also, how could we explain that I have the second issue only at the beginning of the game, and that when it vaniches, it works perfectly all the time then ? This would be strange. And this PSU is rather good, it is a platinium and it is able to deliver 71.6A @ 12V.
> Yes, GPU-Z says it is 3.0.
> Yes I always have MSI AB launched, the frequencies are stable @ stock values.
> No I didn't ; now I did ; I'll see if after few tests it changes anything, but from this first test, I had issue 1 (I didn't have issue 2, but, you know, the PC was "hot" (played games tonight), who know if this change anything, everything is possible here....)


You probably want to list your full specs in your signature or even better set it up in the rigbuilder link in the upper right hand corner. What is the model of the PSU?


----------



## wermad

He mentioned a Corsair Platinum 860, so I assume it's the AX860i: 72A single rail. Though with these controllable PSUs, I'm not sure how that affects the power output.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> 5.6 FireStrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
> 15.7 FireStrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score
> 15.7.1 FireStrike Ultra: http://www.3dmark.com/fs/5597532 7643 overall score 8209 graphics score
> 15.7 FireStrike Extreme: http://www.3dmark.com/fs/5358452 13290 overall score 16370 graphics score
> 15.7.1 FireStrike Extreme: http://www.3dmark.com/fs/5597674 13264 overall score 16389 graphics score
> 15.7 FireStrike: http://www.3dmark.com/fs/5358521 22159 overall score 34095 graphics score
> 15.7.1 FireStrike: http://www.3dmark.com/fs/5597577 21869 overall score 34325 graphics score
> 
> 4930K @ 4.2GHz, 295X2 1090/1500MHz +50 power limit +100mV, 290X 1150/1500MHz +50 power limit +100mV


That looks good!

BTW, the reason I was asking about tri-fire is that I now have a new monitor, a 144Hz 27" FreeSync one. I like my games running at 100+ fps.

That's all, hehe...


----------



## xer0h0ur

Basically, with Hawaii tri-fire, as long as it's working properly driver-side, you will maintain a high framerate @ 1440p or 1080p, assuming the game in question isn't running hellacious tessellation or particle effects.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> 5.6 FireStrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
> 15.7 FireStrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score
> 15.7.1 FireStrike Ultra: http://www.3dmark.com/fs/5597532 7643 overall score 8209 graphics score
> 15.7 FireStrike Extreme: http://www.3dmark.com/fs/5358452 13290 overall score 16370 graphics score
> 15.7.1 FireStrike Extreme: http://www.3dmark.com/fs/5597674 13264 overall score 16389 graphics score
> 15.7 FireStrike: http://www.3dmark.com/fs/5358521 22159 overall score 34095 graphics score
> 15.7.1 FireStrike: http://www.3dmark.com/fs/5597577 21869 overall score 34325 graphics score
> 
> 4930K @ 4.2GHz, 295X2 1090/1500MHz +50 power limit +100mV, 290X 1150/1500MHz +50 power limit +100mV


No pretty graph?

I'm testing again; I probably left Notepad open during my last run, it's a major resource hog


----------



## xer0h0ur




----------



## MIGhunter

Dumb question, my 290x has a switch for quiet mode and aggressive mode for gaming. Does the 295x2 have something similar?


----------



## wermad

It doesn't. The 295x2 has a lower thermal limit (75C), but it's water cooled. It just has a dual BIOS.


----------



## mojobear

Hey guys,

Not many people have 4-way crossfire 290(X)s, so I was hoping to get some input here. For those with quadfire 295x2s, how is Witcher 3 performing when you enable all your GPUs? Mine stutters like a mess (plus I'm in Eyefinity). 2-way crossfire works... so it seems 4-GPU crossfire is broken for this game.

Anyone else can share their experiences?

Thanks!


----------



## wermad

Lepa is a bit slow, but things are moving along. I just submitted the RMA request after a few emails with Lepa support about the issue and the troubleshooting I've done. I'm hoping I can get this shipped out quickly, but I'm still very open to getting two solid single-rail units instead. I need to measure how much space I have in my TX10 for a second, shorter unit.

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Not many people with 4 way crossfire 290(X) so I was hoping to get some input here. For those with quadfire 295x2, how is witcher 3 performing when you enable all your GPUs? Mine stutters like a mess (plus Im in eyefinity). 2 way crossfire works...so it seems 4 GPU crossfire is broken for this game.
> 
> Anyone else can share their experiences?
> 
> Thanks!


I don't actually play this game, but I hear a lot of complaints about it. Try 2-way or 3-way xfire and see how that fares. For us quad-295x2 guys, we only have the luxury of 2-way or 4-way xfire, sadly. Try the basic troubleshooting: roll back drivers (or update if you're on an older set), monitor your usage to see the load on each core, make sure you disable ULPS as that can hamper GPU load, etc. There are a few members around here who can help further, as I know they have experience with this game.


----------



## bobbavet

Gday Guys

Been offered an Enermax Revolution ERV1250EGT 80+ Silver for $50 that's had some use. The owner changed it out because he wanted fully modular.

Not on the list. Any thoughts?

Product page

Cheers Bob


----------



## Insan1tyOne

Hello Everyone,

Just wanted to drop in here and say that I just picked up a Diamond R9 295X2 for $500 shipped for my new mITX rig that I am building. So I will be joining the club soon! I am pretty excited about the amazing deal that I got on it.

- Insan1tyOne


----------



## wermad

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Been offered a Enermax Revolution ERV1250EGT 80+ Silver for $50 that's had some use. Changed out cause he wanted full modular.
> 
> Not on the list. Any thoughts?
> 
> Product page
> 
> Cheers Bob


The reason I left this one out is that it doesn't meet a Bronze rating, and even though this rating can sometimes be over-hyped, to me it's a really dated design if it can't meet at least Bronze. Enermax has a ton of models (past and present), so I started with the Bronze units and went from there. The rail specs are good; just make sure you connect each plug to a separate rail.
Quote:


> Model Number: ERV1250EGT-00
> AC Input Voltage 220-240VAC, 50-60Hz, Active PFC
> AC Input Current 7.5 - 6A
> DC
> output Rated 3.3V 5V 12V1 12V2 12V3 12V4 12V5 12V6 -12V 5Vsb
> 0-25A 0-25A 0-30A 0-30A 0-30A 0-30A 0-30A 0-30A 0-0.6A 0-5A
> Combined 170W 1248W (104A) 7.2W 25W
> Total Power 1250W
> Peak Power 1500W


http://www.enermax.com/home.php?fn=eng/product_a1_1_2&lv0=1&lv1=54&no=6

Btw, I don't think this one is Silver rated, as I couldn't find that rating (or percentage) on their site or via Google. It's cheap, but as the old saying goes: if you're gonna put quality parts into your rig, don't skimp on the PSU. Personally, for the 295x2, I wouldn't run it. If I still had my tri 290s, sure.

An extra $50 would get you a nice preowned unit. I've seen a ton of EVGA Gold/Platinum 1000W units and V1000s sell for $100 on eBay; just make sure you get the receipt from the previous owner for warranty purposes. I picked up a G1600-MA a few months ago for $150 on eBay. It needs an RMA, but the unit has been solid so far. Though I am open to getting dual single-rail units. Enermax/Lepa are known to have a ton of rails (much like Antec), and it can get a bit confusing when connecting the 295x2.

edit: the Evo (which I had with tri 580 3GBs) is Bronze, but I can't find a 1250W Silver or Gold unit, so for sure this one is just 85+.
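For what it's worth, the 12V side of the rail table in the spec quote above is internally consistent; a quick arithmetic check using only the numbers from that quote:

```python
# Check the quoted Enermax ERV1250EGT 12 V figures: six 0-30 A rails,
# with a combined 12 V limit of 1248 W that the label states as 104 A.
combined_w = 1248
combined_a = combined_w / 12      # watts / volts
sum_rail_a = 6 * 30               # sum of the individual rail ceilings

print(combined_a)                 # 104.0, matching the label
print(sum_rail_a)                 # 180
```

The individual rail limits sum to well past the 104A combined cap, which is exactly why spreading the 295x2's two 8-pin plugs across separate rails matters on a multi-rail unit like this.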
Quote:


> Originally Posted by *Insan1tyOne*
> 
> Hello Everyone,
> 
> Just wanted to drop in here and say that I just picked up a Diamond R9 295X2 for $500 shipped for my new mITX rig that I am building. So I will be joining the club soon! I am pretty excited about the amazing deal that I got on it.
> 
> - Insan1tyOne


Welcome!


----------



## bobbavet

Quote:


> Originally Posted by *wermad*
> 
> Reason i left this one out is not doesn't meet bronze rating, and even though sometimes this rating can be over-hyped, to me, its really a dated design if it can't meet at least bronze. Enermax has a ton of models (past and present), so I really started with the bronze units and went from there. The rails specs are good, just make sure you connect each plug to a separate rail.
> http://www.enermax.com/home.php?fn=eng/product_a1_1_2&lv0=1&lv1=54&no=6
> 
> Btw, I don't think this one is silver rated as I couldn't find that rating (or percentage) on their site or google's.Its cheap, but as the old saying goes, if you're gonna put quality parts into your rig, don't skimp out on the psu. Personally, for the 295x2, I wouldn't run it. If I still had my tri 290s, sure
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An extra $50 would get you a nice preowned unit. I've seen a ton of EVGA Gold/platinum 1000w and V1000s sell for $100 on ebay. Just make sure you receipt for warranty purposes from the previous owner. I picked up a G1600-MA a few months ago for $150 on ebay. Needs rma but the unit has been solid so far. Though I am open to getting dual single rail units. Enermax/Lepa are known to have a ton of rails (much like Antec) and it can get a bit confusing when connecting the 295x2


Thanks for ya thoughts Wermad.

Considering I'm out of work and this is about $36 USD, I thought I might consider it.

I could buy new or used when my tax cheque arrives, though money saved is good.

Will keep looking around for a good buy.

Been looking @ waterblocks. We have the EK's but no back plates in Australia. Looking to get a Koolance sent over from the states. About $100 aud cheaper.

Cheers

Bob


----------



## wermad

Quote:


> Originally Posted by *bobbavet*
> 
> Thanks for ya thoughts Wermad.
> 
> Considering as I am outa work and this is about $36 in USD, I thought I may consider it.
> 
> I could buy new or used, when I have a tax cheque arrive. Though money saved is good.
> 
> Will keep looking around for a good buy.
> 
> Been looking @ waterblocks. We have the EK's but no back plates in Australia. Looking to get a Koolance sent over from the states. About $100 aud cheaper.
> 
> Cheers
> 
> Bob


Check the oz forums/sites, you'll find lots of stuff there. Its does have enough wattage and amps, its just not as efficient as the bronze ->titanium units. It'll do the job, but seeing how old it is, your 295x2 will rid it hard once you have both cores running full on. I think that's what happened to my lepa, it was used pretty good (ex miner) and was really dusty when i got it. Eventually, it started sputtering. Good thing I have the warranty to cover it (have six months left). It probably didn't have long to live while pushed pretty good.

Just to warn you, a custom loop is very expensive, even if its preowned. There's just so many components that you need; fans, rads, fittings, tube, blocks, etc. Honestly, I would ride the stock and great cooler for now and get a better psu with the saved money. I wouldn't skimp on major components like the psu for the sake of an optional custom wc setup. But, hey that's just me. A lot of guys here still run the stock cooler and can manage below the thermal threshold.

Btw, if you mix blocks with different backplates, you're going to have to find some different screws. I've run into this issue with EK and HK blocks using EK and EVGA backplates. You may find them locally or online.


----------



## bobbavet

Yeah, I am eyeing everywhere in Oz.

I already have a custom loop. Just need a block for the 295. I have a 280 for CPU and a 200 for GPU.

I am not looking to mix up back plates.

The Koolance comes with a full front plate and a partial backplate. Better than leaving it open, and it's cheaper than EK.


----------



## wermad

I'm using the stock backplates with the koolance blocks:


----------



## bobbavet

Quote:


> Originally Posted by *wermad*
> 
> I'm using the stock backplates with the koolance blocks:


Did you have any trouble with the screw types for that?


----------



## Dagamus NM

Yep, stock back plates fit. Just look up the install instructions for the ek backplate for the screw size. If you can't find it let me know and I'll take a picture of mine for you.


----------



## wermad

Quote:


> Originally Posted by *bobbavet*
> 
> Did you have any trouble with the screw types for that?


Nope, koolance took this into account so the screws fit perfectly with the stock backplate.


----------



## bobbavet

Well that is awesome to know. Cheers


----------



## wermad

The EK backplate is different. It's milled, not stamped, so more than likely the Koolance screws may not all reach, or the backplate may conflict. Just a heads up.


----------



## Dagamus NM

The ek screws fit the stock backplate. They are hex head but I have stock backplates on ek blocks. My ek backplates are in but I haven't taken the time to install them yet.


----------



## bobbavet

OK Wermad, so the Koolance *doesn't* come with a back plate. You have to use the stock backplate.

I can't find a video or tutorial on the install anywhere.


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> The ek screws fit the stock backplate. They are hex head but I have stock backplates on ek blocks. My ek backplates are in but I haven't taken the time to install them yet.


He mentioned he has an EK backplate that he wants to use with a Koolance block, and that's where the concern comes in.

edit: according to the EK backplate instructions, the backplate comes with new screws to use (with the EK block). So I'm sure the shorter Koolance screws won't fit. Also, the instructions show the outside screws are only used to secure the backplate, so there's a big chance the EK backplate may interfere with some of the Koolance block screws that are left on. It's a pita, I know first-hand, but once you have the backplate and block, it just takes a bit of time to figure out what you need to make them work.

https://shop.ekwb.com/EK-IM/EK-IM-3831109869093.pdf

Quote:


> Originally Posted by *bobbavet*
> 
> OK Wermad, so the Koolance *doesn't* come with a back plate. You have to use the stock backplate.
> 
> I can't find a video or tutorial on the install any where.


No, iirc, for all 295x2 blocks the backplate is either sold separately or the block works with the stock (AMD/AIB) backplate.

https://koolance.com/files/products/manuals/manual_vid-ar295x2_d100eng.pdf

http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2

edit: stock backplate:



From what I've heard, not all Vesuvius cards come with it. Both my XFX cards did.

Btw, make sure to carefully remove the "warranty" stickers off the screws if you have any. It will help when you need warranty service, and you can put them back on afterwards. I used a blade to take care of mine; it takes a lot of patience and care with a blade, but it can be done. I saved them with the stock AIO coolers for now.


----------



## bobbavet

Thanks Wermad. Yeah, I saw the instructions, but they didn't mention the original backplate.

The re-install picture they showed didn't have stickers on it, so I thought it was one shipped with the block.


----------



## wermad

I'll see if I can dig out the included paper instructions. The Koolance PDF manual online sometimes doesn't match the instructions included with the block. EK does a much better job with instructions, tbh, but every block maker runs things a bit differently.


----------



## Dagamus NM

If you have the ek backplate do you have the screws that came with it? Did it come in the original box? EK puts the screws in a separate small compartment at the bottom of the box if you have it.

The EK screws are tapered and a bit longer to accommodate the backplate. Unless Koolance is using a different screw diameter or thread pitch (imperial vs. metric), the EK screws should mate up.

I never realized you could get quadfire on the 1150-series CPUs. I assume Asus stuck a PLX chip on the board, just like we have on our 295X2s.

TW3 has sounded buggy, so I have stayed away. Hopefully by the time it goes on sale on Steam the bugs will be fixed and it will run well on machines with AMD cards.


----------



## xer0h0ur

They have literally patched hundreds of bugs in that game and aren't done. I am growing tired of developers releasing unfinished games.


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> If you have the ek backplate do you have the screws that came with it? Did it come in the original box? EK puts the screws in a separate small compartment at the bottom of the box if you have it.
> 
> The EK screws are tapered and a bit longer to accommodate the backplate. Unless Koolance is using a different screw diameter or thread pitch (imperial vs. metric) then the ek screws should mate up.
> 
> I never realized you could get quad fire on the 1150 series CPUs. I assume that Asus stuck a PLX chip on the board just like we has on our 295X2s.
> 
> TW3 has sounded buggy so I have stayed away. Hopefully by the time it is on sale on steam the bugs will be fixed and it will run on machines with AMD cards.


The screws are flush for the stock backplate, and you get some pan heads for the cores that don't attach to the backplate. I think they're M2.5 vs. the M3 EK uses. If you have both the EK backplate and the Koolance block, you'll know.

The HardOCP review showed you can run each card @ 8x 3.0 (regardless of socket). They all have an onboard PLX chip, which gives each card the same lanes as the slot instead of splitting them like older dual-GPU cards.

edit: backplate shot with koolance blocks:


----------



## Dagamus NM

I think we are missing each other here.

Yes you can use a machine type screw with a pan head or you can get screws that are flush to whichever back plate you have but you must match the thread size and pitch to the block you are using. Flush screws will be shorter. At these sizes, thread pitches, and lengths you should be able to source them readily at a local shop.


----------



## wermad

My first post said: just be prepared for different screws if he plans to use the koolance block with the ek backplate.


----------



## Dagamus NM

+1 to that


----------



## SAFX

Quote:


> Originally Posted by *MIGhunter*
> 
> Dumb question, my 290x has a switch for quiet mode and aggressive mode for gaming. Does the 295x2 have something similar?


No, it's just always aggressive


----------



## mortenv

Not having read all 743 pages: how would you cool this card if it reaches 75C on GPU2 with 23C ambient at stock voltage and power limit? I have two Noctua AF140 intake fans and one Noctua AF140 blowing directly in front of the card, then a Corsair SP120 and the stock fan in push/pull on the stock radiator. Cleaned out the VRM heatsink with canned air, etc. Case airflow is good as far as I know in the Corsair 750D. It runs cool when I apply -40mV in Afterburner, but I was aiming for a slight GPU OC.

Specs: 4790K overclocked to 4.8GHz all cores, 1.3 Vcore.
Corsair H100i AIO mounted on top.
Corsair 750D.
XFX 295x2 with radiator mounted in the back, with SP120 and stock fan in push/pull.
Noctua AF140 intake x3; modded case to properly allow 140mm airflow.
MSI MPower Max AC, RM850.
Corsair Vengeance 32GB 2400MHz.


----------



## wermad

Try mounting the radiator in the front. What's the first core holding at? It may be a bad mount on the second core. Did you remove the stock cooler for any reason?


----------



## mortenv

Quote:


> Originally Posted by *wermad*
> 
> Try mounting the radiator in the front. What's the first core holding at? may have a bad mount on the second core. Did you remove the stock cooler for any reason?


The first core is always around 68C, with the second core at 73-75C. I have not removed the stock cooler, though I have removed the shroud, only for dusting with canned air. I have not tried mounting the radiator in the front, because the radiator dumps A LOT of heat, and also my Corsair 750D no longer has a front 120mm rad mount thanks to the Dremel tool.

At stock voltage, power, core and mem clocks, it barely does 1:30 minutes of AIDA64 with intake at 100% before the first signs of throttling occur (a small deviation in the core #2 clock).

At 1030MHz core, 1550MHz mem, -40mV and stock power limit, it is stable with no throttling for 10 minutes of AIDA64.
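That "small deviation in core #2 clock" is the usual tell-tale of thermal throttling. A minimal sketch of how you might scan a logged clock trace for those dips (the sample numbers and the 1018MHz stock-boost target are only illustrative; real traces come from GPU-Z or Afterburner log exports):

```python
# Spot throttle dips in a logged core-clock series (MHz).
# 1018 MHz is the 295X2's stock boost clock; the sample data is made up.
def throttle_events(clocks_mhz, target=1018, tolerance=10):
    """Return indices where the clock fell more than `tolerance` MHz below target."""
    return [i for i, c in enumerate(clocks_mhz) if c < target - tolerance]

samples = [1018, 1018, 1017, 998, 1018, 975, 1018]
print(throttle_events(samples))  # [3, 5]
```

A steady trickle of such indices under load, rather than a one-off blip, is the pattern worth chasing down.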


----------



## wermad

Maybe try a remount with fresh TIM. Use something that has a fast cure time (i.e. not AS5). Also, check that the second GPU's pump is still working. For now, open the door to keep it cooler. My testing had both cards at or slightly under 70C with the door off and the rads and stock fans hanging outside.


----------



## mortenv

Quote:


> Originally Posted by *wermad*
> 
> Maybe try a remount with fresh tim. Use something that has a fast cure time (aka not-as5). Also, check if the second gpu's pump is still working. For now, open the door to keep it cooler. My testing had both cards at or slightly under 70c with the door off and the rads and stock - fans hanging outside.


I am not sure whether I dare to reapply TIM. I only have Noctua paste and acetone or Loctite 7063 solvents. I applied thermal paste to the 4790K no problem, but bare dies are a lot harder in my experience; I changed the TIM on my laptop with a 2630QM and GT555M and that thing overheated like a *****. When you say door, do you mean side panel or front panel? Also, I have thought about punching a hole in the rear of the case to route the cables through, but it's tricky since it's an AIO. And I was under the impression core 2 always runs a bit hotter because the water flow is configured in serial? It's at 68 and 74 at -40mV and at the very edge of not throttling at all. How do I test the pump?


----------



## Medusa666

Hi guys,

What is considered a normal idle temperature for this card?

I'm experiencing heavy throttling while gaming Metro Redux at 1440p. The card flattens out at 75.0C and doesn't go down.

The idle temperatures are 44-50c depending on ambient, I'm suspecting that the pumps may be malfunctioning.

Any ideas on how to test them? How can I make sure they work? The card has also been making a lot of trickling water noise lately, but it goes away under heavy load.


----------



## wermad

Quote:


> Originally Posted by *mortenv*
> 
> I am not sure whether I dare to reapply TIM or not, I only have noctua paste and acetone or loctite 7063 solvents. I applied thermal paste for the 4790k no problem but bare dies are a lot harder in my experience, I changed TIM on my laptop with 2630qm and gt555m and that thing overheated like a *****.. When you say door, do you mean side panel or front panel? Also I have thought about punching a hole in the rear of the case to route the cabels through but its tricky since its AIO? And I was under the impression core2 always run a bit hotter due to the water flow is serial configured? It's at 68 and 74 at -40mv and at the very edge of not throttling at all. How do I test the pump?


I did both my cards on water blocks. I've done bare dies and delids. As long as you do it properly and are using good TIM, no harm.








Quote:


> Originally Posted by *Medusa666*
> 
> Hi guys,
> 
> What is considered a normal idle temperature for this card?
> 
> I'm experiencing heavy throttling while gaming Metro Redux in 1440p. The card flats out at 75.0c and doesn't go down.
> 
> The idle temperatures are 44-50c depending on ambient, I'm suspecting that the pumps may be malfunctioning.
> 
> Any ideas on how to test them? How to make sure it works? The card have also been making a lot of trickling water noise lately, but it goes away during heavy load.


That's pretty decent. Tbh, Hawaii doesn't idle as well as Nvidia (low 30s) unless you're on custom water. I had idles in the low 40s with ambients in the mid-to-upper 20s on two stock cards. Load should be ~60-65C if you have good cooling. The stock cooling does a pretty good job, but it has to be set up right, otherwise you'll hit the thermal wall quickly. If your pumps were failing, you would see them hover in the 60s at idle. You can enable ULPS to drop the clocks at idle or low use, but this can hamper core performance once they're under load.


----------



## Insan1tyOne

Hello Everyone,

I just recently acquired a Diamond R9 295X2 in "like new" condition, but I have a few things that I would like to do to it and was just curious if it you all thought it would be worth it or not.

*R9 295X2 Upgrades:*


- Flash BIOS to Sapphire R9 295X2 OC Edition (1030 / 1300)
- Remove cooler and replace stock TIM with Gelid GC Extreme
- Remove center fan connector from GPU and connect it to a motherboard PWM header
- Use a 3-pin to dual 3-pin fan Y-splitter to do push/pull on the stock radiator using the R9 295X2 header (the fans I will be using are Phanteks PH-F120MP)

I think by doing all of those things I should be able to significantly lower the temps on the R9 295X2 and avoid any throttling at 75C while gaining a bit of performance. Your opinions are much appreciated though.

- Insan1tyOne


----------



## NBrock

Quote:


> Originally Posted by *Insan1tyOne*
> 
> Hello Everyone,
> 
> I just recently acquired a Diamond R9 295X2 in "like new" condition, but I have a few things that I would like to do to it and was just curious if it you all thought it would be worth it or not.
> 
> *R9 295X2 Upgrades:*
> 
> 
> Flash BIOS to Sapphire R9 295X2 OC Edition (1030 / 1300)
> Remove cooler and replace stock TIM with Gelid GC Extreme
> Remove center fan connector from GPU and connect it to motherboard PWM connector.
> Use 3-pin fan to dual 3-pin fan Y-splitter to do Push / Pull on the stock radiator using the R9 295X2 header. (The fans I will be using are Phanteks PH-F120MP)
> I think by doing all of those things I should be able to significantly lower the temps on the R9 295X2 and avoid any throttling at 75C while gaining a bit of performance. Your opinions are much appreciated though.
> 
> - Insan1tyOne


I did the PWM connector mod for the VRM fan as well as PK-1 on the GPUs and Corsair SP High Performance fans in push pull on the rad. It made a huge difference. I run the VRM fan on 53% seems to be a sweet spot in my case for noise. If I am benching I crank it up to 100%. Stays cool @ 1100 core and 1450 mem (right around 60-65*c in games).


----------



## elgreco14

Windows 10 and the new drivers were a good thing for my scores on 3DMark.

Before: 1150/1350 http://www.3dmark.com/3dm/7383619 (Overall 16608 and Graphics Score 23831)

After: 1100/1350 http://www.3dmark.com/3dm/8079962 (Overall 16709 and Graphics Score 24530)
1100/1425 http://www.3dmark.com/3dm/8080065 (Overall 16872 and Graphics Score 24966)
1150/1450 http://www.3dmark.com/3dm/8080397 (Overall 17212 and Graphics Score 25882)

Nice improvement
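For what it's worth, the relative gains in those Graphics Scores work out as follows; a quick sketch using only the numbers quoted above:

```python
# Percentage change between the 3DMark Graphics Scores listed above.
def pct_gain(before, after):
    return (after - before) / before * 100

# Closest near-comparison: 1150/1350 on the old drivers vs 1100/1350 on W10;
# the new drivers come out ahead even at a lower core clock.
print(round(pct_gain(23831, 24530), 1))  # 2.9
# Top result (1150/1450) vs the old 1150/1350 run (note: memory clock differs).
print(round(pct_gain(23831, 25882), 1))  # 8.6
```

Only the first pair is close to apples-to-apples; the second mixes in a memory overclock on top of the driver change.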


----------



## xer0h0ur

Why would you not benchmark using the same GPU/vRAM clocks to make a direct comparison? That is benchmarking 101.


----------



## wermad

Shipping bad psu








Evga has some b-stock on sale. Anyone have any experience?


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Why would you not benchmark using the same GPU/vRAM clocks to make a direct comparison? That is benchmarking 101.


YUP

Quote:


> Originally Posted by *elgreco14*
> 
> Windows 10 and the new drivers were a good thing for my scores on 3DMark.
> 
> Before: 1150/1350 http://www.3dmark.com/3dm/7383619 (Overall 16608 and Graphics Score 23831)
> 
> After: 1100/1350 http://www.3dmark.com/3dm/8079962 (Overall 16709 and Graphics Score 24530)
> 1100/1425 http://www.3dmark.com/3dm/8080065 (Overall 16872 and Graphics Score 24966)
> 1150/1450 http://www.3dmark.com/3dm/8080397 (Overall 17212 and Graphics Score 25882)
> 
> Nice improvement


Looks can be deceiving; retest 1150/1350 on W10.
Benchmarks should be apples to apples; you tested apples against horse apples.


----------



## SAFX

Quote:


> Originally Posted by *mortenv*
> 
> Not having read all 743 pages. How would you cool this card if it reaches 75C on gpu2 with ambient 23c on stock voltage and powerlimit? I have two noctua af140 intake fans and one noctua af140 blowing directly in front of the card. Then Corsair sp120 and stock fan blowing push/pull on the stock radiator. Cleaned out VRM heatsink with canned air etc. Case airflow is good as far as I know in the corsair 750d. It runs cool when I apply -40mv in afterburner, but I was aiming for a sligh gput OC
> 
> specs 4790k overclocked to 4,8 all cores 1,3vcore.
> corsair h100i AIO mounted on top.
> corsair 750d
> XFX 295x2 with radiator mounted in the back with sp120 and stock fan in push/pull
> Noctua af140 intake x3. modded case to properly allow 140mm airflow.
> msi mpower max ac, rm850
> corsair vengeance 32gb 2400mhz


From where does air flow _into_ the rad? inside the case, or fresh air from outside?
I had a similar problem, so I front-mounted the rad fan (AP-15), pulling cool air from outside, made a HUGE difference



100% GPU load, 10m, AIDA64


----------



## elgreco14

Quote:


> Originally Posted by *SAFX*
> 
> YUP
> looks can be deceiving, retest 1150/1350 on W10.
> Benchmarks should be apples to apples; you tested apples to horse apples


Hehe, I know, but it was to show that with 1150/1350 before, I scored lower than with 1100/1350 after the newer drivers and W10.


----------



## xer0h0ur

I am not far away from benching on Windows 10 myself so I will provide my own benches. I did however just upgrade from a 256GB Micron M550 SSD to a 1TB Samsung 850 Pro so I haven't the faintest idea if that will affect my scores. That would be the only actual change in my setup apart from operating system. Basically I am keeping my Win7 install on the M550 while I attempt to make the gaming switch-over to Windows 10 without being stuck with the headaches of early adoption. If things get wonky I just swap SSD trays back to the M550 and keep playing.


----------



## nodbone

Hello everyone! I just got myself a 295x2 and found this thread









here's my specs:

Asus Sabertooth z87
i7 4770k with EK supremacy evo block
8GB Kingston HyperX
295x2 with kryographics Vesuvius block and active backplate
Seasonic 1050 platinum
corsair 750d
xt45 360mm radiator - 3x 120mm corsair sp120 (push)
ut60 240mm radiator - 2x 120mm corsair sp120 (push)
xspc dual bay res with d5 vario
2 x front intake corsair 140mm
1 x exhaust in the back 140mm

Now my question is this: is it normal to have a 42C idle temp for the 295x2? Load temp is around 55C. Is that normal too? I was expecting idle to be more around 35C and load MAX 50C.
I was thinking of going push/pull on both rads; do you think it would be worth it?

One other question: what does the bios mini switch on the card do?

Thanks!


----------



## xer0h0ur

Quote:


> Originally Posted by *nodbone*
> 
> Hello everyone! I just got myself a 295x2 and found this thread
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here's my specs:
> 
> Asus Sabertooth z87
> i7 4770k with EK supremacy evo block
> 8GB Kingston HyperX
> 295x2 with kryographics Vesuvius block and active backplate
> Seasonic 1050 platinum
> corsair 750d
> xt45 360mm radiator - 3x 120mm corsair sp120 (push)
> ut60 240mm radiator - 2x 120mm corsair sp120 (push)
> xspc dual bay res with d5 vario
> 2 x front intake corsair 140mm
> 1 x exhaust in the back 140mm
> 
> Now my question is this: is it normal to have a 42 degrees Celsius IDLE temp for the 295x2 ? Load temp is arround 55. is that normal too? I was expecting idle to be more arround 35 and load MAX 50.
> I was thinking of going push pull on both rads; do you think it would be worth it?
> 
> One other question: what does the bios mini switch on the card do?
> 
> Thanks!


Well without accounting for any overclocks on your CPU and GPUs you're dumping 584 Watts of heat into that loop and you most assuredly aren't here because you're running a bone stock configuration. So taking into account those overclocks and that you're not running push/pull on the radiators then I would say that is reasonable.
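That 584 W figure is just the stock board powers summed; a back-of-the-envelope sketch (TDP values are the published stock specs, and overclocks push them well past this):

```python
# Rough stock heat load dumped into the loop (watts).
tdp_watts = {
    "R9 295X2": 500,   # AMD's rated board power for the dual-Hawaii card
    "i7-4770K": 84,    # Intel's rated TDP
}
total = sum(tdp_watts.values())
print(total)  # 584
```

With only five 120mm rad positions of cooling in push-only, mid-50s load temps on that much heat are about what you'd expect.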


----------



## wermad

I saw 40 idle on my cards when the ambient was a tad high in my old case.


----------



## Davehillbo

Hi all

I've got a review sample 295x2, and was wondering if there is any way to control voltage? Can only get 1090MHz on stock!

Cheers


----------



## NBrock

Quote:


> Originally Posted by *Davehillbo*
> 
> Hi all
> 
> Ive got a review sample 295x2, and was wondering if there is any way to control voltage? Can only get 1090Mhz on stock!
> 
> Cheers


You can use software like Sapphire Trixx or MSI Afterburner to adjust voltage.


----------



## Davehillbo

Apparently not in windows 10 :|


----------



## NBrock

Quote:


> Originally Posted by *Davehillbo*
> 
> Apparently not in windows 10 :|


What do you mean? I have Trixx working on Windows 10 Pro x64 with my 295x2. I just installed it like you normally would. Didn't have to do compatibility mode or anything.


----------



## MIGhunter

Is Trixx better than CCC? I have a 295x2 from Sapphire and have never used it before.


----------



## Davehillbo

I have no voltage control at all, only power percentage. It worked in Win8; it never did in 10.


----------



## NBrock

Quote:


> Originally Posted by *MIGhunter*
> 
> Is Trixx better than CCC? I have a 295x2 from Sapphire and never used it before


I use Trixx for voltage control in addition to overclocking.

I personally prefer it to MSI Afterburner. I have had too many issues with Afterburner causing crashes and instability.
I have had good results with my 7970, 290x, 295x2 and 290.
Quote:


> Originally Posted by *Davehillbo*
> 
> Have no voltage control at all , only power percentage. It worked in win8, never did in 10


That's odd. I know this may sound silly, but have you tried scrolling down? Sometimes it doesn't show everything without scrolling. If it isn't there, I would uninstall and then reinstall it. I have voltage control with mine in Windows 10.


----------



## Davehillbo

Yep, tried a fresh install as well. No voltage control AT ALL.


----------



## Alex132

Quote:


> Originally Posted by *Davehillbo*
> 
> Yep, tried fresh install as well. no voltage control AT ALL.


Trixx or MSI AB?

Trixx seems to work way better than MSI AB for AMD cards, I would recommend it.


----------



## Davehillbo

Both.


----------



## mortenv

Quote:


> Originally Posted by *SAFX*
> 
> From where does air flow _into_ the rad? inside the case, or fresh air from outside?
> I had a similar problem, so I front-mounted the rad fan (AP-15), pulling cool air from outside, made a HUGE difference
> 
> 
> 
> 100% GPU load, 10m, AIDA64


Front mounting is out of the question; as you can see, I've modified the front of the case to allow 140mm airflow.


Spoiler: Warning: Spoiler!







I just bought two Noctua NF-F12 iPPC-2000 fans to run push/pull on the 295x2 radiator. I used to run a Corsair SP120 and the stock fan in push/pull, but the bearings on those fans don't like lying flat; grinding noise. Maybe I can run 90-degree PVC tubing from the rear Noctua NF-A14 to the 295x2 radiator to provide cool air. What I won't do is run the 295x2 radiator as intake, because the card itself heats up like a mofo (the backplate, shroud and VRM chips are HOT!), so adding more heat to the case is a no-no IMO.

EDIT: FFS!! Ordered the Noctua NF-F12 iPPC-2000 instead of the iPPC-3000.


----------



## mortenv

Will a 3000rpm fan (Noctua NF-F12 iPPC-3000) spin at the full 3000rpm when connected to the 295x2, or will it stop at 2000rpm like the OEM fan? Got 4 iPPC-3000 fans and 2 iPPC-2000 fans arriving in the mail soon.


----------



## NBrock

Quote:


> Originally Posted by *mortenv*
> 
> Will a 3000 rpm fan (Noctua NF-F12 iPPC-3000) spin at full 3000 rpm when connected to the 295x2 or will it stop at 2000 rpm like the oem fan? Got 4 ippc-3000 fans and 2 ippc-2000 fans in the mailbox soon..


I just plugged mine into my motherboard so I could manage speed that way. It's also less power draw from the card if you do it that way.


----------



## mortenv

Quote:


> Originally Posted by *NBrock*
> 
> I just plugged mine into my motherboard so I could manage speed that way. It's also less power draw from the card if you do it that way.


I used a Y-split 4-pin PWM to 3-pin that I customized a bit, and it worked alright! No problems running two fans from the GPU as far as I know. I just used a Noctua Y-split cable and a 3-pin low-speed connector from Corsair; I removed the heatshrink plastic, removed the resistor, and removed one side of the connector so the Noctua Y-split fit nicely.


----------



## Medusa666

If you run a 295X2 in trifire with a 290X Lightning, does it matter if you have the 295X2 or the 290X in the first Pcie slot? And thus connected to the monitor?

Considering the Lightning is a stronger GPU, it makes sense to run it as primary. It won't throttle either.


----------



## wermad

It doesn't matter. Actually, in my setup, it detects either card as long as it's plugged in, and it boots up on that card. My BIOS also allows me to choose which slot video is assigned to.

I think CCC will assign the 295x2 as the primary card (but that does not mean it's the video output card). Not sure though, tbh.

Edit: once you go multi-monitor, I think it may matter.


----------



## Medusa666

Quote:


> Originally Posted by *wermad*
> 
> It don't matter. Actually, in my setup, it detects either card as long as it's plugged in and boots up on that card. My bios also allows me to choose the slot video will be assigned.
> 
> I think ccc will assign the 295x2 as the primary card (but does not mean its the video output card). Not sure though tbh.
> 
> Edit, once you go multi monitors, I think it may matter.


Ok, thank you, sounds great.

I was thinking more along the lines of: some games use only two cores in Xfire, others three, some four, etc. What I really wonder is whether the GPU core in the first PCIe slot, connected to the monitor, always gets 100% usage first and can then utilize one or two cores on the 295X2, or whether the 295X2 is an all-or-nothing kind of deal.

Maybe a weird question.


----------



## wermad

You can create profiles to disable the 295x2's xfire. AMD doesn't give you an easy option anymore. Though plugging in the Lightning and using that for video may make it the primary card, and once you disable xfire, it will run the 290X alone.


----------



## SAFX

Quote:


> Originally Posted by *mortenv*
> 
> so adding more heat to the case is a no no IMO.
> 
> EDIT: FFS!! Ordered the Noctua NF-F12 iPPC-2000 instead of the iPPC-3000


I thought the same; it's counter-intuitive, yes, but it lowers temps. You won't know until you try


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*


Uninstalled Trixx and the drivers, then reinstalled 15.7.1 but not Trixx, and OC'ed only from CCC; results are more consistent with yours.









Not sure if Trixx was the problem, but one thing I've noticed, regardless of OC tool: if you reinstall drivers, you're better off uninstalling Trixx/MSI first, then uninstalling/reinstalling the drivers, then reinstalling the OC tool. Keeping the same OC tool install, in my opinion, randomly screws things up; not sure how, but that's my tin-foil-hat take on the matter.


----------



## ENTERPRISE

Hey guys, I've had my 295x2 from MSI a little while, and I wondered if there is a decent unlocked-voltage BIOS, or a BIOS editing tool for editing my clocks? Been out of the GPU OC game for a while.


----------



## xer0h0ur

I am, for the time being, screwed on installing Windows 10 altogether. Microsoft, in their infinite wisdom, decided to include an Intel CPU microcode update in the Windows 10 ISO that is fubared for 4th-generation Intel processors, and of course I have a 4930K, so I am stuck. It causes a failure to boot into the installer after the first restart. KB3064209.

God knows when, if at all, I will be able to upgrade to Windows 10. I have wasted far too many hours already trying to get around this. I give up.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> God knows when if at all I will be able to upgrade to Windows 10. I have wasted far too many hours already trying to get around this. I give up.


I thought you planned on waiting until the kinks were worked out?
I plan on going with an entirely separate build, fully water cooled, on Windows 10, but keeping my current rig as the primary, this way I won't have to worry about issues should they arise.








I'm waiting until the end of year, I can't go much higher in terms of performance compared to my current build unless I move to dual GPU, but even then it's sorta overkill


----------



## mortenv

Yeah, Windows 10 is not "bulletproof" at the moment; I'm getting random kernel crashes (BAD_SYSTEM_CONFIG_INFO) and random full-system freezes with only the mouse working when opening certain files with certain apps, like opening a .pdf with SumatraPDF. The reboot button comes in handy.


----------



## xer0h0ur

Quote:


> Originally Posted by *SAFX*
> 
> I thought you planned on waiting until the kinks were worked out?
> I plan on going with an entirely separate build, fully water cooled, on Windows 10, but keeping my current rig as the primary, this way I won't have to worry about issues should they arise.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm waiting until the end of year, I can't go much higher in terms of performance compared to my current build unless I move to dual GPU, but even then it's sorta overkill


Noooo. I have two SSDs for this reason. The 1TB 850 Pro was meant to get Windows 10 Pro while I kept Windows 7 Ultimate on the 256GB M550. Instead I have two installations of Windows 7 Ultimate, lol.


----------



## mortenv

Welp, got the same BSOD upon starting my computer this morning, and now Windows 10 doesn't boot at all: black screen and idle temps, with the SSD access light blinking every 10 seconds. GG Microsoft. Going back to my imaged Windows 8.1 if the Windows 10 installation media can't fix it.


----------



## Medusa666

So uhm. Yeah, can you run different frequencies for the GPU cores and GPU memory in Crossfire?

I.e., can the 295X2 run at its stock 1018MHz and a 290X at 1150MHz, for example? Or would it be downclocked to match the 295X2 cores?


----------



## wermad

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Hey guys, have had my 295x2 a little while from msi and I wondered if there is a decent unlocked Voltage BIOS? Or BIOS editing tool for editing my clocks. Been out of the Gpu oc game for a while


Well, I haven't done any overclocking with these guys yet, but I see the Sapphire OC BIOS is very popular to flash (bumps it to 1030, I believe). For tools, Afterburner seems to get sketchy sometimes, so Trixx is coming up as a better alternative. As for a custom BIOS, not sure on this one, but I'm sure someone will chime in with more input.









----------



## xer0h0ur

There is no issue with running different clocks on each video card.


----------



## wermad

Quote:


> Originally Posted by *Medusa666*
> 
> So uhm. Yeah, can you run different frequencies for the GPU cores and GPU memory in Crossfire?
> 
> i,e can the 295X2 run at stock of 1018 MHz, and a 290X at 1150 MHz for ex? Or would it be downclocked to match the 295X2 cores?
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> There is no issue with running different clocks on each video card.

QFT. It's been a couple of gens, but AMD allows you to run each core independently while in Crossfire. It's no longer the case that everything has to slow down to the lowest-clocking core.


----------



## ENTERPRISE

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hey guys, have had my 295x2 a little while from msi and I wondered if there is a decent unlocked Voltage BIOS? Or BIOS editing tool for editing my clocks. Been out of the Gpu oc game for a while
> 
> 
> 
> Well, I haven't done any overclocking with these guys yet, but I see the sapphire oc bios are very popular to flash (bumps it to 1030 I believe). Tools, afteburner seems to get sketchy sometimes, so trixx is coming up as a better alternative. As for custom bios, not sure on this one but I'm sure someone will chime in with more input
> 
> 
> 
> 
> 
> 
> 
> .

After doing some investigating I have come to the conclusion that flashing the 295x2 is not the most simple process; easily doable, but not as simple as once thought. Looking at the gains from flashing something like the Sapphire BIOS, they are so little and not really worth the risk, so I think I will stick with software OC like Afterburner/Trixx

Thanks !


----------



## wermad

1100 seems doable, but I hear not every core will do this, especially on the stock cooler. I've seen 1200, but I'm sure that's much more silicon lottery. I'll try for 1100 once I get my psu sorted. I did use the CCC Overdrive w/ my old Tahiti setup, but those cards were not @ reference clocks (Lightnings BE).


----------



## SAFX

Can I use Asus GPU Tweak on a Sapphire 295x2?


----------



## wermad

I have it installed, but never used it. I would assume yes tbh, Asus sells AMD GPUs (especially the Ares III dual Hawaii







).


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> I have it installed, but never used it. I would assume yes tbh, asus sells amd gpu's (especially the ares iii dual hawaii
> 
> 
> 
> 
> 
> 
> 
> ).


Will give it a try tonight. Trixx is nice, but I hate the UI, being forced to scroll makes me sad


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> 1100 seems doable, but i heard not every core will do this, especially on the stock cooler. I've seen 1200 but I'm sure this is much more silicone-lottery. I'll try for 1100 once I get my psu sorted. I did use the ccc overdrive w/ my old Tahiti setup but these cards were not @ reference clocks (Lightnings BE).


1100 is child's play

1250 is where it's at









No but seriously, 1100 should be do-able on almost every 295X2 - they are binned higher than 290Xs after all (IIRC).


----------



## wermad

What's the highest any one has pushed a 295x2 without going to extreme cooling?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> What's the highest any one has pushed a 295x2 without going to extreme cooling?


Not sure, IIRC there are 2 of us in this thread that have done 1250 before. I haven't seen higher on water or stock.

I run stock 24/7 though.


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> 1100 is child's play
> 
> 1250 is where it's at
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No but seriously, 1100 should be do-able on almost every 295X2 - they are binned higher than 290Xs after all (IIRC).


Not mine. I must have got the ****tiest 295X2 on the planet, because I can't stably run 1100 24/7 while gaming. It's benchmark stable but nothing more than that. That is with +100mV and +50 power limit.









1090/1500 are my highest stable clocks for gaming.


----------



## wermad

Quote:


> Originally Posted by *Alex132*
> 
> Not sure, IIRC there are 2 of us in this thread that have done 1250 before. I haven't seen higher on water or stock.
> 
> I run stock 24/7 though.
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Not mine. I must have got the ****tiest 295X2 on the planet because I can't stably run 1100 24/7 while gaming. Its benchmark stable but nothing more than that. That is +100mV and +50 power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1090/1500 are my highest stable clocks for gaming.

What's your guys' ASIC?

When I had my tri 290s, guys in the 290/290X club said 1100 was very obtainable. Obviously, the VRM designs differ between the 290/290X and the 295x2, but I'm curious whether the ASIC mantra EVGA has adopted for the Ti KPE can be correlated to the higher-binned cores of the 295x2. I didn't get a chance to push my triplets, but they ran fine at the stock factory clock of 1000.


----------



## xer0h0ur

It's not the "ASIC quality". Both GPUs on the 295X2 have higher ASIC values than my 290X, yet the 290X overclocks higher. It's the power delivery: the 290X can feed more power to its GPU than the 295X2 can to either of its GPUs.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> What your guy's asic?


Not that it makes a huge difference IIRC:




My GTX 690 was ~69% and could OC amazingly on the core too.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its not the "ASIC quality". Both GPUs on the 295X2 have higher ASIC values than my 290X yet the 290X overclocks higher. Its the power delivery. The 290X can feed more power to the GPU than the 295X2 can to either of its GPUs.


Does OC'ing on the memory only help achieve higher OCs on the core? Or does the R9 2xx not share the same rail-balancing that the GTX 7xx has?


----------



## mortenv

For my 295x2 it's all about dissipating heat; I don't think it's voltage limited. For example, I can run -40mV at 1030MHz core and 1625 on mem; any more voltage and it throttles


----------



## wermad

Just for kicks


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> No but seriously, 1100 should be do-able on almost every 295X2 - they are binned higher than 290Xs after all (IIRC).


Alex, how do you go about OC'ing? I can't seem to get above 1100 GPU clock, 1250 mem clock. Here's what I'm doing using ASUS GPU Tweak:

((Power target 50%))
1) Increase gpu clock by 10Mhz
2) Run quick stability
3) Repeat
(it's stable up to 1100)

At this point, how do I test max mem clock? Do you leave gpu clock at 1100, or reset to 1018, then start increasing mem clock?


----------



## gatygun

Quote:


> Originally Posted by *SAFX*
> 
> Alex, how do you go about OC'ing? I can't seem to get about 1100 gpu clock, 1250 mem clock. Here's what I'm doing using ASUS GPU Tweak:
> 
> ((Power target 50%))
> 1) Increase gpu clock by 10Mhz
> 2) Run quick stability
> 3) Repeat
> (it's stable up to 1100)
> 
> At this point, how do I test max mem clock? Do you leave gpu clock at 1100, or reset to 1018, then start increasing mem clock?


I usually overclock the core first and see how far it goes, then push it back to stock and overclock the memory to see how far that goes.

After I've got both values, I lower them both a bit, push them together, and see if things are stable.

If you don't get artifacts or anything else in quick benchmarks like 3DMark, just game with it for a match in BF4 and see if graphical issues show up.

If it's stable, keep moving forward until it gets unstable, then turn the clocks down a bit and keep playing with it for days to see if problems surface. Every day a bit of fine tuning.

After a week or so you've got a pretty stable setup.
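That stepwise approach is basically a linear search for the highest stable clock. A rough sketch in Python, where `is_stable` is a stand-in for a real benchmark pass (Heaven/3DMark plus an artifact check), and the 1110 MHz cutoff is a made-up example, not a real card's limit:

```python
def find_max_stable(start_mhz, limit_mhz, step_mhz, is_stable):
    """Walk the core clock up in small steps until the stability test
    fails, then settle on the last step that passed. `is_stable` stands
    in for an actual benchmark + artifact check, not a real API."""
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Hypothetical card that gives out above 1110 MHz: starting from the
# 1018 MHz stock clock in 10 MHz steps, the search lands on 1108 MHz.
best = find_max_stable(1018, 1300, 10, lambda mhz: mhz <= 1110)
print(best)
```

The "lower both a bit and run them together" step matters because core and memory stability interact; the individually-found maximums usually aren't jointly stable.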


----------



## BootPirate

My audio is making crackling sounds and sounds like it lags along with the video that I'm watching, or game that I'm playing, ever since I installed the new drivers.
Has anyone else experienced this?


----------



## Alex132

Quote:


> Originally Posted by *SAFX*
> 
> Alex, how do you go about OC'ing? I can't seem to get about 1100 gpu clock, 1250 mem clock. Here's what I'm doing using ASUS GPU Tweak:
> 
> ((Power target 50%))
> 1) Increase gpu clock by 10Mhz
> 2) Run quick stability
> 3) Repeat
> (it's stable up to 1100)
> 
> At this point, how do I test max mem clock? Do you leave gpu clock at 1100, or reset to 1018, then start increasing mem clock?


Use Trixx, bump power+ by 50%, bump core by whatever (I did 10Mhz at a time when I was starting off - then just guessed 1200 and it stuck so yeah) and set it to +50mV for 1100Mhz core and +80mV for 1200 core, +110mV for 1250Mhz core. I can actually set it to 0% power and -10mV for 1050/1500









Programs I used to test stability were Heaven 4.0, 3DMark and games in general. (Ark, War Thunder, Hyperdimension).


----------



## SAFX

Quote:


> Originally Posted by *Alex132*
> 
> I can actually set it to 0% power and -10mV for 1050/1500


Interesting, I guess that says memory overclocking is less power-demanding than core overclocking?


----------



## xer0h0ur

Generally speaking, yes, but GDDR5 vRAM does account for a good chunk of the card's power draw. Like a third of it.


----------



## mortenv

???


----------



## xer0h0ur

Asking a specific question instead of just posting question marks is usually the way to get answers. If you're trying to get higher overclocks on the vRAM, you need to extend the official overclocking limits. If Afterburner is going wonky on you with the clocks, you'll have to remove it without keeping any settings and reinstall it.


----------



## mortenv

I am wondering why I can run -40 at 1030, but if I want 1080 or 1100 I need something like +40 or +50 (an +80 to +90mV swing).


----------



## Dagamus NM

That is totally what I assumed you meant by the three question marks.

What is there to wonder? If that is what your card takes, then that is what it takes. A 90-millivolt swing for 70MHz just means you now know what clocks you can get stable at which voltages.

Are you asking if this is within expected range?


----------



## mortenv

Quote:


> Originally Posted by *Dagamus NM*
> 
> Are you asking if this is within expected range?


Yes. I haven't done a lot of testing due to thermal limits, but tomorrow I'm getting some better fans, so I'm wondering whether that big an increase in voltage is what it takes to run stable, provided the cooling is adequate. I would like to get about 1100 core because some games do not play nice with Crossfire.
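For what it's worth, a swing like that is plausible near the top of the voltage/frequency curve: dynamic power (and therefore heat) scales roughly with f·V², so small clock gains cost disproportionate voltage and heat. A purely illustrative back-of-the-envelope sketch, assuming a 1.25 V stock voltage (an assumption, not a measured value for this card):

```python
def relative_dynamic_power(f_mhz, v_volts, f0_mhz=1018.0, v0_volts=1.25):
    """Dynamic switching power scales roughly as f * V^2; return power
    relative to an assumed stock operating point (illustrative only)."""
    return (f_mhz / f0_mhz) * (v_volts / v0_volts) ** 2

# 1030 MHz at -40 mV vs. 1100 MHz at +50 mV, offsets from the assumed 1.25 V
low = relative_dynamic_power(1030, 1.25 - 0.040)
high = relative_dynamic_power(1100, 1.25 + 0.050)
print(round(low, 2), round(high, 2))
```

With these assumed numbers the 1100 MHz point dissipates over 20% more dynamic power than the undervolted 1030 MHz point, which is why better fans help more than extra millivolts once the cooler saturates.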


----------



## xer0h0ur

You're still doing better than me. As I noted in my earlier post, my 295X2 needs +50 power limit and +100mV to stabilize at 1090MHz GPU clocks. I just got bad silicon; can't do anything about the silicon lottery. I can push 1100MHz for benchmarking without any adverse effects, but I will crash sooner or later gaming at that clock speed.


----------



## blue1512

My tip for games w/o Crossfire: disable the slave GPU in Device Manager and OC the hell out of the remaining one. It's even better on the 295x2, with two GPUs sharing one AIO loop; you literally end up with a higher-binned 290X on an AIO when one GPU is disabled.
Quote:


> Originally Posted by *mortenv*
> 
> Yes, I haven't done a lot of testing due to thermal limits, but tomorrow I'm going to have some better fans, and therefore I'm wondering if that big of an increase in volt is what it takes to run stable, provided the cooling is adequate. I would like to get about 1100 core because some games do not play nice with crossfire.


----------



## mortenv

Quote:


> Originally Posted by *blue1512*
> 
> My tip for games w/o crossfire: disable the slave gpu in device manager and OC the hell out of the remaining one. Even better in 295x2 with two Gpu sharing one aio loop. Literally you will have a higher bin 290x with the aio when one Gpu is disabled


Terrific! Will Afterburner freak out when disabling one of them? Do I just disable the second one in devmgr, or whichever one doesn't make the screen go black?


----------



## blue1512

Quote:


> Originally Posted by *mortenv*
> 
> Terrific! Will Afterburner freak out when disabling one of them? Do I just disable the second one in devmgr or whichever doesn't make the screen black?


In most cases it's the second one, and you need to turn off Afterburner beforehand


----------



## ENTERPRISE

Thus far I have 1120 Core and 1500 Memory at 80+Mv


----------



## fat4l

I'm doing:
GPU1 - 1140MHz/1550MHz +60mV
GPU2 - 1180MHz/1550MHz +100mV

The final voltage value is the same even though the +mV offsets differ: ~1.3V under load.
Can't go more than +60mV on GPU1 because then DisplayPort issues start (black screening).


----------



## ENTERPRISE

Quote:


> Originally Posted by *fat4l*
> 
> Im doing:
> GPU1 - 1140MHz/1550MHz +60mV
> GPU2 - 1180MHz/1550MHz +100mV
> 
> Final vaue of voltage is the same despite the fact that +mV differ.... ~1.3V in load.
> Can't go more than +60mV on GPU1 cuz then, displayport issues start(Blackscreening).


Nice, I may do some more tweaking tonight if I feel my current OC is stable when playing a few games.


----------



## SAFX

Quote:


> Originally Posted by *fat4l*
> 
> Im doing:
> GPU1 - 1140MHz/1550MHz +60mV
> GPU2 - 1180MHz/1550MHz +100mV
> 
> Final vaue of voltage is the same despite the fact that +mV differ.... ~1.3V in load.
> Can't go more than +60mV on GPU1 cuz then, displayport issues start(Blackscreening).


You're OC'ing cores individually?


----------



## fat4l

Quote:


> Originally Posted by *SAFX*
> 
> You're OC'ing cores individually?


yep, with afterburner


----------



## jg900ss

New owner of 295x2.

SITUATION:
I need advice on the DVI-D connector and 120/144Hz refresh. I have a BenQ XL2730Z with FreeSync support. The 295x2 worked fine for 2 weeks using a DP1.2 cable connected to the BenQ, running at a 144Hz refresh rate set in Windows 8.1 Pro, with FreeSync working under the latest Catalyst 15.7 drivers. System is a 5930K at stock, Noctua NH-U12S cooler, 32GB 2133 QVL RAM, MSI X99S MPOWER motherboard, Corsair AX860i PSU, Samsung 850 EVO 500GB SSD boot drive, 2 x WD 3TB drives for data.

What I would like to know is whether it is possible to connect the BenQ to the 295x2 using DVI-D and set a refresh rate of 120Hz, since my understanding is that DVI-D will not support 144Hz. Is this possible? My DisplayPort connection has become troublesome and inconsistent (black screen on boot, now regularly, instead of a normal boot to the login screen). So I am evaluating just using the DVI-D connection for now, but with a higher refresh rate. Yes, I know it's probably time to RMA the card, and I will most likely do that this week, but in the meanwhile, would this be possible?

Thanks.


----------



## wermad

If your DP isn't working properly, you may have a card issue. Try a different mini DP port, or maybe it's your cable or adapter (if you're using mini-to-standard DP). Still no solution? RMA if possible. As for the refresh rate, it's up to your monitor depending on the connection. I'm not 100% sure, but DVI-D can't push 120Hz @ 1440p (nor can HDMI 1.4); I believe it doesn't have the bandwidth for it. I may be wrong, so I'm sure someone else can answer this, but after dabbling with 3x1, 5x1, MST hubs, and now 4K, DisplayPort is the better connection overall and should work properly in your case unless something is wrong.
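The bandwidth point checks out on a napkin: dual-link DVI tops out around a 330 MHz pixel clock (2 x 165 MHz TMDS links), and 1440p at 120Hz needs far more than that. A rough sketch; the ~20% blanking overhead is an approximation (real CVT/CVT-RB timings vary):

```python
def pixel_clock_mhz(w, h, hz, blanking_overhead=0.20):
    """Rough required pixel clock: active pixels per second plus an
    assumed ~20% blanking overhead (real CVT timings vary)."""
    return w * h * hz * (1 + blanking_overhead) / 1e6

DUAL_LINK_DVI_MAX_MHZ = 330.0  # 2 x 165 MHz TMDS links

need_120 = pixel_clock_mhz(2560, 1440, 120)  # way past dual-link DVI
need_60 = pixel_clock_mhz(2560, 1440, 60)    # fits under 330 MHz
print(round(need_120), round(need_60))
```

So with these rough numbers, 1440p60 fits over dual-link DVI but 1440p120/144 needs DisplayPort (or HDMI 2.0), which is why getting the DP connection sorted (or RMA'd) is the real fix.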


----------



## ANGELPUNISH3R

Hey guys, just wondering if anyone can help me. When I'm running games on my 295x2, the GPU usage is really low. It has lots of spikes but is probably averaging about 20% on both cores, so for example BF4 only averages around 40 fps @ 1080p. It's happening in all games, not just BF4. I reinstalled the drivers but I'm not really sure what's causing it. Does anyone have a solution for this, or has anyone had this problem before and solved it?


----------



## wermad

Download and install sapphire Trixx (or msi afterburner) and disable ulps. See if that helps increase load.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hey guys, have had my 295x2 a little while from msi and I wondered if there is a decent unlocked Voltage BIOS? Or BIOS editing tool for editing my clocks. Been out of the Gpu oc game for a while
> 
> 
> 
> Well, I haven't done any overclocking with these guys yet, but I see the sapphire oc bios are very popular to flash (bumps it to 1030 I believe). Tools, afteburner seems to get sketchy sometimes, so trixx is coming up as a better alternative. As for custom bios, not sure on this one but I'm sure someone will chime in with more input
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> After doing some investigating I have come to the conclusion that Flashing the 295x2 is not the most simple process, easily do able but not as simple as once though. Looking at the gains from flashing something like the sapphire BIOS, they are so little and not really worth the risk, so I think I will stick with software OC like Afterburner/Trixx
> 
> Thanks !

The Sapphire OC Bios allows you to add upto +300mV in Trixx, The stock Bios (at least on my XFX card) allowed +100mV in afterburner and no voltage control in Trixx.

There are gains to it but only if you are overclocking it fairly well


----------



## ANGELPUNISH3R

Yeah, I tried that; it didn't change anything. I don't really understand it, because when I run BF4 at 1080p it sits around 60 fps, but in 4K it runs at pretty much the same 60 fps. The GPU load is higher at 4K, but it still spikes up and down; I don't see why the GPU load should be constantly dipping to zero at any point. I've made sure the frame limiter in CCC is off, so it's not trying to limit frames.


You can see the GPU load; that's at 4K.


And that's at 1080p. Barely using the GPU.

When I use my other PC, most games never drop below 70% and never go down to zero GPU load.


----------



## ENTERPRISE

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hey guys, have had my 295x2 a little while from msi and I wondered if there is a decent unlocked Voltage BIOS? Or BIOS editing tool for editing my clocks. Been out of the Gpu oc game for a while
> 
> 
> 
> Well, I haven't done any overclocking with these guys yet, but I see the sapphire oc bios are very popular to flash (bumps it to 1030 I believe). Tools, afteburner seems to get sketchy sometimes, so trixx is coming up as a better alternative. As for custom bios, not sure on this one but I'm sure someone will chime in with more input
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> After doing some investigating I have come to the conclusion that Flashing the 295x2 is not the most simple process, easily do able but not as simple as once though. Looking at the gains from flashing something like the sapphire BIOS, they are so little and not really worth the risk, so I think I will stick with software OC like Afterburner/Trixx
> 
> Thanks !
> 
> 
> The Sapphire OC Bios allows you to add upto +300mV in Trixx, The stock Bios (at least on my XFX card) allowed +100mV in afterburner and no voltage control in Trixx.
> 
> There are gains to it but only if you are overclocking it fairly well

Oh right I did not know that, in all honesty the current stock BIOS I have is fine for overclocking as I wont be replacing the AIO cooler and it does not do all that well when even maxing 100mv...so anything above that is likely a no go anyway.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hey guys, have had my 295x2 a little while from msi and I wondered if there is a decent unlocked Voltage BIOS? Or BIOS editing tool for editing my clocks. Been out of the Gpu oc game for a while
> 
> 
> 
> Well, I haven't done any overclocking with these guys yet, but I see the sapphire oc bios are very popular to flash (bumps it to 1030 I believe). Tools, afteburner seems to get sketchy sometimes, so trixx is coming up as a better alternative. As for custom bios, not sure on this one but I'm sure someone will chime in with more input
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> After doing some investigating I have come to the conclusion that Flashing the 295x2 is not the most simple process, easily do able but not as simple as once though. Looking at the gains from flashing something like the sapphire BIOS, they are so little and not really worth the risk, so I think I will stick with software OC like Afterburner/Trixx
> 
> Thanks !
> 
> 
> The Sapphire OC Bios allows you to add upto +300mV in Trixx, The stock Bios (at least on my XFX card) allowed +100mV in afterburner and no voltage control in Trixx.
> 
> There are gains to it but only if you are overclocking it fairly well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh right I did not know that, in all honesty the current stock BIOS I have is fine for overclocking as I wont be replacing the AIO cooler and it does not do all that well when even maxing 100mv...so anything above that is likely a no go anyway.

I've done +167mV for 1200/1500 benching on the stock cooling









But you're right, anything above +100mV is too much for the stock cooler in 24/7 stuff


----------



## jg900ss

Thanks wermad. I suspected as much. The DP connections WERE working for about 2 weeks, and I was careful to research and order a mini-DP connector/cable that was CERTIFIED DP1.2. There are plenty of fakes out there, so it's worth checking. My next try this week will be the mini-DP to DVI adapter, just to see if the ports are offering any signal at all. After that, it's RMA time.


----------



## Alex132

In MSI AB I could only go up to +50mV. I thought anything more was risky for the card?

Trixx is up to +300mV however.


----------



## xer0h0ur

Ahhhh. So that is how people are pumping that much juice to the 295X2. Not sure if I want to flash my BIOS though.


----------



## mortenv

Latest MSI afterburner does +100 without any tricky stuff.


----------



## xer0h0ur

Afterburner has always given 100mV to the 295X2 without flashing the BIOS as far as I know. I was talking about being able to juice it up to 300mV. In other words far more than Afterburner will.


----------



## wermad

Lepa shipping out my rma unit







. I'm still more inclined to run two units tbh. I'll see how things roll out.
Quote:


> Originally Posted by *ANGELPUNISH3R*
> 
> Yeh i tried that, didnt change anything. I dont really understand because when i run BF4 at 1080p it sits around 60 fps but then in 4k it runs at pretty much the same 60 fps. The gpu load will be higher under 4k but still spike up and down like i dont see why at any point the gpu load should be contsantly dipping to zero. I've made sure that that frame limiter in CCC is off so its not trying to limit frames.
> 
> 
> You can see the GPU load thats at 4K.
> 
> 
> And thats at 1080p. Barley using the gpu.
> 
> When i use my other pc most games never drop below 70% and never go down to zero gpu load.


1080p can get weird with top-end AMD cards. I don't know why, but if you look at the reviews, AMD can get squirrely there. Once you start pushing beyond WQHD, though, it starts to really shine. If you have 4K or Eyefinity, try running that (or WQHD). If you don't (though from your testing it seems you do), try disabling Crossfire by creating a profile for the game. This is what folks end up doing when they're getting xfire issues, especially @ 1080p.
Quote:


> Originally Posted by *jg900ss*
> 
> Thanks WERMAD. I suspected as much. The DP connections WERE working for about 2 weeks, and I was certain to research and order a mini-DP connector/cable that was CERTIFIED DP1.2. There are plenty of fakes out there, so its work checking. My let try this week will be the mini-DP to DVI adapter just to see if the ports are offering any signal at all. After that, its RMA time.


Np and glad to help. DisplayPort may not be popular, but it's your best friend as long as you need it and it works. When I had my 5x1 1200 (60Hz) array, one monitor would continuously go out or start flickering. It would drop my resolution as the monitor lost communication with the system. The good news was that the issue came from a bad DP cable; these were OEM units I bought off eBay and I had a couple of spares, and swapping it in fixed everything. Since you need a lot of bandwidth for your res and refresh rate, DisplayPort is the way to go, but having mini DP on the card means cables and adapters. If possible, look for a mini DP to DP cable (keep it to 6'/2m) to reduce the number of adapters. One thing I do recall: Hawaii had a lot of issues with active (powered) DP to DVI-D/HDMI adapters. I lost track of the guys having those issues; both they and I gave up when AMD hadn't addressed it, and they moved on to something different (either native DP monitors or 4K).


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ahhhh. So that is how people are pumping that much juice to the 295X2. Not sure if I want to flash my BIOS though.


Stock BIOS. Use Trixx.


----------



## MIGhunter

I know this thread is more about OCing, but are any of you running Windows 10 with CCC? I tried this morning and it gave me the black screen of death after reboot. I had to revert back to Windows 7


----------



## xer0h0ur

Quote:


> Originally Posted by *Alex132*
> 
> Stock BIOS. Use Trixx.


I really need to stop dragging my feet on trying Trixx. I have never even downloaded it since I have been steadily using Afterburner.
Quote:


> Originally Posted by *MIGhunter*
> 
> I know this is more about ocing but are any of you running Windows 10 with the CCC? I tried this morning and it gave me the black screen of death after reboot. I had to revert back to windows 7


I can't even install Windows 10. I always get the failed OS boot error.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MIGhunter*
> 
> I know this is more about ocing but are any of you running Windows 10 with the CCC? I tried this morning and it gave me the black screen of death after reboot. I had to revert back to windows 7


Works fine for me.

Are you using the Win 10 driver?


----------



## MIGhunter

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Works fine for me.
> 
> Are you using the Win 10 driver?


Yea, I updated to Win 10. It rebooted and loaded everything just fine. Then I got a message about the new 15.7.1 drivers, so I installed them. When it finished, it rebooted my system into a black screen that never loaded after that. I found a bunch of threads about it on the net. People are saying it's the xfire, others are saying it's the ULPS, but all in all it's something to do with the 15.7.1 driver. I was able to get my computer to work in safe mode, but even after uninstalling the card and its drivers via Device Manager, it still wouldn't load. I had to revert back to Win 7 to get it to work again.

https://community.amd.com/thread/184727


----------



## Sgt Bilko

Quote:


> Originally Posted by *MIGhunter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Works fine for me.
> 
> Are you using the Win 10 driver?
> 
> 
> 
> Yea, I updated to Win 10. It rebooted and loaded everything just fine. Then I got a message about the new 15.7.1 drivers so I installed them. When it finished, it rebooted my system into a black screen that never loaded after that. I found a bunch of threads about it on the net. Ppl are saying it's the xfire, others are saying it's the upls but all in all it's something to do with the 15.7.1 driver. I was able to get my computer to work in safe mode but even after uninstalling the card and it's drivers via the equipment manager, it still wouldn't load. I had to revert back to win 7 to get it to work again.
> 
> https://community.amd.com/thread/184727

Interesting,

I was on 8.1 with the 15.7 driver then i did an update to win 10 and installed the 15.7.1 Win 10 driver straight over the top of it.


----------



## bobbavet

Gday Guys

May I have some help please. I bought the Coolermaster V1000, but cannot see how I can wire this.

I see there are 2x 8-pins labeled as CPU, and I need 1 for my mobo.

That leaves me with 1x 8-pin and 4x 6-pin PCIe. The card needs 2x 8-pins. :/

Any help appreciated.

Cheers Bob


----------



## Alex132

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> May I have some help please. I bought the Coolermaster V1000, but cannot see how I can wire this.
> 
> I see there are 2x 8-pins labeled as Cpu and I need 1 for my mobo.
> 
> That leaves me with 1x8pin and 4x6pin pcie. The card has 2x8pins. :/
> 
> Any help appreciated.
> 
> Cheers Bob
> 
> 


I'll show you a picture of mine when I get home. But basically, don't use the daisy-chained 8-pin connectors.

I have 2 separate 8-pins plugged into my PSU, with each individual cable going to a separate 8-pin PCI-E connector on the card.

The 2 CPU ones are separate, and the PCI-E cables are 6+2, not 6-pin.

This PSU is awesome for this card; even under full CPU (5GHz) and GPU (1200 core) load when overclocked, the air coming out the back is cool and the fan never ramps up


----------



## bobbavet

Thankyou

Mmmm, so do I put a 6+2 into the 6-pin PCIe socket on the PSU and then put the 2 pins in the next 6-pin socket along?


----------



## Alex132

Quote:


> Originally Posted by *bobbavet*
> 
> Thankyou
> 
> Mmmm So do I put in a 6+2 into the 6pin pcie on the PSU and then put 2 pins in the next 6pin socket along?


One side should say "PSU" and the other "6+2", plug the "PSU" side into the "PSU". It should be 6pin only.










You can also see where to put the 6pin (PSU-side) PCIE cables on the PSU front here (top left female ports)


----------



## sheldor1

Hey guys, I'm a recent "member" but very unhappy with my Club 3D 295x2.

Installation seemingly worked, but once I put load on the GPU, through a game or 3DMark, the PC just shuts down after a minute or two (with the usual fan-spinning phase after shutdown, normal for that PSU).

I already replaced the mainboard because it was a rather ****ty micro-ATX that didn't have the right specs, but the problem persists.
To make sure it wasn't a Windows-related problem, I fired up Linux, installed the ATI drivers (everything works, including Crossfire) and started Cities: Skylines, which led to the same issue.
Even though it's a single-rail PSU, I tried splitting the load and using different connectors, but to no avail.
I even flipped the BIOS switch to see if that would make a difference, but after the (necessary) reinstall of the Catalyst drivers, the same issue remains.
I also ruled out overheating as the issue; the R9 is well within acceptable temps, and the CPU as well.
So I'm running out of ideas now. The PSU has been used in reviews to power two of those cards, and they didn't seem picky at all about splitting up the current beyond the usual dual PCIe cable.

Any suggestions?

Specs:

Windows 10 / Ubuntu UEFI

Gigabyte Z97P-D3 board
i7-4790K @ 4 GHz
2x8 1600 Kingston DDR3
Be Quiet! Power Zone 1000W single rail PSU
Crucial SSD


----------



## Alex132

Are the cables you're using daisy-chained, and have you tried using different connectors on the PSU itself? I know my HX850's hard-wired connectors would cause the PSU to shut down under load, while the modular ones would not.


----------



## sheldor1

I tried all that with my modular PSU. Not that the specs mention multiple rails or anything, but you never know. So no, daisy-chaining is not the issue.


----------



## bobbavet

Thanks Alex

Yes I have this cable. Are you saying I need only 1 of these cables and plug the 2 6+2 ends into the GPU?

Alternatively I have 2 Silverstone pcie cables

They are 8pin on one end and 6+2 on the other.

Are you saying I don't have to use the extra +2 pins, just plug in the 6-pin?

I thought it might be like this.


----------



## Dagamus NM

Quote:


> Originally Posted by *sheldor1*
> 
> I tried all that with my modular PSU. Not that the specs would mention multiple rails or something but you never know, so no, being daisy chained is not be the issue.


If you are running a single or dual cable from two separate outputs on the PSU to the separate connectors on the board then it sounds like something is overheating and causing your system to shut down.

I went through a frustrating time last month with a CPU block that couldn't seat due to using the wrong mounting studs.


----------



## bobbavet

Got my card up and running using a single 6pin with the dual 6+2 plugs.

Having my first AMD prob.

Installed drivers and get this when I try to open Catalyst.

"Catalyst Control Center cannot be started. There are no settings that can be configured using Catalyst Control Center"


----------



## Dagamus NM

Which OS and which version of catalyst?


----------



## bobbavet

Latest for Win8 64.
Think I have found the solution.

You have to do a reinstall from the AMD install directory on C:

Will try it tomorrow.

Now for a water block. I can get an EK but no backplates available in Australia.









Is it hard to use the original backplate?

It's that or get a Koolance from the States, but with exchange rates I could only afford the refurb direct from Koolance.


----------



## Alex132

Quote:


> Originally Posted by *bobbavet*
> 
> Thanks Alex
> 
> Yes I have this cable. Are you saying I need only 1 of these cables and plug the 2 6+2 ends into the GPU?


6pin goes into the PSU end, the 6+2 goes to the GPU.

Quote:


> Originally Posted by *bobbavet*
> 
> Got my card up and running using a single 6pin with the dual 6+2 plugs.
> 
> Having my first AMD prob.
> 
> Installed drivers and get this when I try to open Catalyst.
> 
> "Catalyst Control Center cannot be started. There are no settings that can be configured using Catalyst Control Center"


Don't do this! AMD specifically says not to use daisy-chained PSU cables. Sending 70+ A through one cable is dangerous.

I used 2 *entirely* separate cables, one for each PCIe plug on the GPU. So 2 PCIe cables coming from the PSU to the GPU.

Bad pics, but you should get the gist of what's going on
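As a rough sanity check on why separate cables matter (the wattage figures below are assumptions for illustration, not official AMD numbers), the arithmetic looks like this:

```python
# Back-of-the-envelope current check: why two separate PCIe cables matter
# for a card in this power class. All figures are rough assumptions for
# illustration, not AMD's official numbers.

def amps(watts, volts=12.0):
    """Current drawn on the 12 V rail for a given power."""
    return watts / volts

board_power_w = 500          # ballpark 295X2 gaming load (assumption)
slot_power_w = 75            # PCIe slot contribution per spec
cable_power_w = board_power_w - slot_power_w

one_cable = amps(cable_power_w)       # everything on one daisy-chained cable
per_cable = amps(cable_power_w / 2)   # split across two separate cables

print(f"one cable: {one_cable:.0f} A, split: {per_cable:.0f} A per cable")
```

Under overclocking and transient spikes the draw is higher still, which is where a single daisy-chained cable gets into trouble.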


----------



## bobbavet

It's OK. I was only in the BIOS and installing Windows. Will double-cable tomorrow.

Pretty sure I can use these single Silverstone PCIe cables.

I just want to clear this up:

I can use an EK block with the stock backplate?


----------



## Dagamus NM

Yes, go back a couple pages. Mega Man and I beat this topic well and truly to death.


----------



## Alex132

Quote:


> Originally Posted by *bobbavet*
> 
> Its OK. I was only in bios and installing windows. Will double cable tommorow.
> 
> Pretty sure I can use these single Silverstone pxie cables.


Just use the CM ones? I am. You don't need to use other cables...


----------



## PCModderMike

Been sitting on my parts for a rebuild for about two months now. Things are settling down now, so it's time to move forward. Can't wait to get back under water.


----------



## ur4skin

Ahh...I really wish those waterblocks and cpu blocks were cheaper here in SA.


----------



## Alex132

Quote:


> Originally Posted by *ur4skin*
> 
> Ahh...I really wish those waterblocks and cpu blocks were cheaper here in SA.


SA? South America?


----------



## xer0h0ur

South Africa.

Edit, wrong thread. He may mean South America. Mixed him up with someone else.


----------



## wermad

Quote:


> Originally Posted by *sheldor1*
> 
> Hey guys, Im a recent "member" but very unhappy with my Club 3D 295x2.
> 
> Installation seemingly worked but once I put load on the GPU, through a game or 3D Mark, the PC just shuts down after a minute or two (with the usual fan spinning phase after shutdown, normal for that PSU).
> 
> I
> 
> already replaced the mainboard because it was a rather ****ty micro atx which didnt have the right specs but the problem persists.
> To make sure it wasnt a Windows related problem, I fired up Linux, installed the ati drivers (everything works including crossfire) and started Cities Skylines, which led to the same issue.
> Even though its a single rail PSU, I tried splitting the load and using different connectors but to no avail.
> I even flipped the bios switch to see if that would make a difference but after (necessary) reinstall of Catalyst drivers, the same issue remains.
> I also ruled out overheating as issue. the R9 is well within acceptable temps and the CPU as well.
> So Im running out of ideas now. The PSU has been used in reviews to power two of those cards and they didnt seem picky at all about splitting up the current beyond the usual dual pcie cable.
> 
> Any suggestions?
> 
> Specs:
> 
> Windows 10 / Ubuntu UEFI
> 
> Gigabyte Z97P-D3 board
> I7 4970k @ 4 GHz
> 2x8 1600 Kingston DDR3
> Be Quiet! Power Zone 1000W single rail PSU
> Crucial SSD


Mine was doing the same thing even though I assigned each card the appropriate rails. Turned out it was the PSU, and I should be getting my replacement tomorrow.


----------



## bobbavet

Quote:


> Originally Posted by *Dagamus NM*
> 
> Yes, go back a couple pages. Mega Man and I well beat this topic to death.


Well, I am sorry, bloke. I just don't wanna fk up.

I went back some pages, but people were confused, thinking I want to put an EK backplate on a Koolance. I don't.

I wanted to know if the stock backplate would go with:

1. A Koolance full cover. The answer was yes (thanks wermad).

2. An EK full cover. I am still not sure. Will I need to source screws, or do the screws supplied allow you to reinstall the stock backplate?

I am also trying to find what thickness the heat pads are on the stock backplate, as I will replace them too.

Maybe if this keeps coming up, a sticky can be added to OP. Maybe a WC FAQ.

Once again sorry.

Cheers Bob

OK, found this. So the screws are OK, just different heads.
Quote:


> Originally Posted by *Dagamus NM*
> 
> The ek screws fit the stock backplate. They are hex head but I have stock backplates on ek blocks. My ek backplates are in but I haven't taken the time to install them yet.


----------



## SAFX

EK's are so sexy, but I wish they had LEDs like XSPC.


----------



## Dagamus NM

Quote:


> Originally Posted by *bobbavet*
> 
> Well I am sorry bloke. I just don't wanna fk up.
> 
> I went back some pages, but people were confused thinking I want to put an EK back plate on a Koolance. I dont.
> 
> I wanted to know if the stock backplate would go with an EK full cover. I am still not sure. Will I need to source screws or do the screws supplied allow you to reinstall the stock backplate.
> 
> OK found this. So screw ok just different heads.


It does, again the screws made for the EK backplate fit it just fine. I had some extras from other back plates. The short ones went right in but the longer ones that come with backplates had to be used so I added some nuts to make them work.


----------



## bobbavet

Quote:


> Originally Posted by *Dagamus NM*
> 
> The ek screws fit the stock backplate. They are hex head but I have stock backplates on ek blocks. My ek backplates are in but I haven't taken the time to install them yet.


Thank you


----------



## wermad

Well, FedEx is gonna make me wait. From City of Industry, SoCal, the package should be an easy trip down to San Diego. Well, the PSU is on its 3rd day heading to AZ. I'm sure it's going to their SoCal hub by tomorrow and it won't make it down here. Since they don't deliver on Monday for their home service, I'm looking at Tuesday. Wow... but it's not the first time I've had packages take a detour. What is normally a 2-3 hour drive is taking a week. Getting impatient, and more reason to move on to a different PSU make.

edit: eh, I'm just po'd today. More crappy news from our soon-to-be-divorcing-us football team, cracked a molar, hurt my back randomly, and still out w/ illness. I guess going back to work soon will help me unload some of these things from my mind and I can then enjoy my rig next week.

I'm thinking of ditching Z97 but I'm not sure what to move on to. Maybe a new board and a Haswell chip that can do better than this DC.


----------



## bobbavet

Have been testing out the card and system all day.

All sweet except my 5820K is not too good a clocker. Only [email protected] 4mhz


----------



## Alex132

Quote:


> Originally Posted by *bobbavet*
> 
> All sweet except my 5820K is not to good a clocker. Only [email protected] *4mhz*


That does seem like a very bad CPU


----------



## Faoust

Well, I'd like to think I qualify now, since I am running dual R9 295X2's. This is what she looks like.

Intel® Core™ i7-4770K Processor (8M Cache, 4.60 GHz);
ASUS ROG MAXIMUS VI FORMULA Intel® Z87 Chipset;
16GB, G.SKILL Ripjaws X Series 16GB (4 x 4GB) 240-Pin SDRAM 2400 MHz DDR3;
Samsung 840 EVO 500GB 2.5" Solid State Drive, Read: 540MB/s, Write: 520MB/s;
2x Sapphire ATI Radeon R9 295X2 Dual GPU (8GB)(Crossfire)


----------



## Dagamus NM

^^^Best GIF avatar ever!

Now that I realize your "Here she is" comment is about your computer a few thoughts.

Is your lower card's rad down below it and in the front of the case? If so, are you using it to exhaust out the front, or is the heat dumping into the case? You might find it noisy, with any air in the CLC being trapped in the pumps.

The RAM fans never made enough of a difference to overcome the amount of noise generated. I have removed them from all of the computers I had them in; just because a RAM kit came with them didn't mean they turned out to be something I wanted to use. Your tastes and needs might be different than mine. I kind of liked the way they looked, but have since moved on to EK waterblocks for the RAM.


----------



## Faoust

Quote:


> Originally Posted by *Dagamus NM*
> 
> ^^^Best GIF avatar ever!
> 
> Now that I realize your "Here she is" comment is about your computer a few thoughts.
> 
> Is your lower card's rad down below it and in the front of the case? If so are you using it we exhaust out the front or is the heat dumping into the case? You might find it noisy with any air in the clc being trapped in the pumps.
> 
> The ram fans never made enough of a difference to overcome the amount of noise generated. I have removed them from all of the computers I had them in. Just because a ram kit came with it didn't turn out to be something I wanted to use. Your tastes and needs might be different than mine. I kind of liked the way they looked but have since moved on ek waterblocks for the ram.


Ya, top card goes up and out the back exhaust, and the bottom is low down and out the front, since there isn't enough room for both rads with the push/pull setup that's on them.
A little view of the temps... under a light load.


----------



## Elmy

Quote:


> Originally Posted by *Faoust*
> 
> Well I'd like to think i qualify now, since I am running dual R9 295x2's. This is what She looks like.
> 
> Intel® Core™ i7-4770K Processor (8M Cache, 4.60 GHz);
> ASUS ROG MAXIMUS VI FORMULA Intel® Z87 Chipset;
> 16GB, G.SKILL Ripjaws X Series 16GB (4 x 4GB) 240-Pin SDRAM 2400 MHz DDR3;
> Samsung 840 EVO 500GB 2.5" Solid State Drive, Read: 540MB/s, Write: 520MB/s;
> 2x Sapphire ATI Radeon R9 295X2 Dual GPU (8GB)(Crossfire)

Welcome to the 2 X 295X2 Club


----------



## Medusa666

Quote:


> Originally Posted by *Elmy*
> 
> Welcome to the 2 X 295X2 Club


Hey man can you add me too? : )

Edit: Realized you are not the TS lol : P


----------



## wermad

Well, turns out I either got the wrong tracking # from Enermax or they sent it to the wrong location. Since their support is in Asia (maybe Taiwan?), I probably won't get a reply until Monday.
Quote:


> Originally Posted by *Faoust*
> 
> Well I'd like to think i qualify now, since I am running dual R9 295x2's. This is what She looks like.
> 
> Intel® Core™ i7-4770K Processor (8M Cache, 4.60 GHz);
> ASUS ROG MAXIMUS VI FORMULA Intel® Z87 Chipset;
> 16GB, G.SKILL Ripjaws X Series 16GB (4 x 4GB) 240-Pin SDRAM 2400 MHz DDR3;
> Samsung 840 EVO 500GB 2.5" Solid State Drive, Read: 540MB/s, Write: 520MB/s;
> 2x Sapphire ATI Radeon R9 295X2 Dual GPU (8GB)(Crossfire)
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I hear these NZXT cases are bad for air cooling; how's your load? Make sure you stay under 75°C to avoid the thermal throttle-down









Quote:


> Originally Posted by *Dagamus NM*
> 
> ^^^Best GIF avatar ever!
> 
> Now that I realize your "Here she is" comment is about your computer a few thoughts.
> 
> Is your lower card's rad down below it and in the front of the case? If so are you using it we exhaust out the front or is the heat dumping into the case? You might find it noisy with any air in the clc being trapped in the pumps.
> 
> The ram fans never made enough of a difference to overcome the amount of noise generated. I have removed them from all of the computers I had them in. Just because a ram kit came with it didn't turn out to be something I wanted to use. Your tastes and needs might be different than mine. I kind of liked the way they looked but have since moved on ek waterblocks for the ram.


Lol, don't get too excited for too long, the mods will catch it and have him change it. It happens all the time. Even a mod had to change his avatar pic (non gif animated) cuz it was probably too sexy. I've avoided sexy girl pics because of this. Anime works great and can start good conversations with other members.

Stock xfire, I prefer to run a board like the VI Extreme for that extra bit of slot spacing. I'm still unsure what to get as I don't want to spend a lot. Thinking of just going back to X79 since it's becoming very cheap these days. Saw an EVGA X79 Dark go for $130 (EVGA) and a RIVBE for $200...very tempting, and less tweaking vs IB. I'm not jumping on Skylake as that was a total bust imho. The early leaks were too good to be true (lame).


----------



## Dagamus NM

Definitely go RIVBE with a 3930K or 3970X if you can score one.

As far as the GIF, yes, it will probably be pulled soon. Lame, but I get it. That specific one is just more hypnotizing than any other I have seen. My three-year-old son totally stopped talking about dinosaurs and stared for a good minute before his mind drifted back to his T-Rex.


----------



## Faoust

Actually, all the rads have push/pull fans. Also, the side of the case never gets put on, so they grab cool air direct from outside. My cards never go above 54°C.


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> Definitely go RIVBE with a 3930K or 3970X if you can score one.
> 
> As far as the gif, yes it will probably be pulled soon. Lame but I get it. That specific one is just more hypnotizing that any other I have seen. My three year old son totally stopped talking about dinosaurs and stared for a good minute before his mind drifted back to his T-Rex.


I had a lot of issues with the BE, specifically the BIOS. I ended up trading it for a RIVE + CPU.

Got the RIVE and WS at the top of my X79 list. CPU-wise, early 3820s tend to clock high, and I pretty much don't need anything higher than a quad tbh.
Quote:


> Originally Posted by *Faoust*
> 
> Actually All the rads have push/pull fans. As well the side of the case Never gets put on, so they grab the cool air direct from outside. My cards never go above 54C.


Damn, that's wc temps. AC must be cranked to tundra levels! Noyce


----------



## Faoust

Quote:


> Originally Posted by *wermad*
> 
> Damn, that's wc temps. AC must be cranked to tundra levels! Noyce


Well I do live in Canada... so ya know....


----------



## bobbavet

Gday guys

Anyone got an idea of the thickness of the heat pads underneath the stock backplate? Maybe take note when installing a full-cover block.

I know it's pedantic, but I like to do a refresh in my block installations.

Cheers Bob


----------



## wermad

My guess is 0.5mm, as 1.0mm would be too thick for a thinly made backplate (imho). Since it's a passive thing, I would say there would be little to no impact even if you did not have the right pad; the stock cooler or water block takes the majority of the thermal soak. I wish I could look into mine, but both are still plumbed into my loop and my cards are right side up (reverse ATX). I don't think it's 0.7mm (Koolance is probably the only one still using this size) and I haven't seen 0.3mm since the days of Danger Den.

I'm sure someone will come up with the correct size for you soon


----------



## bobbavet

Thanks wermad. I didn't think to just have a look between the card and plate for an approximation. herpa derp.

Though the plate does have a slight lip.

Might just get a square of .5, 1 and 1.5. Wouldn't hurt to have it kicking around in my kit bag.

This stuff doesn't have a use by date does it? lols

Koolance have marked the waterblock as shipped within 6 hours. Impressive.


----------



## wermad

Don't think so, as long as it's got the protective films on them. Even then, most pads hold up well without the wrap. Some cheaper stuff goes crusty after some time, but others stay malleable enough to reuse. Koolance should give you enough pad for the block, with enough left over to redo it at least once.

here are my blocks before they went on a few months ago:







The bridge is a triple (one spacing between). Koolance does not make any blocking plates but fortunately, they include the o-rings w/ each bridge setup. I just used a couple of pieces of acrylic and some m4 bolts/nuts to block off the middle ports. My board doesn't have the "Z77 layout" and I didn't wanna plunk more money into a new bridge. For two cards on blocks, the bridges provide a good amount of stiffness. I had planned some Bitspower Aqualinks but they ended up sagging enough for me to skip and go w/ the diy blocking plates for my triple bridge.

Word of caution: Koolance does suffer from nickel flaking if you run silver. Check with them on what they recommend. I run pure distilled water with no additives, and have done so for many different builds. It's been a while, but they never changed their plating method (like EK did a few years ago), so take care with the silver.


----------



## Dagamus NM

The Fujipoly Ultra Extreme gets pretty dry and crusty. That said, you can always brush a little thermal paste onto a crusty pad to liven it up.

For the stock plate you can use 0.5mm for the RAM chips and VRMs; the back of the PLX chip gets 1.5mm, as does the strip of caps between the VRMs.


----------



## wermad

Does the rear have a pad between the PLX chip and the stock backplate?


----------



## bobbavet

I don't know but am watching some Youtube vids atm.

Does it hurt to layer heat pads? ie layering .5's to get 1 and 1.5 pads?


----------



## wermad

Nope, I've done it in the past (GTX 480s and a 580 3GB) when I ran out of pad. I didn't see any thermal difference on the cores, though I didn't monitor the VRMs tbh. The Koolance pad is 0.7 and 1.0mm, I believe. You should have plenty left over, or head to a shop (or eBay for the cheap stuff that's laptop-approved).


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Does the rear have pad for the plx chip and the stock backplate?


I put it on there. When I pulled it this morning the stock plate definitely made strong contact with the 1.5mm pad over the back of the PLX. Exactly what you have marked in red.


----------



## bobbavet

Quote:


> Originally Posted by *bobbavet*
> 
> HAve been testing out the card and system all day.
> 
> All sweet except my 5820K is not to good a clocker. Only [email protected] 4ghz


Quote:


> Originally Posted by *Alex132*
> 
> That does seem like a very bad CPU


Got it all sorted. My Crucial Ballistix 2133 OEM RAM had no XMP, so I had to set the timings manually in the BIOS.

5820K @4.4ghz [email protected]


----------



## bobbavet

Quote:


> Originally Posted by *wermad*
> 
> Word of caution, Koolance does suffer from nickel flaking if you run silver. Check with them on what they recommend. I run pure distilled water with no additives, and have done so for many different builds. Its been a while but they never changed their plating method (like EK did a few years ago) so care with the silver.


Just seen this in your post. That is cool, as I run distilled water only, never any additives.

I have had my current loop for 3 years and never had any problems with algae or discoloring.
The only thing I have seen is the slightest of slight calcified haze on the clear res.
I have only had to do a little top-up once in that time.


----------



## ENTERPRISE

Hey guys,

Quick question. I have flashed the Sapphire BIOS to my 295X2, both the Master & Slave BIOS, effectively GPU 0 and GPU 1. Do I need to also flip the switch on my card to the slave position and flash again?

Worth noting that I flashed the Master & Slave with the same BIOS file, which I think may be the issue. I am trying to locate cool mike's post with the details.

Thanks !


----------



## ENTERPRISE

Don't worry, I have found it. Other than the voltage increase, does the Sapphire BIOS add any more benefits?


----------



## mortenv

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Dont worry I have found it, other than the benefits of Voltage Increase, does the sapphire bios add any more benefits ?


Also, does it decrease voltage at stock speeds? I can easily run -40mV at 1030MHz core and 1625MHz mem, but it would be nice to have 1030MHz core and 1300MHz mem at -40mV


----------



## kayan

It has been a while since I last posted, and I'm having yet more problems with another game (although BF4 is sorted after the previous suggestion, thanks). I reinstalled Star Wars: TOR and it runs like crap. I'm on the newest 15.7.1 drivers. I'm seeing frames ranging from 15-25 in cutscenes to 30-60 everywhere else, except in my ship where I'm mostly seeing 20s. This is ridiculous for this game and card.

Does anybody else play and have these issues?


----------



## fat4l

Guys, what's your opinion on the Asus Ares III?

It's a custom PCB, more power connectors, better power regulators etc., but does it really affect overclocking vs a "normal" 295X2?
Does anyone here have it?
What's the max voltage/clock you can get with it?

----------



## bobbavet

The Ares III is a dud for the money. It's not a custom PCB like the Ares II; it's pretty much a 295X2 with an EK block and a factory overclock. Anyone with a water-cooled 295X2 should be able to achieve Ares III performance.

OC3D review.


----------



## fat4l

Quote:


> Originally Posted by *bobbavet*
> 
> Ares III is a dud for the money. It's not a custom pcb like the Ares II. It's pretty much a 295x2 with a EK block and a custom over clock. Anyone with a WC 295x2 should be able to achieve Ares III performance.
> 
> O3CD Review.


Well, it is a custom PCB for me, with more phases etc.


vs



and the card is much wider....


----------



## xer0h0ur

Quote:


> Originally Posted by *bobbavet*
> 
> Ares III is a dud for the money. It's not a custom pcb like the Ares II. It's pretty much a 295x2 with a EK block and a custom over clock. Anyone with a WC 295x2 should be able to achieve Ares III performance.
> 
> O3CD Review.


Yeah I don't know where you're getting that from but the Ares III is a custom PCB with actual design differences from the regular 295X2 by way of power input + delivery and the card's video outputs. As far as I know the only waterblocked 295X2 that is identical was the Cryovenom.


----------



## bobbavet

I posted the link. Fair enough; I stand corrected. Still, it was never worth the price tag for the gains, imo.


----------



## wermad

The PowerColor Devil 13 & Ares III are both custom 290X x2 cards.

eBay has some 295X2s under $500... Looks like the Fury X2 is coming soon and people are dumping their cards.


----------



## fat4l

I'm only interested in the fact that the Ares III has better, more stable power delivery etc. How does it clock? Can you do 1200MHz easily? 1300MHz?
What are the mV limitations? (Afterburner or ASUS GPU Tweak)


----------



## ENTERPRISE

Quote:


> Originally Posted by *wermad*
> 
> Pc devil 13 & Ares iii are both custom 290X-x2.
> 
> Ebay has some 295x2 under $500...Looks like FuryX x2 it's coming soon and ppl are dumping their cards.


That does not surprise me. Usually I would be counted among those people, but I am going to sit tight until HBM has come out of its baby phase, and the 295X2 will hold me nicely until that point, I feel.


----------



## xarot

Quote:


> Originally Posted by *fat4l*
> 
> Guys, whats ur opinion on Asus Ares III ?
> 
> Its custom pcb, more power connectors, better power regulators etc but does it rly affect overclocking vs "normal" 295x2?
> Does anyone here have it ?
> Whats max voltage/clock u can get with it ?


I think I ran mine at 1150 MHz, but I don't recall the voltage; it was much higher than stock. It's not really that impressive a card, but the good things are the copper EK waterblock and that it looks great in a case. I used an Asus ROG Swift monitor, so I had many issues back then because 144 Hz just didn't work properly. I still have the Ares in my closet, waiting for a project or motivation.


----------



## wermad

Finally heard back from Enermax/Lepa support; they said they would ship out my PSU today or tomorrow. Kind of dry in their reply, since they either gave me the wrong # or shipped my PSU to the wrong location. Oh well, I'm crossing my fingers something goes out today and maybe it arrives tomorrow or in the next few days. Unfortunately, due to my illness, I suffered a wee bit of an accident and threw out my back. It's hard for me to sit down now, so I won't get a chance to game.

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Pc devil 13 & Ares iii are both custom 290X-x2.
> 
> Ebay has some 295x2 under $500...Looks like FuryX x2 it's coming soon and ppl are dumping their cards.
> 
> 
> 
> That does not surprise me, usually I would be counted among those people but I am going to sit tight until HBM has come out of its baby phases and the 295X2 will hold me nicely until that point I feel.
Click to expand...

I love the quads, they really give me a boost in 4K and they get a good workout. Glad I "upgraded" my cooling solution. They were depreciating quite slowly when I got mine, and after waiting a month, I just went for them.


----------



## xer0h0ur

Quote:


> Originally Posted by *wermad*
> 
> Pc devil 13 & Ares iii are both custom 290X-x2.
> 
> Ebay has some 295x2 under $500...Looks like FuryX x2 it's coming soon and ppl are dumping their cards.


The only thing coming soon is the R9 Nano. They have not even stated a month for FuryX2 release.


----------



## kayan

Quote:


> Originally Posted by *xer0h0ur*
> 
> The only thing coming soon is the R9 Nano. They have not even stated a month for FuryX2 release.


When is Nano's ETA? Any word on pricing? I want one for my wifey.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> The only thing coming soon is the R9 Nano. They have not even stated a month for FuryX2 release.


Last rumor I heard, something in the fall. And this p/r pic btw:



edit: ppl will start dumping gear early trying to beat future depreciation. Once ebay gets flooded, you'll see some sellers lower the price to sell asap. Hence, why we are probably seeing sub $500 295x2. The card was holding value strongly for a good few months. I'm guessing the anticipation of a new x2 or just going w/ a new setup.


----------



## ENTERPRISE

Getting a custom BIOS to squeeze every bit of performance out of my 295x2 I think : http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50

will hold me nicely until the newer X2 cards with HBM2.


----------



## Alex132

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Getting a custom BIOS to squeeze every bit of performance out of my 295x2 I think : http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50
> 
> will hold me nicely until the newer X2 cards with HBM2.


Only thing I dislike about overclocking the 295X2 (apart from the crazy artifacts I get because hurrdurr memory clocks spazz between 2d and 3d (apparently)) is the VRMs. In terms of both power delivery and heat.


----------



## magicase

Does anyone know where I can find a bios mod for 295x2? I can only get 1080/1600 stable on my card which to me seems pretty crap....


----------



## ENTERPRISE

Quote:


> Originally Posted by *magicase*
> 
> Does anyone know where I can find a bios mod for 295x2? I can only get 1080/1600 stable on my card which to me seems pretty crap....


There are other BIOSes for the card, yes... what extra voltage are you putting in to get to that OC?


----------



## xer0h0ur

Quote:


> Originally Posted by *kayan*
> 
> When is Nano's ETA? Any word on pricing? I want one for my wifey.


Last info we were given by AMD was that it would launch this month. At this point its looking like the end of the month if its really going to be August.


----------



## magicase

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> Does anyone know where I can find a bios mod for 295x2? I can only get 1080/1600 stable on my card which to me seems pretty crap....
> 
> 
> 
> There are other BIOS's for the card yes...what extra Voltage are you putting in to get to that OC ?
Click to expand...

I'm just using CCC to do the OCing, so I'm assuming it's stock voltage.

If I use TriXX, what voltage should I put in?


----------



## wermad

Psu comes in tomorrow







...I hope...


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicase*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> Does anyone know where I can find a bios mod for 295x2? I can only get 1080/1600 stable on my card which to me seems pretty crap....
> 
> 
> 
> There are other BIOS's for the card yes...what extra Voltage are you putting in to get to that OC ?
> 
> Click to expand...
> 
> I'm justing CCC to do the OCing, so I'll assuming it's stock voltage.
> 
> If i use TRIXX what voltage should I put?
Click to expand...

My card does 1090/1500 on stock voltage, +100mV for 1150/1500 and +167mV for 1200/1500.

No, your card isn't crappy


----------



## ENTERPRISE

Quote:


> Originally Posted by *magicase*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> Does anyone know where I can find a bios mod for 295x2? I can only get 1080/1600 stable on my card which to me seems pretty crap....
> 
> 
> 
> There are other BIOS's for the card yes...what extra Voltage are you putting in to get to that OC ?
> 
> Click to expand...
> 
> I'm justing CCC to do the OCing, so I'll assuming it's stock voltage.
> 
> If i use TRIXX what voltage should I put?
Click to expand...

I would actually use MSI Afterburner personally. It would seem it is because you are running on stock volts, and your OC as it stands on stock volts is not bad at all. In Afterburner, set the Power Limit right up to +50 to ensure a higher power limit for your card, which will help stabilize your overclock. The top slider in Afterburner is your voltage control; I find it best to increase this in increments of 10 mV until your chosen OC is stable and artifact-free.
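The tuning loop described above can be sketched as pseudocode. Here `is_stable` stands in for whatever stress test you run (3DMark, Heaven, a game session checked for artifacts); the function name and the 200 mV ceiling are made up for illustration, not any real Afterburner API:

```python
def find_stable_offset(is_stable, max_offset_mv=200, step_mv=10):
    """Walk the core-voltage offset up in 10 mV steps (power limit already
    at +50) until the chosen clocks pass the stability test."""
    for offset_mv in range(0, max_offset_mv + 1, step_mv):
        if is_stable(offset_mv):
            return offset_mv  # first offset where the OC runs artifact-free
    return None  # OC not reachable within a sane voltage range

# Example: suppose the card happens to need at least +30 mV to hold the clocks
print(find_stable_offset(lambda mv: mv >= 30))  # prints 30
```

In practice each `is_stable` check is a manual benchmark run, so the loop is you at the keyboard, but the search order is the same.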


----------



## gupsterg

Quote:


> Originally Posted by *Sgt Bilko*
> 
> My card does 1090/1500 on stock voltage, +100mV for 1150/1500 and +167mV for 1200/1500.


One thing I've noted with the 290/X, 390/X and 295X2: stock roms use EVV (Electronic Variable Voltage) / fuse-based values for DPM 1-7. From what I understand there is a leakage ID for each GPU; GPU-Z shows this as ASIC quality. Then some more calcs go on based on SCLK (the GPU default clock in the rom), which in turn sets a VID. Only DPM 0 is a fixed voltage, ie what we see at idle.

So one gpu may have a differing VID to another; thus when we state "I added x to get x speed" we may not be comparing like-for-like voltages. IMO the only way to get a good comparison is to state the MAX voltage we're seeing in GPU-Z / MSI AB for VDDC / GPU core voltage.

This VDDC is a drooped amount anyway. The Stilt's VID app in post 1 of the Hawaii bios editing thread will give the DPM 7 VID; if we then add the offset we're setting in MSI AB to this, it gives a good indication of the set amount.

Another thing is that a rom / voltage control chip can have a preprogrammed voltage offset, applied to all DPM states (0-7); that's what is occurring when we set a core voltage offset in MSI AB (or other oc apps). The Stilt's app / driver ignores the gpu core voltage offset in its EVV calcs IMO, but it is added to those values anyhow when set.

Some info by the Stilt on some of the above: Post 1 , Post 2 , Post 3 .

Also, the Stilt states 295X2 (Hawaii XT(L)) chips are among the highest quality / lowest leakage parts.

Another good post RE low leakage GPUs; this also IMO explains a little why the 295X2 only has 4 phases per GPU. As they should be the lowest leakage GPUs, they can take a higher VDDC resulting in lower amps drawn, thus requiring a less beefy VRM compared with the 290/X, which can be higher leakage (ie higher ASIC quality) with lower VDDC, resulting in more amps drawn.
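To make the comparison concrete, here is a minimal sketch of the arithmetic gupsterg describes; the VID and offset figures are hypothetical, not measured values from any real card:

```python
# Hedged sketch: estimating the effective DPM 7 voltage as the fused VID
# plus the software offset, per the post above. All numbers are made up.
def effective_vid(dpm7_vid_mv, ab_offset_mv):
    """Fused DPM 7 VID plus the MSI AB core-voltage offset, in mV."""
    return dpm7_vid_mv + ab_offset_mv

# Two hypothetical cards with different fused VIDs reach the same
# effective voltage with different offsets, so "+100mV" alone is not
# a like-for-like comparison between cards.
card_a = effective_vid(1200, 100)
card_b = effective_vid(1250, 50)
print(card_a, card_b)  # 1300 1300
```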


----------



## magicase

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> Does anyone know where I can find a bios mod for 295x2? I can only get 1080/1600 stable on my card which to me seems pretty crap....
> 
> 
> 
> There are other BIOS's for the card yes...what extra Voltage are you putting in to get to that OC ?
> 
> Click to expand...
> 
> I'm justing CCC to do the OCing, so I'll assuming it's stock voltage.
> 
> If i use TRIXX what voltage should I put?
> 
> Click to expand...
> 
> I would actually use MSI Afterburner personally. It would seem that it is because you are running on stock volts and your OC as it stands on stock volts is not bad at all. In Afterburner what you want to do is set the Power Limit right up to +50 to ensure a higher power limit delivery to your card which will help stabilize your Overclock. Now on the top slider of afterburner you will find your voltage control, I find it best to increase this in increments of 10 until your chosen OC is stable and artifact free.
Click to expand...

Thanks for the info. Will try later today.


----------



## magicase

Is there any reason why MSI AB voltage allows me to go +100 and no more? Is there any way to go beyond that?


----------



## Sgt Bilko

Quote:


> Originally Posted by *magicase*
> 
> Is there any reason why MSI AB voltage allows me to go +100 and no more? Is there any way to go beyond that?


Use Sapphire Trixx instead


----------



## Lordevan83

Are the 290x and 295x2 capable of outputting 10-bit color, or do I need workstation cards for 10-bit?


----------



## Dagamus NM

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Use Sapphire Trixx instead


Trixx isn't letting me touch the voltage, whereas AB did.

I ran my first Heaven run after getting everything back together and scored 3033 at 1440p with tessellation set to extreme and 8xAA. Any increase in core clock results in a drop in score, artifacts at 1107MHz, and lots of micro stutter. The funny thing is that on the GPU-Z monitor, core #3 shows the occasional spike to 210C, then runs normally at 50-55C under load. I know this is an error, but it happens repeatedly. Maybe I gorked something when putting my cards together.

Anyhow, I will download AB sometime soon and increase the voltage there. I just won't run the OSD or install Riva.


----------



## stevenn102

First off, forgive me because I'm new here so I'm sure there is a lot I need to learn.

I'm not sure anyone else is having issues like this, but I recently purchased a Sapphire R9 295X2 and have been having a ton of issues with my computer either shutting off when starting almost any game, or stuttering and artifacting like crazy. I was previously using two GTX 780s with 6GB of VRAM each and had no issues, but I wanted to switch over to a single card with dual GPUs and had been eyeing the 295X2 for a while, waiting for prices to come down a little. I found one at a good price and grabbed it; before I installed it, I restored my computer to ensure there would be no leftovers of anything that could conflict with the new card (going from Nvidia to AMD in particular). Once I installed the card I installed the latest drivers, currently 15.7.1, and had way too many issues to play any game, as mentioned above. I uninstalled those drivers using DDU and installed the latest beta drivers, currently 15.6, and that cleared up some of the issues, like stuttering in GTA V, but Far Cry 4 either crashes my system entirely while trying to load or artifacts too much to be playable. Unigine Heaven will also crash my system. And 9 times out of 10 there is no error message or BSOD, just my computer turning off and turning back on. I know my PSU can handle this card; I've turned off OCP in Corsair Link and that helped a little, but not nearly enough to make anything playable.

Any suggestions?

My build:
i7-5930k - cooled with an H100i GTX
Sapphire R9 295X2 - stock cooling
16 GB HyperX Predator DDR4-2133
ASRock Fatal1ty X99M Killer
Samsung 840 250 GB SSD
Samsung 840 EVO 500 GB SSD
WD Green 2 TB HDD
Corsair AX1500i PSU
OS: Windows 10 Pro 64-bit
Monitor: LG 34UM95 (3440x1440)


----------



## xer0h0ur

I believe you have to change the amperage of your 12V rails in Corsair Link. Not disable OCP. Each 8-pin cable should also be connecting to its own 12V rail that isn't being used by anything else.

http://www.corsair.com/en-us/blog/2014/may/setting-up-ocp-on-the-ax1500i

"In Link, there are one of two methods that can be used. One is to check all of the OCP check boxes and turn the OCP up to the maximum of 40A. The other method that can be used is to turn off the OCP altogether. To do this, you must check all of the boxes for OCP and then uncheck them. After no more than 15 seconds, OCP will be disabled on the AX1500i power supply."


----------



## stevenn102

Quote:


> Originally Posted by *xer0h0ur*
> 
> I believe you have to change the amperage of your 12V rails in Corsair Link. Not disable OCP. Each 8-pin cable should also be connecting to its own 12V rail that isn't being used by anything else.
> 
> http://www.corsair.com/en-us/blog/2014/may/setting-up-ocp-on-the-ax1500i
> 
> "In Link, there are one of two methods that can be used. One is to check all of the OCP check boxes and turn the OCP up to the maximum of 40A. The other method that can be used is to turn off the OCP altogether. To do this, you must check all of the boxes for OCP and then uncheck them. After no more than 15 seconds, OCP will be disabled on the AX1500i power supply."


I changed it from 30A to 40A and was still having the same issue. It wasn't until I turned off OCP that it started working better. I have each 8-pin PCIE cable plugged into its own port and I'm not using any pigtails. It's funny, because I contacted Corsair and Sapphire about this and they said using pigtails was fine!


----------



## wermad

I'm shopping for a new psu or psu's. I thought ocp on the ax1500 was individually controlled. Or is this a multi-rail unit, like the Seasonic 1250w that turned out to be multi and not single?

I'm having a hard time finding a pair of units (used), as the second one needs to be under 180mm in length to clear my rads.


----------



## stevenn102

From what I've read and been told, the ax1500i can be set to either single rail or multi rail, and if it's set to multi rail, each rail can be controlled individually.


----------



## wermad

Interesting....

...

Just missed out on an EVGA 1600w gold, new for ~$240 on ebay. Still hunting; I've got a severely limited budget, since most of it is to upgrade the second system in my case. I'm just not comfortable with the lepa, especially since it only has a few months left on the warranty. It did pass 3d11 X and FS X with no issues with everything stock.


----------



## stevenn102

It seems that the AX1500i isn't as good as it's made out to be, especially for power hungry systems that draw a lot of power from the GPU. I'm thinking about switching over to the EVGA 1600W or the Rosewill 1600W Hercules. I'm curious if that will solve my issues with the 295x2..


----------



## wermad

Fyi: the rosewill and coolmax 1600 units have weird rail distribution imho. Two rails: #1 @ 72 amps and #2 @ 50 amps. The atx and fixed pcie cables go to rail #1 and the cpu to rail #2. Kinda tricky with two 295x2's.

There's a coolmax on hardforum.com for $120 + shipping.


----------



## stevenn102

Quote:


> Originally Posted by *wermad*
> 
> Fyi: The rosewill and the coolmax 1600 units have weird rail distribution imho. Two rails, #1 @72 amps and #2 @50 amps. Atx and pcie fixed go to rail #1 and the cpu to rail #2. Kinda tricky with two 295x2's.
> 
> There's a coolmax on hardforum.com for $120 + shipping.


It seems like a lot of the higher wattage psus on the market have weird rails. Any recommendations for a psu to run one 295x2, and maybe a 2nd one down the road?


----------



## Medusa666

Quote:


> Originally Posted by *stevenn102*
> 
> It seems like a lot of the higher wattage psus on the market have weird rails. Any recommendations for a psu for to run 1 295x2 and maybe a 2nd one down the road?


LEPA G1600 was great for me.


----------



## wermad

Quote:


> Originally Posted by *stevenn102*
> 
> It seems like a lot of the higher wattage psus on the market have weird rails. Any recommendations for a psu for to run 1 295x2 and maybe a 2nd one down the road?


There's a list in the op, or in my sig, that I put together.

The obvious choice is the evga 1600. Or two units in tandem if your case supports two (my case can do four or more). I've added the silverstone 1500w *gold* to my list, but the reviews show it's a decent unit, not too impressive.

Ideally for me, a V1000 for my main system and card #1 and a v850 for #2. Or evga 1000/1300 and a 750 (g2/p). I highly recommend the V1000 from CM. Had two powering quad 7970 lightnings before.


----------



## stevenn102

Quote:


> Originally Posted by *wermad*
> 
> There's a list in the op or in my sig i put together.
> 
> The obvious choice is the evga 1600. Or two units in tandem if your case supports two (my case can do four or more
> 
> 
> 
> 
> 
> 
> 
> ). I've added the silverstone 1500w *gold* to my list but the reviews show its a good unit but not too impressive.
> 
> Ideally for me, a V1000 for my main system and card #1 and a v850 for #2. Or evga 1000/1300 and a 750 (g2/p). I highly recommend the V1000 from CM. Had two powering quad 7970 lightnings before.


I'm leaning towards EVGA's 1600w psu. Having two would be nice, but my case doesn't fit two psus (Corsair Air 240).

Four cards is insane! I'm surprised they could handle that. I was also looking at Cooler Master PSUs. Any reason why the Corsair ones aren't recommended as often?


----------



## magicase

Managed to get a stable 1180/1600 with +100mV. That's 1MHz per 1mV.


----------



## wermad

Quote:


> Originally Posted by *stevenn102*
> 
> I'm leaning towards EVGA's 1600w psu. Having two would be nice but my case doesn't fit two psus (Corsair air 240).
> 
> That's insane for four! I'm surprised it could handle that. I was also looking at cooler master PSUs. Any reason why the corsair ones aren't recommended as often?


Corsair is still a great choice. Just read the specs and some reviews to ensure you're getting something that will play nice with Vesuvius.


----------



## stevenn102

I've read some reviews saying that the ax1500i doesn't play well with the R9 295x2, and others saying they've had no issues, so it gets confusing to know what to go with when there are mixed results. Then it's frustrating when you have a graphics card that you know is capable of so much, yet you can't get it to work right and can't pinpoint the cause.


----------



## wermad

Some of the earliest xfire owners did run them, but I was just lurking back then; the evga seems to be the more solid choice besides running two units. Think I'm gonna ride the lepa a bit more and see if it will hold with high-power-consumption games like metro 2033 and crysis 3.


----------



## stevenn102

You're running two of the PowerColor Devil13 295x2s, right? I'm sure those require A LOT of power!


----------



## wermad

Quote:


> Originally Posted by *stevenn102*
> 
> You're running two of the power devil 295x2s right? I'm sure those require A LOT of power!


I'm running two reference units. roaches and a few other members are running D13s. Since there's no full-cover block for those bad boys, I passed, despite the many times they were slightly less than the reference units @ newegg (open box or mir sales).

56k killah incoming:


----------



## stevenn102

Oh my!! That's amazing!! What kind of temps are you getting with that? I plan on putting two of them, when I get a second one, under water too but nowhere near like that. I'll be posting some pics of mine when it gets closer to being built but it'll go in a Parvum s2.0 case with the 360mm extension. I thought about the case labs s8 test bench but I've never been that much of a fan of horizontally positioned motherboards.


----------



## wermad

I had them in an X9, but with four monsta rads stuffed to the max and the lepa in there too, temps were almost stock. With the TX10-D and the stupidly overkill radiators I have (six monsta 480s and two 560s, 65x corsair fans), they stay under 50c on cool days and climb into the 50s on hot days in Southern Cali (90-100F days are rough). I used to run my fans at max in the X9 to cope better, but in the tx10D, with this absurd amount of radiator powah, I can leave them @ 40% power (the minimum the controller goes to) and they stay very cool.

I have a second system running stock air (amd athlon chip). I'm hoping this one gets a nice simple upgraded setup w/ custom water; I'll borrow one or two rads from the main system to run it.

With the TX10D, you can put another four-psu bracket in the top chamber, and since mine has an added pedestal, it can also do four more. That's a total of 12 psu's possible with standard placement (not including the many more you can custom mount in this little skyscraper).

Where's Megaman and his 2Kw Superflower...

(congrats again dude)


----------



## wermad

Sorry for the double post.

I'm no psu guru, but when I have an itch, I just gotta scratch it. So, I did some reading on the ax1500i. Please correct me if I'm wrong, thanks.

Ok... after some homework: the ax1500 is a single-rail unit with multiple virtual rails acting as ocp @ the pcie connectors. The single rail is limited to 125 amps, and in virtual rail mode it limits each pcie connector to 20 amps via ocp. Running stock with no Corsair Link, it defaults to the multi-rail virtual ocp mode. That could be why some ppl were having issues with the 295x2, especially in crossfire. So changing this in Corsair Link, as has been suggested several times, is the solution. OklahomaWolf pointed this out when he accidentally tripped ocp when his testing pushed 29 amps without corsair link.

After some further reading, it seems most multi-rail psu's are actually single rail with multiple "virtual rails" for ocp's sake. True multi-rail units end up being very big and costly to manufacture. From memory, the old HX1000 was said to be essentially two 500w psu's put into one unit, making it a good example of a true multi-rail unit. So, paraphrasing the many psu rail explanations out there: for safety, multiple virtual rails are implemented for ocp on top of a single rail.

I wish my Lepa had something like corsair link (or dsp) to see how much I'm drawing from each rail. I'm still a bit skeptical that the 30 amp rails are enough for a 295x2. I'll soon find out in the next few days as I start to push my rma replacement. Within the rubble of the mess I made troubleshooting this bad psu, I have to find my trusty kill-a-watt. If the rma unit does shut off, it could be that the card exceeds the 28 amps that amd recommends. It's strange, as power requirements are typically over-estimated by amd and nvidia, but the 295x2 could be a case of being underestimated. If so, a beefy unit like the EVGA 1600 or AX1500 (+ corsair link to adjust the ocp amp limit) is a must for crossfire, even if you don't need all that wattage (at stock for sure). Or go with two units in tandem if possible. As the card becomes more affordable, more and more ppl will run into this power conundrum of the 295x2 imho.

I've dabbled with power hungry setups before (4x 480s, 4x 580s, 4x 7970 Ltg.'s), but nothing that needed this specific amperage. The most I pulled was 1800w at the wall with the 480s oc'd and x79 oc'd. If you're pushing more than 1500w, check the breaker assigned to your outlet. I got lucky: I had an empty 20amp breaker I switched over to, and now I don't have issues with the common 15amp breaker tripping. On a side note, just got done helping the old man and his electrician friend put in a 240v outlet on a 50amp breaker to feed a 47amp arc welder my pop will be using soon. I did the shopping and some homework for my dad, and the electrician did the more delicate and dangerous parts (I'm not brave enough to install a breaker in the box). We also made an extension using 8awg cable ($100 just to make this from scratch!).

Alright, time to get some sleep and try some games tomorrow if I get a chance.
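Some back-of-the-envelope numbers for the ocp scenario above; the ~500W board-power figure is a rough assumption for illustration, not a measurement:

```python
# Sketch of why a 20A-per-connector virtual OCP limit can be marginal
# on a 295X2, while AMD's recommended 28A per connector has headroom.
board_power_w = 500          # assumed full-load draw of one 295X2
pcie_slot_w = 75             # spec power available from the PCIe slot
per_connector_w = (board_power_w - pcie_slot_w) / 2  # two 8-pin plugs
amps_per_connector = per_connector_w / 12.0          # on the 12V rail

print(round(amps_per_connector, 1))  # 17.7
# ~17.7A steady state per connector: under a 20A virtual-rail OCP cap,
# but transient spikes above 20A would trip it, which matches the
# 29A trip OklahomaWolf observed in testing.
```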


----------



## ENTERPRISE

More testing tonight with Gup on an improved / more unlocked 295X2 BIOS; we hope to have it released soon after some further testing.

Good thing I cannot flash my card to death from the sheer amount of flashing!


----------



## Sgt Bilko

Quote:


> Originally Posted by *ENTERPRISE*
> 
> More testing tonight with Gup on an improved 295X2/More Unlocked Bios, we hope to have it released soon after some further testing
> 
> 
> 
> 
> 
> 
> 
> Good thing I cannot flash my card to death just from the amount of flashing !


Whenever I get my RMA done with Corsair for my PSU, I might have to give it a shot.


----------



## ENTERPRISE

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> More testing tonight with Gup on an improved 295X2/More Unlocked Bios, we hope to have it released soon after some further testing
> 
> 
> 
> 
> 
> 
> 
> Good thing I cannot flash my card to death just from the amount of flashing !
> 
> 
> 
> Whenever i get my RMA with Corsair done for my PSU i might have to give it a shot
Click to expand...

I would! We have made some decent improvements and are just working on what is effectively the last main one, if we can nail it! All details of changes and improvements over the stock BIOS will be released once we have finalized it all!


----------



## Sgt Bilko

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> More testing tonight with Gup on an improved 295X2/More Unlocked Bios, we hope to have it released soon after some further testing
> 
> 
> 
> 
> 
> 
> 
> Good thing I cannot flash my card to death just from the amount of flashing !
> 
> 
> 
> Whenever i get my RMA with Corsair done for my PSU i might have to give it a shot
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I would ! We have made some decent improvements, just working on what is effectively the last main one if we can nail it ! All details of changes and improvements over Stock BIOS will be released once we have finalized it all !
Click to expand...

Sounds like a plan!

I haven't even shipped my PSU off yet, so we'll see... still doing the 20-questions thing with Corsair atm.


----------



## Dagamus NM

Quote:


> Originally Posted by *ENTERPRISE*
> 
> I would ! We have made some decent improvements, just working on what is effectively the last main one if we can nail it ! All details of changes and improvements over Stock BIOS will be released once we have finalized it all !


Well, that sure would be cool. When updating the bios on this card, is it done one core at a time, or do both get loaded at the same time?


----------



## stevenn102

Quote:


> Originally Posted by *wermad*
> 
> I had them in an X9 but with four monsta rads stuffed to the max and the lepa in there too, temps were almost stock. With the TX10-D and the stupidly overkill radiators i have (six monsta 480s and two 560s, 65x corsair fans), they stay under 50c on cool days, and climb into the 50s on hot days in Southern Cali (90-100F days are
> 
> 
> 
> 
> 
> 
> 
> ). I used to run my fans at max in the X9 to cope better, but in the tx10D and this absurd amount of radiator powah, I can leave them @ 40% power (minimum the controller goes to) and they stay very cool.
> 
> I have a second system running stock air (amd athlon chip). I'm hoping this one gets a nice simple upgraded setup w/ custom water and I'll borrow one or two rads from the main system to run this setup.
> 
> With the TX10D, you can put in another four psu bracket in the top chamber and since mine has an added pedestal, it can also do four more. Total of 12 psu's possible with standard placement (not including the many more you can custom mount in this little skyscraper
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> Where's Megaman and his 2Kw Superflower...
> 
> 
> 
> 
> 
> 
> 
> (congrats again dude
> 
> 
> 
> 
> 
> 
> 
> )


That is no doubt overkill for anything!! But who cares?? It's awesome!! I live in Phoenix, AZ, so the temps during the summer hit anywhere from 100-120. I'm lucky if it's under 100 by nightfall. Which is why I'll be water cooling very soon once I get the 295X2 working right, but nothing like that. I like smaller cases, so I'll just have a 360mm and a 240mm rad (30mm and 45mm thick respectively).

Why in the world would you need that many psu mounting points? I'm sure you can probably put two whole systems in there, right?

2Kw Superflower? Sounds too good to be true..


----------



## ENTERPRISE

Quote:


> Originally Posted by *Dagamus NM*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> I would ! We have made some decent improvements, just working on what is effectively the last main one if we can nail it ! All details of changes and improvements over Stock BIOS will be released once we have finalized it all !
> 
> 
> 
> Well that sure would be cool. When updating the bios on this card is it done one core at a time or do both get loaded at the same time?
Click to expand...

Each GPU essentially has its own BIOS. So you have a Master GPU and a Slave GPU, and as such you need to flash a Master BIOS & a Slave BIOS.

Sounds complicated, but it is not. Luckily I have created .bat files for easy flashing, which I will upload along with the final BIOS to make a nice, easy flashing package! All you will need is a spare USB drive, as you will be flashing through DOS; the package will also include a utility for creating a DOS-bootable USB.
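For readers curious what a master/slave flash like the one described above looks like, here is a hedged dry-run sketch. The atiflash adapter numbering (`-p 0` / `-p 1`) and the ROM filenames are assumptions for illustration only, not the contents of the actual package; always use the .bat files and ROMs from the release.

```shell
#!/bin/sh
# Dry-run sketch of flashing both GPUs on a 295X2 from DOS.
# Adapter indices and ROM names are hypothetical.
RUN="echo"   # set to "" to actually execute atiflash (do NOT dry-run a real flash half-way)

$RUN atiflash -f -p 0 master.rom   # flash adapter 0 (master GPU)
$RUN atiflash -f -p 1 slave.rom    # flash adapter 1 (slave GPU)
$RUN atiflash -i                   # list adapters to verify afterwards
```

Flashing the wrong ROM to the wrong GPU position is exactly the mistake the packaged .bat files are meant to prevent, which is why the prebuilt scripts are the safer route.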


----------



## NBrock

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Each GPU essentially has a BIOS. So you have a Master GPU and a Slave GPU. As such you need to flash a Master BIOS & a Slave BIOS.
> 
> Sounds complicated but it is not. Luckily I have created .Bat files for easy flashing which I will upload a long with the final BIOS to make a nice easy flashing package ! All you will need is a USB drive spare as you will be flashing through DOS, the package I will include will also include a utility for creating a DOS bootable USB


Wow Good Guy ENTERPRISE!

That's nice of you to include all that. I am looking forward to giving this a shot.


----------



## ENTERPRISE

Quote:


> Originally Posted by *NBrock*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Each GPU essentially has a BIOS. So you have a Master GPU and a Slave GPU. As such you need to flash a Master BIOS & a Slave BIOS.
> 
> Sounds complicated but it is not. Luckily I have created .Bat files for easy flashing which I will upload a long with the final BIOS to make a nice easy flashing package ! All you will need is a USB drive spare as you will be flashing through DOS, the package I will include will also include a utility for creating a DOS bootable USB
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow Good Guy ENTERPRISE!
> 
> That's nice of you to include all that. I am looking forward to giving this a shot.
Click to expand...

No problem


----------



## Ritter

You are in PHX also! It sure skews your outlook when your "room temp" is closer to 85F natively. The concept of air cooling any aggressive setup is laughable here.

Just joined the forum. Waiting for my 295x2 to arrive in the next several days. Thanks everyone for the tons of useful info!

_Ritter_


----------



## Faoust

Quote:


> Originally Posted by *stevenn102*
> 
> First off, forgive me because I'm new here so I'm sure there is a lot I need to learn.
> 
> I'm not sure anyone else is having issues like this but I recently purchased a Sapphire R9 295X2 and have been having a ton of issues with my computer either shutting off when starting almost any game or stuttering and artifacting like crazy. I was previously using two GTX 780s with 6GB of VRAM each and had no issues but I wanted to switch over to a single card with dual GPUs and have been eyeing the 295X2 for awhile now but just wanted to wait for prices to come down a little. I found one at a good price and grabbed; before I installed it, I restored my computer to ensure that there will be no leftovers of anything that could conflict with the new card (growing my Nvidia to AMD in particular). Once I installed the card I installed the latest drivers, currently 15.7.1, and had way to many issues to play any game as mentioned above. I uninstalled those drivers using DDU and installed the latest beta drivers, currently 15.6, and it cleared some of the issues, like stuttering in GTA V but in Far Cary 4 it either crashes my system entirely while trying to load it or artifacts too much to be playable. Unigene Heaven will also crash my system. And 9 times out 10 there is no error message or BSOD, just my computer turning off and turning back on. I know my PSU can handle this card and I've turned off OCP in the Corsair Link and that help a little but not nearly enough to make anything playable.
> 
> Any suggestions?
> 
> My build:
> i7-5930k - cooled with an H100i GTX
> Sapphire R9 295X2 - stock cooling
> 16 GB HyperX Predator DDR4-2133
> ASRock Fatal1ty X99M Killer
> Samsung 840 250 GB SSD
> Samsung 840 EVO 500 GB SSD
> WD Green 2 TB HDD
> Corsair AX1500i PSU
> OS: Windows 10 Pro 64-bit
> Monitor: LG 34UM95 (3440x1440)


Bottom line: that card needs to go back to Sapphire. If it's used, it's very broken. I pray you bought it new.


----------



## wermad

I'm up and running, everything put back into place. Running stock clocks across the board, I launched metro 2033 a few times and no shutting down. I'm surprised it's playable in 4k w/ the cpu at stock, but then again, at this res the gpu's are doing the brunt of the work. I'm still open to getting two psu's; I just have to shop around for some cheap deals. The fat c19 power cord doesn't play nice with my power strip and I need to fit the kill-a-watt, so no power readings until I can get a second power strip.


----------



## stevenn102

Quote:


> Originally Posted by *wermad*
> 
> I'm up and running, everything put back into place. Running stock clocks for all, I launched metro 2033 a few times and no shutting down. I'm surprised it can be playable in 4k w/ the cpu at stock, but then again, at this res, the gpu's are doing the grunt of the work. I'm still open to getting two psu's, just have to shop around for some cheap deals. The fat c19 power cord doesn't play nice with my power strip and I need to fit the kill-a-watt, so no power readings until I can get a second power strip.


Glad to hear you got everything up and running! With how big your case is, you might as well get two PSUs just to be on the safe side.

I think I got most of my issues resolved; installing the 14.12 omega drivers seems to have done the trick for now. I'd like to be using the latest drivers, but those are giving me far too many issues to make Far Cry 4 and Witcher 3 playable.

Still waiting for the Lepa G1600 to come in; it should be here by Thursday. We will see if that takes care of the rest of the issues or improves stability.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> I'm up and running, everything put back into place. Running stock clocks for all, I launched metro 2033 a few times and no shutting down. I'm surprised it can be playable in 4k w/ the cpu at stock, but then again, at this res, the gpu's are doing the grunt of the work. I'm still open to getting two psu's, just have to shop around for some cheap deals. The fat c19 power cord doesn't play nice with my power strip and I need to fit the kill-a-watt, so no power readings until I can get a second power strip.


Don't burn your house down. It sucks that 1.6kW UPSs go for more than one of these cards costs. I have one on my 5960X build, but my older x79s get no such love.


----------



## stevenn102

That's insane that they cost more than the cards themselves. Oh well.

Haha! The older ones never get the same love as the newer! At least they still get use, though.


----------



## kayan

I'm once again having some issues with my 295x2, everything stock. While playing BF4 my FPS is steady, hovering around 90, and then all of a sudden it tanks down to about 30. The screen then turns completely white, except for a mouse cursor, and I crash to desktop. Origin gives me a base proxy error. When I eventually get back into a game it does the same thing every 3-25 minutes. CPU temps are fine, around 45-50C, so that isn't the issue.

While playing Star Wars: The Old Republic, I usually hover around 50fps (everything maxed), and then it tanks down into the teens for about 5-10 seconds and then shoots back up to 50.

Last night I walked away from it for a couple of hours, came back, and the screen was completely black with just a white mouse cursor showing; I had to do a hard reboot on my PC because nothing was responding.

I swear there is something wrong with this card, but I'm getting the runaround from the manufacturer, who blames my CPU and PSU. I am on the newest non-beta drivers for Windows 10. Yes, I had issues in 8.1 as well.

What do you guys think?


----------



## Dagamus NM

Driver crashing? What are your GPU temps?


----------



## kayan

Quote:


> Originally Posted by *Dagamus NM*
> 
> Driver crashing? What are your GPU temps?


Temps are fine, and nope I'm not getting a driver crash error.


----------



## wermad

Quote:


> Originally Posted by *stevenn102*
> 
> Glad to hear to got everything up and running! With how big your case is, might as well get two PSUs just to be on the safe side.
> 
> I think I got most of my issues resolved; installing 14.12 omega drivers seemed to have done the trick for now. I'd like to be using the latest drivers but those are giving me far too many issues to make Far Cry 4 and Witcher 3 playable.
> 
> Still waiting for the Lepa G1600 to come in, should be by Thursday. We will see if that takes care of the rest of the issues or increases stability.


Just remember to keep that rail distribution handy. Edit: it could be an old mining card, so make sure you get a receipt from the seller for warranty (three years through Lepa/Enermax) and give it a heavy-duty dusting: compressor, compressed-air cans, DataVac, etc.
Quote:


> Originally Posted by *Dagamus NM*
> 
> Don't burn your house down. It sucks that 1.6kW UPSs go for more than one of these cards cost. I have one on my running 5960X build, but my older x79s get no such love.


I'm good, and I've done some pretty hefty (but not insane) hardware pushing. I've hit 1800 W before, so a 295x2 should be much easier. I've just been using an off-the-shelf strip with a 2 kW max trip; with the second system and all the peripherals I run, I need to move the C19 cord as it's too bulky.
Quote:


> Originally Posted by *stevenn102*
> 
> That's insane that they cost more than the cards themselves. Oh well.
> 
> Haha! The older never get the same love as the newer! At least they still get use though.


Lol, true true. I got the second system up and running after some fiddling with the hardware and BIOS. I was hoping it would fail, to give the boss the excuse to upgrade this one, but I was more thrilled to see my little one get happy she could use a puter. She's got some tutorials to do online to help her stay sharp during the many break periods (this school district doesn't have long summer breaks, more like quarters tbh). This is just an OEM board put into the large second chamber of my case. It looks very small and outdated, but I was able to shoehorn in the HDD bracket and cage that came with my case.

Gonna put the little ones to bed soon and then I can fire up some games. Itching to try Alien: Isolation as it's been sitting in my Steam library for a few months now.


----------



## stevenn102

Quote:


> Originally Posted by *wermad*
> 
> Just remember to keep that rail distribution handy
> 
> 
> 
> 
> 
> 
> 
> 
> I'm good and I've done some pretty hefty but not insane hardware pushing. I've hit 1800 watts before so 295x2 should be much easier. I've just been using an ots strip with 2kw max trip, with the second system and all the peripherials I run, I need to move the c19 cord as its too bulky.
> Lol, true true. I got the second system up and running after some fiddling with the hardware and bios. I was hoping it would fail to give the boss the excuse to upgrade this one but I was more thrilled to see my little one get happy she could use a puter. She's got some tutorials to do online to help her stay sharp during the many break periods (this school district doesn't have long summer breaks, more like quarters tbh). This is just an oem put into the large, second chamber of my case. Looks very small and outdated but I was able to shoe in the hdd bracket and cage that come with my case.
> 
> Gonna put the little ones to bed soon and then i can fire up some games. Itching to try Alien Isolation as its been sitting in my steam box for a few months now.


As long as it works that's all that matters! Aesthetics and stuff like that come later. I haven't tried alien isolation yet with the 295x2; I'll be playing that probably Monday night when I get back from being out of town. Like you, it sat in my library for months before I started playing it, like most of my games. Oh well though, they're there whenever I decide to start playing them. I just don't like starting multiple games and not finishing them. I'll play one and beat it then move on to the next game.


----------



## Bluefenix

Hello!!!
We want to join your club!!! I know it's not the best build, but I did what I could. You'll never find any good components in my country (the best graphics card on sale here is a PNY R9 280 for $420), so I had to invest a lot in shipping.


Spoiler: Warning: Spoiler!


----------



## stevenn102

Quote:


> Originally Posted by *kayan*
> 
> I'm once again having some issues with my 295x2. Everything stock. While playing BF4 my FPS is steady, hovering around 90, and then all of a sudden it tanks down to about 30. The Screen then turns completely white, except a mouse cursor and I crash to desktop. Origin gives me a base proxy error. When I eventually get back in a game it does the same thing every 3-25 minutes. CPU temps are fine around 45-50C, so that isn't the issue.
> 
> While playing Star Wars the Old Republic, I usually hover around 50fps (everything maxed), and then is tanks down into the teens for about 5-10 seconds and then shoots back up to 50.
> 
> Last night I walked away from it for a couple of hours came back and the screen was completely black, with just a white mouse cursor showing, and I had to do a hard reboot on my PC because nothing was responding.
> 
> I swear there is something wrong with this card, but I'm getting the run around from the manufacturer blaming my CPU and PSU. I am on the newest non-beta drivers for Windows 10. Yes, I had issues in 8.1 as well.
> 
> What do you guys think?


What resolution are you running at and what CPU and PSU do you have?

I recently got a 295x2 and had all kinds of issues on the 15.7 drivers and on the 15.6 beta drivers. I installed the 14.12 omega drivers and everything started working. Maybe give that a try and see if that helps.


----------



## kayan

I've contacted the manufacturer again, maybe this time they'll look at it. One can hope, right?

Does anyone else have any ideas of what could be going on with my card (issue was posted 6 posts ago)?


----------



## kayan

Quote:


> Originally Posted by *stevenn102*
> 
> What resolution are you running at and what CPU and PSU do you have?
> 
> I recently got a 295x2 and had all kinds of issues on the 15.7 drivers and on the 15.6 beta drivers. I installed the 14.12 omega drivers and everything started working. Maybe give that a try and see if that helps.


Whoops, somehow I missed your post.

I'm at 3440x1440, and I have a 5820K @ 4.5 GHz. Temps aren't an issue, at least not according to monitoring software.


----------



## stevenn102

Have you verified that your power supply can supply at least 28 Amps on the 12v rails you're using for the card?


----------



## wermad

The V1000 supplies 83 A on its single 12 V rail, enough for a single 295x2 (per AMD, the recommended spec is 50 A combined on a single rail). Seems more like a drivers/software or GPU issue. I benched with a single card using this model.

@kayan, which drivers have you tried? have you tried other games btw or some benchmarks for testing?
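For anyone sanity-checking their own PSU, the rail math above can be sketched in a few lines. This is a rough illustration using the figures quoted in this thread (the V1000's 83 A rail, AMD's ~50 A combined recommendation, the OCZ Z850's 71 A mentioned later); it's no substitute for reading your PSU's label.

```python
# Back-of-envelope 12 V budget check for an R9 295X2, using numbers
# quoted in this thread. Illustrative only.

CARD_12V_AMPS = 50   # AMD's recommended combined 12 V amperage per card
VOLTS = 12.0

def headroom(rail_amps: float, cards: int = 1) -> float:
    """Watts left on the 12 V rail after the card(s)' recommended draw."""
    return (rail_amps - cards * CARD_12V_AMPS) * VOLTS

print(headroom(83))      # V1000, one card -> 396.0 W spare
print(headroom(71))      # OCZ Z850, one card -> 252.0 W spare
print(headroom(83, 2))   # one V1000 for quadfire goes negative
```

A negative result doesn't mean instant failure, only that you're outside the recommended spec, which is exactly why the quadfire builds in this thread run 1.3-1.6 kW units.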


----------



## kayan

Quote:


> Originally Posted by *stevenn102*
> 
> Have you verified that your power supply can supply at least 28 Amps on the 12v rails you're using for the card?


I'm good on PSU, I've got a v1000.

Quote:


> Originally Posted by *wermad*
> 
> V1000 supplies 83amps, enough for a single 295x2. seems more like drivers/software or gpu. on a single rail, the recommended spec is 50 for combined per amd. I benched with a single card using this model.
> 
> @kayan, which drivers have you tried? have you tried other games btw or some benchmarks for testing?


I've tried all the drivers, certified and beta, since Omega.

In Witcher 3 I still get mad flickering, and performance has dropped from 50-60 FPS in Crossfire down to 40-ish, which is only a bit more than I had before Crossfire support.

Diablo 3 chugs down to low frames too: teens and 20s.

I should have better performance, regardless of xfire support. Especially in those older games.


----------



## Dagamus NM

Sounds like something is throttling. Run Display Driver Uninstaller and reinstall the driver that worked. If that doesn't work, then I suspect a fried capacitor or VRM phase on the back of your board.

If nothing else works, it might be worth your time to pop off your backplate and take a look. I've had this happen to me more than once.


----------



## kayan

Quote:


> Originally Posted by *Dagamus NM*
> 
> Sounds like something is throttling. Run driver uninstaller and reinstall your driver that worked. If that doesn't work then I suspect a fried capacitor or VRM phase on the back of your board.
> 
> If nothing else works, it might be worth your time to pop off your backplate and take a look. I've had this happen to me more than once.


I ran DDU this morning before work. Still no dice.


----------



## wermad

What's your gpu cores load?


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> What's your gpu cores load?


I'll check later, am working now. I'll check when I get off.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> What's your gpu cores load?


Core and memory clocks are as they should be: 1018/1250 on each.
GPU temp 1 = 69, #2 = 67
Load on both cores was 100%.

To note, my FPS was consistently 10 higher than last night; I don't know what's going on. This was in BF4.


----------



## wermad

Downloading right now, though I only do the SP campaign tbh. I'm on 15.7, Win7 though (haven't bothered with the Win10 upgrade tbh).

You have any other program running in the background that could affect it?


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Downloading right now, though I only do the sp campaign tbh. i'm on 15.7, win7 though (haven't bothered w/ win10 upgrade tbh).
> 
> You have any other program running in the background that could affect it?


Nothing in the background that is different. Some days it works great, other days it's crap. This GPU is possessed.


----------



## wermad

Weird, and with full load too; still downloading.

I found my GPU usage erratic in Metro 2033 for some odd reason, with a lot of stutter (stock CPU). Unchecked "Disable ULPS" in AB, restarted (per AB), verified it via Trixx (enabled "start Trixx with Windows"), and it was back to normal.


----------



## Dagamus NM

Is there any difference between disabling ULPS via the software checkbox and disabling it via regedit?


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Weird and with full load; still downloading
> 
> 
> 
> 
> 
> 
> 
> .
> 
> found my gpu usage erratic in metro 2033 for some odd reason with a lot of stutter (stock cpu). Unchecked disable ulps in ab, restart (per ab), check it via Trixx (enabled start Trixx with Windows), back to normal


I'm not sure if this helps much, but here's a screenie of GPU1 in GPU-Z (GPU2 isn't being used, but my old 7970 plays SW:ToR better than this card does :/)



Those points where the load on the GPU drops to single digits are where my frames also drop into the teens.
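For anyone trying to pin down dips like this, GPU-Z can also write its sensors to a log file instead of just graphing them. Here's a minimal sketch that scans such a log for load drops; the column name `GPU Load [%]` and the inline sample data are assumptions, so adjust them to match your actual log header.

```python
import csv
from io import StringIO

def find_load_dips(rows, column="GPU Load [%]", floor=10):
    """Return (index, load) pairs where logged GPU load falls below `floor` %.

    `column` is assumed to match the GPU-Z sensor-log header; check your
    own log file and adjust it if needed.
    """
    dips = []
    for i, row in enumerate(rows):
        load = float(row[column])
        if load < floor:
            dips.append((i, load))
    return dips

# Tiny inline sample standing in for a real GPU-Z log
sample = StringIO("GPU Load [%]\n99\n98\n7\n4\n97\n")
print(find_load_dips(csv.DictReader(sample)))   # -> [(2, 7.0), (3, 4.0)]
```

Timestamps in the same log rows then let you line the dips up against what was happening in-game.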


----------



## Dagamus NM

Do you have the gpuz shot from core 2?

The only thing I see is a corresponding dip in current/power that matches GPU usage. The usage drop is probably the reason your frame rate is dropping, but you might toy with your current/power limit.


----------



## ENTERPRISE

Quote:


> Originally Posted by *Dagamus NM*
> 
> Is there any difference in disabling ULPS via software checkbox and disabling via regedit?


No, the checkbox simply edits the registry to disable it, so you can either use the software or delve into the registry yourself; same result either way.
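For reference, both routes touch the same setting: a DWORD named `EnableUlps` under the display-adapter class key. A sketch of the manual edit as a .reg fragment is below; the `0000` instance number is an assumption, since a 295X2 typically exposes several instances (one per GPU core), so check each numbered subkey.

```
Windows Registry Editor Version 5.00

; Instance key "0000" is illustrative -- enumerate 0000, 0001, ... and
; apply to every instance that has an EnableUlps value (one per GPU core).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed either way before the change takes effect, which is why AB prompts for a restart after toggling the checkbox.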


----------



## kayan

Quote:


> Originally Posted by *Dagamus NM*
> 
> Do you have the gpuz shot from core 2?
> 
> The only thing I see is a corresponding dip in current/power that matches GPU usage. GPU usage dropping is probably to reason it is dropping but you might toy with your current


Core 2 is not showing any usage at all, but I play tor in windowed mode, so it only uses 1 GPU core anyway.

Now while playing BF4 it has the same peaks and valleys, but usage doesn't drop below 40% on either core.

How can I adjust current?


----------



## ENTERPRISE

Hey guys,

I'm in need of a stock XFX BIOS, one for the master and one for the slave. Can anyone supply me with that?

Cheers.

E


----------



## Sgt Bilko

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Hey guys,
> 
> I need of a Stock XFX BIOS, 1 for Master and 1 for Slave. Can anyone supply me with that ? '
> 
> Cheers.
> E


I can when I finish work in about 9 hrs' time, if someone else doesn't provide it before then.


----------



## ENTERPRISE

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hey guys,
> 
> I need of a Stock XFX BIOS, 1 for Master and 1 for Slave. Can anyone supply me with that ? '
> 
> Cheers.
> E
> 
> 
> 
> I can when i finish work in about 9hrs time if someone else doesnt provide it before then
Click to expand...

If nobody beats you to it then thanks !


----------



## mortenv

Quote:


> Originally Posted by *ENTERPRISE*
> 
> If nobody beats you to it then thanks !


https://mega.nz/#!idJGRbwY!qJwoZf6Mqp4JeTFT6fD6a4wFaYxBaQlt-ac-VjSravk


----------



## Alex132

Here is my stock XFX Bios:

295X2XFXBIOS.zip 74k .zip file


Also there is this (same ID as my BIOS it seems): http://www.techpowerup.com/vgabios/163128/xfx-r9295x2-4096-140311.html

It really doesn't make a difference which 295X2 BIOS you use; this one doesn't even show that it's XFX.


----------



## ENTERPRISE

Quote:


> Originally Posted by *mortenv*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> If nobody beats you to it then thanks !
> 
> 
> 
> https://mega.nz/#!idJGRbwY!qJwoZf6Mqp4JeTFT6fD6a4wFaYxBaQlt-ac-VjSravk
Click to expand...

Brilliant thanks !


----------



## JohnnyMoore

Hi all, I have flickering blood splatters on the camera in Dying Light, and the FPS locks at 30 with V-sync. Has anyone resolved this issue?


----------



## Dagamus NM

Dying light plays just fine with the current drivers. Part of the flickering is actually supposed to be that way. Like coming out of water or when you have blood in your eyes.

If your character has been sitting on a roof for a minute or so just staring at the sky and you still get flickering then you need to run DDU and reinstall drivers.

Why do you have it locked at 30fps?


----------



## JohnnyMoore

Quote:


> Originally Posted by *Dagamus NM*
> 
> Dying light plays just fine with the current drivers. Part of the flickering is actually supposed to be that way. Like coming out of water or when you have blood in your eyes.
> 
> If your character has been sitting on a roof for a minute or so just staring at the sky and you still get flickering then you need to run DDU and reinstall drivers.
> 
> Why do you have it locked at 30fps?


I don't know why; if the FPS is about 60+ it stays locked at 60, and when lower, at 30. In Crossfire I have flickering ambient sunlight and blood splatters on the screen. I'm on the 15.7.1 driver and the latest game version. It's not supposed to be that way; when running a single card I see the difference.


----------



## gpuKiller

Quote:


> Originally Posted by *Alex132*
> 
> Here is my stock XFX Bios:
> 
> 295X2XFXBIOS.zip 74k .zip file
> 
> 
> Also there is this (same ID as my BIOS it seems): http://www.techpowerup.com/vgabios/163128/xfx-r9295x2-4096-140311.html
> 
> It really doesn't make a difference what 295X2 BIOS you use, this one doesn't even show that it's XFX.


subvendor ATI LOL


----------



## Alex132

Quote:


> Originally Posted by *gpuKiller*
> 
> subvendor ATI LOL


Subvendor ID 1002 has always been ATI, so GPU-Z probably just keeps reporting it that way rather than as AMD.
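The background here is that 0x1002 is ATI's original PCI vendor ID, which AMD kept after the acquisition, so tools that map IDs to names still print "ATI". A tiny illustrative lookup (the table below is just a few well-known IDs, nothing exhaustive):

```python
# Well-known PCI vendor IDs. 0x1002 predates the AMD acquisition, which
# is why GPU-Z and similar tools still label it "ATI".
PCI_VENDORS = {
    0x1002: "ATI/AMD",
    0x10DE: "NVIDIA",
    0x8086: "Intel",
}

def vendor_name(vid: int) -> str:
    """Map a PCI vendor ID to a name, falling back to the raw hex value."""
    return PCI_VENDORS.get(vid, f"unknown (0x{vid:04x})")

print(vendor_name(0x1002))   # -> ATI/AMD
```

The subvendor field in a GPU BIOS uses the same ID space, which is why a stock AMD-built 295X2 shows 1002 there regardless of whose sticker is on the box.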


----------



## gupsterg

The manufacturer can be set in the ROM ...



Some apps don't display this ...


----------



## SAFX

Quote:


> Originally Posted by *magicase*
> 
> Managed to get a stable 1180/1600 with +100mv. 1mhz per 1mv


nice!


----------



## MIGhunter

Anyone know of any tricks to get better FPS with windows 10? I went from 110-145 fps down to 30-75 in Archeage. I used DDU to wipe out everything AMD related and went to their site and manually picked the 15.7.1 driver and it didn't help any.


----------



## mortenv

I've heard Process Lasso works.


----------



## MIGhunter

I also noticed that sometimes, the graphics glitch out on normal stuff. Like my Google Chrome will have some stuff show through it, then I move it and it fixes itself.


----------



## Ritter

Quote:


> Anyone know of any tricks to get better FPS with windows 10? I went from 110-145 fps down to 30-75 in Archeage. I used DDU to wipe out everything AMD related and went to their site and manually picked the 15.7.1 driver and it didn't help any.


I just received and installed my card last night (used, EK waterblock, custom waterloop) and it tests fine with my Win 7 install.
My previous attempts with Win10 and crossfire 6950's was terrible. Black screen on driver install stuff.

Is the install straightforward for Win10 now, or is there a trick to make it install acceptable drivers for this card? Assume clean slate install.

Thanks in advance!

_Ritter_


----------



## Intelligents

Quote:


> Originally Posted by *Ritter*
> 
> I just received and installed my card last night (used, EK waterblock, custom waterloop) and it tests fine with my Win 7 install.
> My previous attempts with Win10 and crossfire 6950's was terrible. Black screen on driver install stuff.
> 
> Is the install straightforward for Win10 now, or is there a trick to make it install acceptable drivers for this card? Assume clean slate install.
> 
> Thanks in advance!
> 
> _Ritter_


No tricks. Just go to AMD's website and download the latest drivers. They work just fine.


----------



## JohnnyMoore

Quote:


> Originally Posted by *MIGhunter*
> 
> I also noticed that sometimes, the graphics glitch out on normal stuff. Like my Google Chrome will have some stuff show through it, then I move it and it fixes itself.


I had this graphics glitch when my card was dying...


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> I'm not sure if this much but it's a screenie of GPU1 in GPU-Z (GPU2 isn't being used, but my old 7970 plays SW: tOR better than this card does :/)
> 
> 
> 
> Those points where the load on the GPU drops to single digits is where my frames also drop into the teens.


Sorry about the wait dude, been a bit sick lately and I just fired up bf4 this morning. Here's my usage (stock cpu and gpu clocks, quad fire, settings: auto/high):


----------



## MIGhunter

Quote:


> Originally Posted by *Ritter*
> 
> I just received and installed my card last night (used, EK waterblock, custom waterloop) and it tests fine with my Win 7 install.
> My previous attempts with Win10 and crossfire 6950's was terrible. Black screen on driver install stuff.
> 
> Is the install straightforward for Win10 now, or is there a trick to make it install acceptable drivers for this card? Assume clean slate install.
> 
> Thanks in advance!
> 
> _Ritter_


I had that problem the first time. Second time I did a fresh install of windows 7. No issues with windows 10 now other than low fps with my card. I am thinking about doing a fresh win 10 install just to see if that matters.
Quote:


> Originally Posted by *JohnnyMoore*
> 
> I had this graphics glitch when my card was dying...


Nah, it's a brand new card and no issues anywhere else or on windows 7.


----------



## Dagamus NM

Quote:


> Originally Posted by *MIGhunter*
> 
> I had that problem the first time. Second time I did a fresh install of windows 7. No issues with windows 10 now other than low fps with my card. I am thinking about doing a fresh win 10 install just to see if that matters.
> Nah, it's a brand new card and no issues anywhere else or on windows 7.


You got a new win 10 license? Your 7 key won't work for a fresh 10 install. I had issues on one of my amd builds so I decided to just go straight win 10. Had to buy a new key, then got a second as I am sure I will need it sooner or later.

I get artifacts on my quad 280X build with Windows 10. Sometimes when I open a new page or bring up a program I get some horizontal tearing on the middle 1/2" of my 25" Acer 1440p monitors. It is annoying but only seems to do it under those conditions. It is strictly a work computer, so I can live with it. It is connected to the wifi at my work, and certain downloads just don't happen; I tried to get MATLAB several times today and never made it past 1%.

I digress, but there are more annoying things than these artifacts. I also had some issues trying to install catalyst 15.7.1 but went back in and installed the chipset drivers and those for the cpu (also catalyst, but different) and then I had no problem with the gpu drivers other than a little tearing.


----------



## kayan

Quote:


> Originally Posted by *wermad*
> 
> Sorry about the wait dude, been a bit sick lately and I just fired up bf4 this morning. Here's my usage (stock cpu and gpu clocks, quad fire, settings: auto/high):


Thanks for that. Looks pretty similar to my usage. I did another driver reinstall in W10 and things seem to be a bit better, though SW:ToR still runs like poop. I really want to wait until the next gen, HBM2, to get a new card, but I need something solid, and I can't help thinking that if I didn't have Crossfire I wouldn't have some of the issues I currently have.

Off topic, sorta: is my wife's OCZ Z850 good enough to swap my 295x2 into for a test?


----------



## MIGhunter

Quote:


> Originally Posted by *Dagamus NM*
> 
> You got a new win 10 license? Your 7 key won't work for a fresh 10 install. I had issues on one of my amd builds so I decided to just go straight win 10. Had to buy a new key, then got a second as I am sure I will need it sooner or later.
> 
> I get artifacts on my quad 280x build with windows 10. Sometimes when I open a new page or bring up a program I get some horizontal tearing on the middle 1/2" of my 25" acer 1440p monitors. It is annoying but only seems to do it under those conditions. It is strictly a work computer so I can live with it. It is connected to the wifi at my work and certain downloads just don't happen, tried to get mat lab several times today and never made it past 1%.
> 
> I digress, but there are more annoying things than these artifacts. I also had some issues trying to install catalyst 15.7.1 but went back in and installed the chipset drivers and those for the cpu (also catalyst, but different) and then I had no problem with the gpu drivers other than a little tearing.


You don't need a new 10 license. You have to update from 7 to 10. Then pull the new Key from your 10 install. There are programs that do it for free, just google those. Then you can reinstall 10 fresh with the download from Microsoft.

Here's a pretty good link with all that info and some internal links on what to do and where to go: http://www.windowscentral.com/windows-10-clean-install-vs-upgrade

The clean install of windows 7 and then upgrade to 10 fixed my issues with catalyst 15.7.1. I had a black screen I couldn't get out of. Have you tried to use Display Driver uninstaller? If not, use it, do it in safe mode like it says. Reboot, install the 15.7.1 driver from that reboot. See how that works for you.


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> Thanks for that. Looks pretty similar to my usage. I did another driver reinstall in W10 and things seem to be a bit better. Though SW tOR still runs like poop. I really want to wait until the next gen, HBM2, to get a new card, but I need solid, and I can't help but think if I didn't have xfire I wouldn't have some of the issues I currently have.
> 
> Off topic, sorta, is my wife's OCZ z850 good to swap my 295 into to give it a test?


Should be fine; it's got 71 amps. I got through most of the single-player campaign last night with stock CPU clocks.


----------



## Dagamus NM

Quote:


> Originally Posted by *MIGhunter*
> 
> You don't need a new 10 license. You have to update from 7 to 10. Then pull the new Key from your 10 install. There are programs that do it for free, just google those. Then you can reinstall 10 fresh with the download from Microsoft.
> 
> Here's a pretty good link with all that info and some internal links on what to do and where to go: http://www.windowscentral.com/windows-10-clean-install-vs-upgrade
> 
> The clean install of windows 7 and then upgrade to 10 fixed my issues with catalyst 15.7.1. I had a black screen I couldn't get out of. Have you tried to use Display Driver uninstaller? If not, use it, do it in safe mode like it says. Reboot, install the 15.7.1 driver from that reboot. See how that works for you.


Ok, but that isn't really a clean Windows 10 install. You still end up with a Windows.old folder and other potential junk.

I had an issue where windows 7 refused to load on a machine that would accept windows 10 just fine. That is why I needed licenses. I used the update tool for downloading 10 on my machines with win 7.

So if you want a clean Win 10 install, use the same tool you referenced to create an ISO thumb drive, load Win 10, and enter a key within 30 days.


----------



## MIGhunter

Quote:


> Originally Posted by *Dagamus NM*
> 
> Ok, but that isn't really a clean windows 10 install. You still end up with windows old folders and other potential junk.
> 
> I had an issue where windows 7 refused to load on a machine that would accept windows 10 just fine. That is why I needed licenses. I used the update tool for downloading 10 on my machines with win 7.
> 
> So if you want a clean win 10 install you will use the same tool you referenced to create an ISO thumb drive. Load win 10 and get a key within 30 days.


No, you are missing what I am saying.

You upgrade from win7 to 10. Then you pull the 10 key from the system. Then you format your HDD and do a fresh install with the thumbdrive or ISO you created using that win10 key.


----------



## Dagamus NM

Ok, gotcha. In that case you are right: no key purchase necessary, just time involved.

Anyhow, I needed a key as installing 7 was not an option at the time; that machine just wouldn't take it.


----------



## BigCatRoach

So I'm having some issues. I just got a Sapphire 295X2 last week. My CPU is an Intel i7-4820K and I have 32 GB of RAM (overkill, I know, but I'm borrowing it from work, so why not). Neither my CPU nor GPU is overclocked at the moment. I had it in 3-way Crossfire with my 290X, but when I realized the problems I turned that off for the time being.
Idk if it helps, but my 3DMark Fire Strike score is a little below average for a 295X2 (I think, after a quick search) at 14500, give or take.

I am having issues with FPS in games. I'm currently testing with Witcher 3, since that is what I have been playing most lately. At 1080p it's okay: 50-ish FPS with settings all on high to ultra, FreeSync currently off. When I bump it up to 4K and turn the settings all to high, I'm getting <15 FPS at all times, with drops all the way down to 7. That is with DSR, as I do not have my 4K monitor yet; I'm wondering if this could be the issue, but without an actual 4K monitor I can't test. I have also compared and matched settings to videos where people are getting a solid, steady 45 FPS, but still the same result. I also matched settings from 1080p benchmark videos: still 50-ish FPS while others are getting 60+.
The other game I tested, just because it is older, is Assassin's Creed III: still <30 FPS at 4K on high settings, with dips all the way down to 20.
To be honest, I think I had much steadier FPS with my single R9 290X: 35-40 FPS in 4K Witcher 3 on medium-to-high settings, and a solid 60 at 1080p with somewhat higher settings than at 4K. So I'm wondering if it just doesn't like Crossfire for some reason.
Any help would be greatly appreciated; I can provide other info if needed.


----------



## kayan

@xer0h0ur @Dagamus NM @wermad

Anyone else is welcome to see and comment. I'm really at a loss for what to do. I've reinstalled drivers multiple times via DDU, uninstall via CCC, install over existing drivers, everything from Omega to current 15.7.1. This type of thing is occurring in W8.1 and in W10, both 64bit.

I linked everyone who has tried to help me with my issue with my 295x2. I tried another game that I haven't played in a while (Far Cry 4), and within 1 minute of loading in the following happened. Please watch the youtube links (as I uploaded from fraps) and give me some suggestions.






I stupidly quit recording when I was respawning, but started recording a couple of seconds later:






Thanks in advance.


----------



## MIGhunter

Quote:


> Originally Posted by *kayan*
> 
> @xer0h0ur @Dagamus NM@wermad
> 
> Anyone else is welcome to see and comment. I'm really at a loss for what to do. I've reinstalled drivers multiple times via DDU, uninstall via CCC, install over existing drivers, everything from Omega to current 15.7.1. This type of thing is occurring in W8.1 and in W10, both 64bit.
> 
> I linked everyone who has tried to help me with my issue with my 295x2. I tried another game that I haven't played in a while (Far Cry 4), and within 1 minute of loading in the following happened. Please watch the youtube links (as I uploaded from fraps) and give me some suggestions.
> 
> 
> 
> 
> 
> 
> I stupidly quit recording when I was respawning, but started recording a couple of seconds later:
> 
> 
> 
> 
> 
> 
> Thanks in advance.


This is on your sig rig?

Have you tried to use DDU to uninstall your driver and then before installing new ones, use window's drivers and see if you get the same thing? Also, have you checked your BIOS to make sure there isn't a conflict between your onboard video and your 295x2? I have seen some threads where that is an issue.

[edit]
Also, double-check your drivers for DX11, .NET, and your monitor. I know some monitors use that silly Highlight feature. A far stretch, but better than assuming it's the video card.


----------



## kayan

Quote:


> Originally Posted by *MIGhunter*
> 
> This is on your sig rig?
> 
> Have you tried to use DDU to uninstall your driver and then before installing new ones, use window's drivers and see if you get the same thing? Also, have you checked your BIOS to make sure there isn't a conflict between your onboard video and your 295x2? I have seen some threads where that is an issue.


Yes, this is my signature rig. Yes to DDU and new drivers. There is no on board GPU on my 5820k, but there are also no conflicts, according to device manager.


----------



## MIGhunter

Quote:


> Originally Posted by *kayan*
> 
> Yes, this is my signature rig. Yes to DDU and new drivers. There is no on board GPU on my 5820k, but there are also no conflicts, according to device manager.


My edit was probably during your post, check those things as well


----------



## kayan

Quote:


> Originally Posted by *MIGhunter*
> 
> My edit was probably during your post, check those things as well


DirectX was reinstalled about a week ago or so, but I had problems before and since then. Same with .NET. The monitor works fine on the desktop and in movies, so I doubt it's that.

And honestly I'd prefer the GPU is bad. The monitor was way more expensive, lol.


----------



## MIGhunter

Quote:


> Originally Posted by *kayan*
> 
> DirectX was reinstalled about a week ago,or so, but had problems before and since then. Same with .net. Monitor works fine on desktop and movies, so I doubt it is that.
> 
> And honestly I'd prefer the GPU is bad. The monitor was way more expensive, lol.


Do you have a back up card? If so, have you tried it in your system and see what it does?


----------



## kayan

Quote:


> Originally Posted by *MIGhunter*
> 
> Do you have a back up card? If so, have you tried it in your system and see what it does?


I installed my wife's 290x into my system and took another short video from the exact same place in FC4. Game runs better, more FPS and smoother. Absolutely zero artifacts.






I then loaded up SW: tOR (one of the other games in question), and no more random frame drops or stuttering.

Edit: Also loaded up BF4 and I had none of the slowdowns either.

After reinstalling my 295x2, check out my pink fallout trees.....



Taken the next second...


----------



## drm8627

Hey guys, I was just wondering: is there a good AIO replacement for the stock cooler on these 295X2 GPUs? I don't plan on overclocking, but I do want to run it as cold as possible, even if it is a little expensive.

I'm also considering a custom loop just for the GPU, but if I do that I'd want to make it overpowered, so I can get the temps as low as possible.

Any suggestions?

I have an H110 sitting around, and I was wondering what kind of temps I'd get if I used the NZXT adapter kit (CPU AIO on a GPU). Thanks ahead of time for any suggestions!


----------



## Sgt Bilko

Quote:


> Originally Posted by *drm8627*
> 
> hey guys, i was just wondering , is there a good AIO replacement for the stock cooler for these 295x2 gpus? I dont plan on overclocking, but i do want to run it as cold as possible, even if it is a little expensive.
> 
> Im also considering a custom loop just for the gpu, but if i do that id want to make it overpowered, so i can get the temps as low as possible.
> 
> any suggestions?


Maybe 2 separate AIOs?

Something like 2 x H55s, I guess, but tbh I think the best ways to go about cooling this card are either living with the stock cooler (with better fans) or a custom loop.


----------



## drm8627

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Maybe 2 separate AIO's?
> 
> Something like 2 x H55's or something i guess but tbh i think the best ways to go about cooling this card are either living with the stock cooler (better fans) or custom loop.


What about an H110 in push/pull?
Edit: I can get some fans with strong static pressure.


----------



## Sgt Bilko

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Maybe 2 separate AIO's?
> 
> Something like 2 x H55's or something i guess but tbh i think the best ways to go about cooling this card are either living with the stock cooler (better fans) or custom loop.
> 
> 
> 
> what about a h110 with push pull?
> edit: I can get some fans with strong static pressure

The problem is that the 295X2 uses 2 separate blocks/pump heads, so you'd need two AIOs to replace them. An H110 would work very well for one of them, but you'd still need another one (or something else) for the other die.


----------



## drm8627

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The problem is the 295x2 uses 2 separate blocks/pump heads so you'd need two more to replace them, while a H110 would work very well for one of them you'd still need another or something else to use for the other Die


OK, so I'd need two H110s. That's not an issue, but is there any adapter for a dual-GPU setup? The NZXT Kraken adapter is for a single GPU, so I don't think that'd work.


----------



## bobbavet

Gday Guys

Still getting my build together. The Koolance WB turned up and installed fine.

Wondering if I can get some advice / opinion on 2 pcie power cables I have here and how I wish to use it.

Pictured below is a Silverstone pcie cable. It is 8pin / 6+2pin. The 8pin end goes into the Silverstone PSU and the 6+2 is used for the GPU.



What I would like to know is: could this cable be used in reverse, or is it one-directional?

That is, can the 8-pin go into my GPU while the 6-pin goes into my Cooler Master, since the PCIe power outputs on the CM are 6-pin?

The reason I wish to use them is that they are the exact length required for my mATX build. They are also much tidier than the 2 long ribbon cables that came with the CM, which have daisy-chain connectors hanging off them as well.

cheers

Bob


----------



## Sgt Bilko

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> The problem is the 295x2 uses 2 separate blocks/pump heads so you'd need two more to replace them, while a H110 would work very well for one of them you'd still need another or something else to use for the other Die
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ok, so id need two h110s. thats not an issue, but is there any adapter for a dual gpu setup? the nzxt kraken adapter is for a single gpu, so i dont think thatd work.

You would need 2 AIOs of some kind, whether that means 2 x H110s or 2 of another kind; that's it.
Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Still getting my build together. The Koolance WB turned up and installed fine.
> 
> Wondering if I can get some advice / opinion on 2 pcie power cables I have here and how I wish to use it.
> 
> Pictured below is a Silverstone pcie cable. It is 8pin / 6+2pin. The 8pin end goes into the Silverstone PSU and the 6+2 is used for the GPU.
> 
> 
> 
> What I would like to know is. Could this cable be used in reverse? Or are they one directional?
> 
> That is, can the 8 pin go into my GPU and use the 6pin in my Coolermaster as the Pcie power outputs on the CM are 6pin.
> 
> The reason I wish to use them is that they are the exact length required for my Matx build. They also are much tidier than using the 2 long ribbon cables that came with the CM as they have daisy chain connectors hanging off them as well.
> 
> cheers
> 
> Bob


I wouldn't mix and match them. With my Silverstone PSU I used 2 of the daisy-chained cables but only plugged in the 8-pin from each; that was in a CM Elite 130 case, so cable management didn't matter that much.









But regardless, I don't think it'd be good to use the CPU cable for the GPU.


----------



## drm8627

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You would need 2 AIO's of some kind, whether that means 2 x H110's or 2 of another kind thats it.


So all I need is two AIOs of the same kind? Does the card come with the mounting hardware built in and a VRM solution?
My apologies, I've never had one apart yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> You would need 2 AIO's of some kind, whether that means 2 x H110's or 2 of another kind thats it.
> 
> 
> 
> so all i need is two aio's of the same kind? it comes with mounting hardware built in and vrm solution?
> my apologies, ive never had one apart, yet.

So long as you have the Asetek design (round pump head), I think they would fit just fine. There is a copper heatsink for the VRMs that is cooled by the fan on the card itself.



^ That's what the card looks like under the shroud (ignore the red box, that was for another member ages ago)


----------



## drm8627

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So long as you have the Asetek design (round pump head) I think they would fit just fine and there is a copper heatsink for the vrm's that is cooled by the fan on the card itself
> 
> 
> 
> ^ That's what the card looks like under the shroud (ignore the red box, that was for another member ages ago)


Well damn, that's really convenient! Awesome! The only problem now is figuring out which case can fit two H110s close enough to the GPU. I'd also like a third AIO for my 4790K; I wanted an H110 for that too. So I'd need a case with room for three H110s, two of them close to the GPU. Is there such a case?

The only reason I'm leaning towards AIOs is that I'm more comfortable using them; I'm less likely to mess something up than with a custom loop.


----------



## bobbavet

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I wouldn't mix and match them, With my Silverstone PSU i used 2 of the daisy chained cables but only plugged in the 8 pin from each, that was in a CM Elite 130 case so cable management didn't matter that much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But regardless i don't think it'd be good to use the CPU cable for the GPU


It's not a CPU cable though. It is a PCIe cable.

I could live with the length of the CM cables, but these daisy chains are a pain in the butt; too much excess. Is it safe to cut off the daisy chain somehow?


----------



## Mega Man

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> I would ! We have made some decent improvements, just working on what is effectively the last main one if we can nail it ! All details of changes and improvements over Stock BIOS will be released once we have finalized it all !
> 
> 
> 
> Well that sure would be cool. When updating the bios on this card is it done one core at a time or do both get loaded at the same time?
> 
> 
> Each GPU essentially has a BIOS. So you have a Master GPU and a Slave GPU. As such you need to flash a Master BIOS & a Slave BIOS.
> 
> Sounds complicated but it is not. Luckily I have created .Bat files for easy flashing which I will upload a long with the final BIOS to make a nice easy flashing package ! All you will need is a USB drive spare as you will be flashing through DOS, the package I will include will also include a utility for creating a DOS bootable USB

i would recommend using rufus it makes boot disks so easy !~

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> So long as you have the Asetek design (round pump head) I think they would fit just fine and there is a copper heatsink for the vrm's that is cooled by the fan on the card itself
> 
> 
> 
> ^ That's what the card looks like under the shroud (ignore the red box, that was for another member ages ago)
> 
> 
> 
> well damn thats really convenient!! awesome!! only problem is now, is figuring out which case i can fit two h110s close enough to fit the gpu. Id also like to have a third aio solution for my 4790k, wanted a h110 for that, too. So i would need a case that has room for three h110s, and two close to the gpu, is there such a case?
> 
> Only reason im leaning towards AIO, is because im more comfortable using them, less likely ill mess something up doing a custom loop.

fyi afaik the mounting is different
Quote:


> Originally Posted by *bobbavet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I wouldn't mix and match them, With my Silverstone PSU i used 2 of the daisy chained cables but only plugged in the 8 pin from each, that was in a CM Elite 130 case so cable management didn't matter that much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But regardless i don't think it'd be good to use the CPU cable for the GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's not a CPU cable though. It is a pcie cable.
> 
> I could live with the lentgh of the CM cables but these daisy chains are a pain in the butt, to much excess. Is it safe to cut off the daisy chain some how?

Mixing cables is dangerous. You can, but you may blow something up.

You can always repin everything, but if you don't know what you are doing or how to check the wiring, I don't recommend it; imo the risk is not worth it.

PS: I am back, guys. The baby is taking a bit of time, but I got a new job, and I should be able to start staying involved.


----------



## Dagamus NM

Thank you sir. Enjoy every second of the new baby, it is fleeting and you won't get it again until you have another.


----------



## GoodwinAJ

Even with a custom loop and four 560 rads with two of them running at 100% (FAH) I'll still get to thermal throttling (74F) in about 20 minutes, 5930k at 4.4 at 100% also. These cards just run hot when pressed.


----------



## drm8627

Quote:


> Originally Posted by *GoodwinAJ*
> 
> Even with a custom loop and four 560 rads with two of them running at 100% (FAH) I'll still get to thermal throttling (74F) in about 20 minutes, 5930k at 4.4 at 100% also. These cards just run hot when pressed.


Yeah, with two 295X2s and an overclocked CPU I'm sure it's difficult to keep them cool. If I were you I'd separate the loops, but I'm not an experienced watercooler, so I may be wrong. All I want to do is keep one 295X2, at stock speeds, at lower temps; I'd like the max temp while being pushed to be around 50f. If it takes a whole custom loop for my one card, then so be it. I'm also considering a custom loop because I want to watercool the VRMs as well.


----------



## Alex132

Thermal throttling is at 75℃, not 75℉.
I'm not sure what you're doing, as even overclocked I have never hit it on my stock 295X2.
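Since the °C/°F mix-up comes up a couple of times in this thread, a quick conversion helper (plain Python, nothing card-specific) shows just how far apart the two scales are at these temperatures:

```python
def c_to_f(celsius: float) -> float:
    """Convert Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

def f_to_c(fahrenheit: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (fahrenheit - 32) * 5 / 9

# The 295X2's throttle point: 75 C is a toasty 167 F,
# while "75 F" would be a chilly ~23.9 C -- i.e. room temperature.
print(c_to_f(75))             # 167.0
print(round(f_to_c(75), 1))   # 23.9
```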


----------



## drm8627

Quote:


> Originally Posted by *Alex132*
> 
> Thermal throttling is at 75℃ - not 75℉.
> I'm not sure what you're doing as even overclocked I haven't ever hit it on my stock 295X2


So you've never experienced throttling with your card? I figured that might be an issue with these. If it's not, then I might just keep the stock cooling on it. (I haven't gotten mine yet.)


----------



## Alex132

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Thermal throttling is at 75℃ - not 75℉.
> I'm not sure what you're doing as even overclocked I haven't ever hit it on my stock 295X2
> 
> 
> 
> so youve never experienced throttling with your card? i figured that might be an issue with these. if its not, then i might just keep the stock cooling on it. (havent gotten mine yet)

I run the stock cooler with 2 x 1500rpm Cougar fans (PWM-enabled, so they sit at 800rpm most of the time), and even with +120mV and 1220/1500 it hits around 70℃ / 72℃.


----------



## GoodwinAJ

Quote:


> Originally Posted by *Alex132*
> 
> Thermal throttling is at 75℃ - not 75℉.
> I'm not sure what you're doing as even overclocked I haven't ever hit it on my stock 295X2


My oversight.


----------



## GoodwinAJ

Quote:


> Originally Posted by *Alex132*
> 
> Thermal throttling is at 75℃ - not 75℉.
> I'm not sure what you're doing as even overclocked I haven't ever hit it on my stock 295X2


I only get that hot during full-resource Folding at Home (FAH): 100% load, uninterrupted. It takes about 20 minutes, and my coolant delta T maxes at 8°C over ambient. The whole machine is pulling over 1000W at load.
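As a sanity check on those numbers, the steady-state relation Q = ṁ·c_p·ΔT links heat load, coolant flow, and temperature rise. This is only a back-of-the-envelope sketch, and it hypothetically treats the 8°C figure as the loop's inlet-outlet difference (the post actually gives it as coolant-over-ambient), so the numbers are illustrative:

```python
# Back-of-the-envelope loop maths: Q = m_dot * c_p * dT
# Assumed numbers: ~1000 W of heat, 8 C coolant temperature rise,
# water's specific heat 4186 J/(kg*K), density ~1 kg/L.
heat_w = 1000.0        # watts rejected by the loop
delta_t = 8.0          # K, assumed inlet-outlet rise
cp_water = 4186.0      # J/(kg*K)

mass_flow = heat_w / (cp_water * delta_t)   # kg/s
litres_per_min = mass_flow * 60             # ~1 kg/L for water

print(round(litres_per_min, 2))  # ~1.79 L/min carries 1 kW at an 8 C rise
```

In other words, even a modest pump flow is enough to carry a kilowatt at that delta T; the bottleneck in a loop like this is usually rad area and airflow, not flow rate.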


----------



## Mega Man

Quote:


> Originally Posted by *GoodwinAJ*
> 
> Even with a custom loop and four 560 rads with two of them running at 100% (FAH) I'll still get to thermal throttling (74F) in about 20 minutes, 5930k at 4.4 at 100% also. These cards just run hot when pressed.


I dunno what you are doing, but I never break 50°C, and I rarely break 45°C unless I don't turn on the AC in my room.

I bet it is your fans, but I am not there, so that is just a shot in the dark.

Although they are getting better, 140mm fans still suck to this day.

With that said, it sounds like you have an issue; whether with your setup or with your loop, I don't know.

2 pumps should be fine for that size loop (5 CPU/GPU blocks). I use 4, but that is because I am crazy! Mine has 5 x 480s; tbh I am surprised when the fans even ramp up.

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GoodwinAJ*
> 
> Even with a custom loop and four 560 rads with two of them running at 100% (FAH) I'll still get to thermal throttling (74F) in about 20 minutes, 5930k at 4.4 at 100% also. These cards just run hot when pressed.
> 
> 
> 
> yeah with two 295x2 and an overclocked cpu im sure its difficult to keep them cool, if i were you id separate the loops, but im not an experienced watercooler , so i may be wrong. All i want to do is keep one 295x2, at stock speeds, at lower temps, like id like the max temp while being pushed to be around 50f. IF it takes a whole custom loop for my one card then so be it. Considering going custom loop , too because i want to watercool the VRMs as well.

a good open loop = 40-50c max


----------



## NBrock

Quote:


> Originally Posted by *GoodwinAJ*
> 
> I only get that hot during full resource Folding at Home (FAH), 100% load uninterrupted. And it takes about 20 min, my coolant delta T maxes at 8C over ambient. The whole machine is pulling over 1000w at load.


I don't have two cards and a CPU in a loop, but I would be tempted to say you may need better fans. I have the stock watercooling on my 295X2 and it never gets above 71°C while folding 24/7 @ 1100 core. I used a different thermal paste and Corsair SP120 High Performance fans in push/pull.


----------



## Alex132

One GPU core loading at ~72°C while the others load at 64-69°C points to bad TIM application. Make sure you don't overdo it, and that the blocks are making proper contact.

I see you're using 2 D5 pumps (maybe try running them both at setting 5 and see if it makes any difference?), parallel GPU blocks, and a Bitfenix full-cover block, but what radiators? Fin density and thickness are also important.

4 x 560 radiators should be way more than enough for all that heat; the only reasons it could be getting that hot are poor TIM application (is the overall GPU hot too? The VRM area and such?), poor fans, or poor airflow. I'm guessing they're external; could we see a picture of them?


----------



## Ritter

I went external radiator. I have been fiddling with water cooling since the Athlon XP days, and most of my daily use rigs are watercooled - mostly for the noise reduction.
I just put in a 295x2 a couple of weeks ago. My house temp sits around 27C.

I think idle is 31C and after 12+ hours of x11 coin hashing it was 50C if I recall. Can't check it from work just now.
Here is a pic of the radiator. Had it made at a local shop - was cheaper than getting the MO-RA3 420 Pro. That is a nice rad, but I passed on the nice looking mounts - I seldom show off my hardware to anyone who cares - to save 40%.





BTW - What's the beef with 140mm fans? I found these (pic 1) that I like because of the quiet factor - 12dBA at 48CFM - no pwm needed.

Ritter


----------



## Alex132

..is... is it aluminium? It's totally aluminium isn't it...


----------



## Ritter

Yes. Total AL. Copper adds to cost and with as often as I change stuff, the problem of electrochemical interaction is not too great in my opinion.

Water Wetter from the auto store is in the lines with distilled water. There's a mix of tubing types due to me being too cheap to buy all new to match (nobody sees it): some is thick-wall food-grade silicone (prevents kinking), the external runs are vinyl-braided since flex is not essential, and the vid card is stepped down from the 1/2" system to 3/8" ID at the EK waterblock.

R

BTW - you want one? He might make more.


----------



## Alex132

Quote:


> Originally Posted by *Ritter*
> 
> Yes. Total AL. Copper adds to cost and with as often as I change stuff, the problem of electrochemical interaction is not too great in my opinion.


Just because you don't believe it happens, doesn't mean it doesn't happen.


----------



## Mega Man

In his defense, he didn't say it doesn't happen. He said he doesn't care.

Dear god, where to start.

Al does not hold as much heat, so the temps will be reduced there.

Crappy fans (not because they are 140mm; I will deal with that later). CFM and static pressure specs on their own are meaningless (see https://martinsliquidlab.wordpress.com/2013/02/18/why-static-pressure-max-flow-specs-are-poor-measures-of-fan-performance/ ).

And just by what the blades look like, I can tell you they are crappy.

Most 140mm fans have yet to reach the level 120mm fans are at (PQ-wise). There are a few OK ones, but I still use 120s for a reason. With a rad like that you need some very aggressive fans. Silence is not an option if you want good cooling. You can make a silent rad, but the point is you didn't (dense fins and high restriction; that looks like it belongs on my truck).

And I can tell you from personal experience that fan is loud.

The tubing size, I am sure, plays a part as well. You have a long run, and what sounds like a pretty twisty one.

I could go on, but I'll stop there.

As to the price comment, I'll quote my dad: "Budget components = budget results."


----------



## Ritter

I completely agree that with a SIGNIFICANT increase in cost I could come to a system that definitely looks better. Might even perform better as well, as long as we are looking at the same metrics.

Thank you for the informative article link. I do not dispute the information provided, the author did provide a great example of what I agree is the proper way to compare fans.

One key element is "real world" application.

From his comparison, the graph appears to demonstrate 2 important things: 1) CFM is not the only spec to concern yourself with and 2) These 2 fans displayed are almost the same in CFM in real world application.
Quote:


> And just by what the blades look like I can tell you they are crappy."


<<< please explain further - by the article you have provided box stats cannot tell the story, so I am curious how a poor quality snapshot does.

You make several other blanket statements that I would like more explanation on:
Quote:


> Most 140mm fans have yet to get to the level 120 mm fans are (pq). There are a few ok ones. But I still use 120 for a reason.


What reason? Do you use pq as your only determining factor of fan standard?
Quote:


> With a rad like that you need some very aggressive fans.


I disagree. This is a completely opinion statement.
Quote:


> Silence is not an option if you want good cooling.


Also, complete opinion presented as fact.
Quote:


> You can make a silent read. But the point is you didn't ( dense fins and high restriction.


You are correct there. I could have made a silent rad but chose not to. This was originally intended for 3x 6950's plus the AMD FX-8650 that I was looking to O/C a little to get a little more oomph occasionally.
Quote:


> That looks like it belongs on my truck.


This just appears to be insulting me for no reason at all.
Quote:


> And I can tell you from personal experience that fan is loud.


This is good information. It is not my experience thus far, but your quietness requirements may differ from mine. How and when did you use this fan?

You see, I wish to keep my overall system noise down as much as possible without feeling guilty on how much $ I spent. Originally I was cooling more devices. I consider the RAD almost a simple mass heat sink given the current load. I will probably disable the fans and see what the temps look like at idle now that you mention the passive possibility. The ambient sounds in the family room mask a lot of low level hum, however I am sure that you could hear this install in a truly silent room, theater, or crypt. I don't really need total silence, but I need low enough to be masked by distant house sounds of traffic, winds, and air conditioner.

Your post comes across as almost hostile, inferring that you do not have time for me or my current layout. If that is so, I understand, everyone gets busy. Please re-read your post and consider how it may look to a true entry level (or second attempt) person who may be thinking about going full watercool.

Thanks for any information / responses to my questions above. And thanks again for the reminder to try passive.

Ritter


----------



## Mega Man

Ok when I get home I will edit this post with a response.

I am driving home ( at a red light now )

I would like to mention that nothing was intended as an insult. It was just fact, without emotion.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Ritter*
> 
> I completely agree that with a SIGNIFICANT increase in cost I could come to a system that definitely looks better. Might even perform better as well, as long as we are looking at the same metrics.
> 
> Thank you for the informative article link. I do not dispute the information provided, the author did provide a great example of what I agree is the proper way to compare fans.
> 
> One key element is "real world" application.
> 
> A)From his comparison, the graph appears to demonstrate 2 important things: 1) CFM is not the only spec to concern yourself with and 2) These 2 fans displayed are almost the same in CFM in real world application.
> 
> B)
> Quote:
> 
> 
> 
> And just by what the blades look like I can tell you they are crappy."
> 
> 
> 
> <<< please explain further - by the article you have provided box stats cannot tell the story, so I am curious how a poor quality snapshot does.
> 
> You make several other blanket statements that I would like more explanation on:
> 
> C
> Quote:
> 
> 
> 
> Most 140mm fans have yet to get to the level 120 mm fans are (pq). There are a few ok ones. But I still use 120 for a reason.
> 
> 
> What reason? Do you use pq as your only determining factor of fan standard?
> 
> D)
> Quote:
> 
> 
> 
> With a rad like that you need some very aggressive fans.
> 
> 
> I disagree. This is a completely opinion statement.
> Quote:
> 
> 
> 
> Silence is not an option if you want good cooling.
> 
> 
> Also, complete opinion presented as fact.
> 
> E part 1)
> Quote:
> 
> 
> 
> You can make a silent read. But the point is you didn't ( dense fins and high restriction.
> 
> 
> You are correct there. I could have made a silent rad but chose not to. This was originally intended for 3x 6950's plus the AMD FX-8650 that I was looking to O/C a little to get a little more oomph occasionally.
> 
> E part 2)
> Quote:
> 
> 
> 
> That looks like it belongs on my truck.
> 
> 
> This just appears to be insulting me for no reason at all.
> 
> E part3)
> Quote:
> 
> 
> 
> And I can tell you from personal experience that fan is loud.
> 
> 
> This is good information. It is not my experience thus far, but your quietness requirements may differ from mine. How and when did you use this fan?
> 
> You see, I wish to keep my overall system noise down as much as possible without feeling guilty on how much $ I spent. Originally I was cooling more devices. I consider the RAD almost a simple mass heat sink given the current load. I will probably disable the fans and see what the temps look like at idle now that you mention the passive possibility. The ambient sounds in the family room mask a lot of low level hum, however I am sure that you could hear this install in a truly silent room, theater, or crypt. I don't really need total silence, but I need low enough to be masked by distant house sounds of traffic, winds, and air conditioner.
> 
> F)Your post comes across as almost hostile, inferring that you do not have time for me or my current layout. If that is so, I understand, everyone gets busy. Please re-read your post and consider how it may look to a true entry level (or second attempt) person who may be thinking about going full watercool.
> 
> G)Thanks for any information / responses to my questions above. And thanks again for the reminder to try passive.
> 
> Ritter





OK, so:

A) CFM is a very poor spec. Martin's point was that the rating system is excessively flawed because it is not a controlled spec: one company can test one way and another company a different way. CFM is also EXCESSIVELY misleading (and all of the above applies equally to static pressure). PQ charts take those variables out.

B) Sorry, I can't help much here. I do HVAC for a living, so I deal with all kinds of fans, including fans that will adjust pitch (like these), and I know what kind of fan it is by the blades. For examples, look up known rad fans (the high- and low-speed Gentle Typhoons).

C) Again, this mostly comes from experience; I have read a ton of reviews. On martinsliquidlab you can check a bunch of the fan roundups. Unfortunately his hosting contract was up, so he moved the site to free hosting, which broke the toolbar at the top, so he removed it; it isn't as easy to point to as it used to be. 140s are starting to get better, but show me any 140 and match it to a similar-speed GT, and the GT will win at this point (again, see the PQ chart).

E) Parts 1, 2 and 3: you separated a statement that was not intended to be separated.

You made a high-restriction rad that is designed for automotive use (it looks like an intercooler, but that is neither here nor there). Automotive fans are designed to work against excessively high static pressure and are, because of this, NOISY fans.

My part 2 was not an insult to the looks but a statement of how this equipment was designed to be used, nothing more, and part 3 was explaining it further.

F) Again, it was not intended that way, but at the same time I won't coddle you; I will be honest and straight to the point. If you don't like it, sorry, but that is the way I am.

G) Don't go passive; that rad is not good for passive use.

I was taking a break from being on a roof that was 120°F.

I didn't have a lot of time, and some of the miscommunication was my fault for not fully explaining. I hope this helps some, but I am still in a hurry; my TX10 just arrived and I want to go play with it.
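The PQ-chart argument above can be made concrete: a fan's actual delivery through a radiator is where its pressure-flow curve crosses the rad's resistance curve, not its free-air CFM rating. Here is a minimal sketch with made-up illustrative numbers, using quadratic approximations for both curves (a common simplification, not measured data for any real fan or rad):

```python
import math

def operating_point(p_max: float, q_max: float, k: float) -> float:
    """Flow where a quadratic fan curve P = p_max * (1 - (Q/q_max)^2)
    meets a quadratic system curve P = k * Q^2, solved analytically."""
    return math.sqrt(p_max / (k + p_max / q_max ** 2))

# Hypothetical fan: 2.0 mmH2O max static pressure, 60 CFM free-air.
# Hypothetical dense radiator: drops 1.5 mmH2O at 30 CFM -> k = 1.5 / 30^2.
k_dense = 1.5 / 30 ** 2
print(round(operating_point(2.0, 60.0, k_dense), 1))  # 30.0 CFM delivered

# A low-restriction rad (0.3 mmH2O at 30 CFM) lets the same fan breathe:
k_open = 0.3 / 30 ** 2
print(round(operating_point(2.0, 60.0, k_open), 1))   # ~47.4 CFM delivered
```

So the same "60 CFM" fan moves half its rated airflow through the dense rad, which is why restriction and the full PQ curve matter more than the box specs.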


----------



## Dagamus NM

Quote:


> Originally Posted by *Ritter*
> 
> Yes. Total AL. Copper adds to cost and with as often as I change stuff, the problem of electrochemical interaction is not too great in my opinion.
> 
> Water wetter from the auto store is in with distilled water in the lines. Mix of line types due to me being too cheap to buy all new to match (nobody sees it) so some are thick wall food grade silicone (prevents kinking), external are vinyl braided since flex is not essential, and the vid card is stepped down from the 1/2" system to 3/8" ID to the EK waterblock


Haha, nice to see somebody using water wetter. I considered it.

I much prefer larger fans (140mm+); 120mm are just too noisy.

I have a 1.35kW line-interactive UPS, and the 92mm fan on it makes me want to kill myself. I want to find an adapter to go to a bigger fan, but I cannot find any.


----------



## Mega Man

Why is using Water Wetter nice?

He would be far better served using 30-50% antifreeze, whose anti-corrosion additives are more advanced.


----------



## Dagamus NM

Because he is an obvious gear head who is now into computers.


----------



## Ritter

Dagamus - (hoping your comment was joking in nature) Not really a gear head, but for personal/tinker use I will gladly borrow from easy-to-source and affordable alternatives. I wanted way more surface area, like the Watercool MO-RA3 PRO or the Phobya Xtreme SUPERNOVA 1260 radiator, but both of those went above what I was willing to spend. I settled on this for affordability.

My first watercooled rig (2000?) was loud but effective, and at the time it was difficult for me to find even a waterblock, much less have any selection of radiator - and most were well above the cost line of tinker/play. One had to be a full-on enthusiast to justify the $$. I have too many other hobbies.

Mega - Thank you VERY much for the further explanation as well as your perspective source. Knowing your professional career allows me to take more of your statements at face value - I am sure you have run into posters who are just as insistent but have no basis in real-world. (PS - I am glad there are people like you who will work in that heat!)

I will state again, that I get that there are purpose built items for this hobby, but the cost is an entry barrier. You (Mega) are obviously committed to the sport, since you reference a TX10 - which I assume is that massive CaseLabs unit. Congrats on that! Not anything I will be able to approach either in affordability nor in space in the house - it doesn't fit in with any of the approved décor!

This is a long-winded way of saying these things:
Thanks for giving more detail - I understand much more now.
Thanks for continuing the discussion and not writing me off.
I am slightly jealous of your case and toys.
Please continue to be understanding that not all of us can participate at that level.

When I plan next upgrade, I will certainly try to reach out here to you for some input and recommendations.


----------



## Mega Man

Seriously no need. You asked and I answered.

I was serious about the anti freeze.

Use the green (it is poisonous)

If you can't deal with the poison user any of the other colors. Green is polyethylene others are polypropylene ( sugar ). I say others because all the manufactured have their own and then there is generic.

Please don't hesitate to ask.

I never said you had to upgrade. You asked why your temps were so high. That is what I talked about.

Imo save up. Get a cheap Swiftech 480 or two (even 2x 360), and some good rad fans.

(For the price, these can't be beat: http://www.swiftech.com/fan120x25mmrdm1225s.aspx - Swiftech's shipping starts at $10 but is quite reasonable when you buy everything together, so buy the rads there too.)
(Then you can just use distilled and biocide.)
If you do that I bet you would see a significant decrease in temps.

Another option is watch ocn Market place have got several steals there

You are correct about the tx10. It makes my 5th cl.

2xs3,m8,th10,and lastly tx10.

Thank god my wife loves me


----------



## Faoust

Quote:


> Originally Posted by *drm8627*
> 
> hey guys, i was just wondering , is there a good AIO replacement for the stock cooler for these 295x2 gpus? I dont plan on overclocking, but i do want to run it as cold as possible, even if it is a little expensive.
> 
> Im also considering a custom loop just for the gpu, but if i do that id want to make it overpowered, so i can get the temps as low as possible.
> 
> any suggestions?
> 
> i have a h110 sitting around, i was wondering if i got the nzxt adapter kit for cpu aio to use on a gpu what kinds of temps id get. thanks ahead of time for any suggestions!


Tbh I just used the fan from the second 295X2 on the first card for a push/pull setup, and a pair of http://www.canadacomputers.com/product_info.php?cPath=8_130&item_id=048700 on the second. Even under full load in Star Citizen it never goes above 48°C. It's the cheapest of the solutions, outside of doing nothing.


----------



## drm8627

Hey guys, I just recently noticed EK makes expandable AIO watercoolers. They're called the "Predator" series and you can modify them; they are actually made out of the components they sell for custom loops. I think what I might do is buy one of those and get a full-coverage EK GPU block that covers both processors and the VRMs. I'm considering the 360 rad unit for that. I don't plan on overclocking, and I don't doubt the temps are OK with the stock cooler; I'd just like the card to run as cool as possible, for longevity and stability, and because it puts me at ease to know my components are way cooler than they need to be. http://predator.ekwb.com/
I'm planning on putting this in a Corsair C70 case, so I need to do some research on which rad would fit. The rad would be specifically for the GPU; I have an H110 specifically for the CPU (which I don't plan on overclocking either).

A separate question I have: do you guys ever have any trouble with stuttering with this card? I heard that AMD dual-GPU cards have problems with stuttering and tearing. If this is true, what are the fixes? I heard there are software options you can enable to get rid of this problem at the expense of some input lag, which isn't a HUGE deal for me. I am gaming on a 6950 right now, I'm used to gaming at around 30 fps on some of the newer games, and am pretty successful at it. Thoughts? Additions?

Any and all help is appreciated, thanks guys.


----------



## Mega Man

imo the stuttering/tearing is overly cried about by fanbois. However, AMD is far better at higher res than lower res; just something to think about, with lower res being 1080p


----------



## NBrock

Quote:


> Originally Posted by *Mega Man*
> 
> imo the stuttering/tearing is overly cried about by fanbois. However, AMD is far better at higher res than lower res; just something to think about, with lower res being 1080p


Yeah honestly I have never had an issue with drivers or stuttering...and that includes multiple setups;
*X800 XT
*Brief run with Nvidia Dual 7900 GTX...driver issues and then they eventually died and I got shafted on warranty.
*Dual 3870 @ 1080p in my Alienware M17 (last actual Alienware computer before Dell took over)
*Single 7790 @ 1080p
*Single 7970 @ 1080p
*Single 7970 @ 1440p
*Dual 7970 @ 1440p
*Single 290x @ 1440p
* Single 295x2 @ 1440p


----------



## bobbavet

Gday Guys

Finally got me rig built. Thanks for all the help, Wermad in particular.
Very impressed with the 295x2. It is rendering Warthunder heaps better than my old GTX690.
Ready for a 4K monitor now.









Cheers Bob


----------



## SAFX

Anyone test 15.8 beta drivers?


----------



## SAFX

Is it required or recommended to install AMD Chipset drivers? never noticed these before under Optional Downloads...


----------



## Mega Man

You have an Intel chipset, so you want to install the Intel chipset driver.


----------



## SAFX

wanted to make sure this wasn't related to gpu, thanks


----------



## ColeriaX

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Finally got me rig built. Thanks for all the help, Wermad in particular.
> Very impressed with the 295x2. It is rendering Warthunder heaps better than my old GTX690.
> Ready for a 4K monitor now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers Bob


Great looking rig Bob!


----------



## Dagamus NM

That is very nice looking Bob. What are your temps?


----------



## xer0h0ur

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Finally got me rig built. Thanks for all the help, Wermad in particular.
> Very impressed with the 295x2. It is rendering Warthunder heaps better than my old GTX690.
> Ready for a 4K monitor now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers Bob


Nice little box rig there. I like it.


----------



## bobbavet

Thanks for the comps guys. This is my 2nd small power-package build. They are a deadset PIA to build, but the satisfaction is awesome.

Haven't done any temps; I don't go worrying much about them.
What I have noted though is 30 idle and 52 max in 3DMark. The 295 is fed by a Koolance 240 copper rad.

The 5820K at a 4.4 OC sits at around 28 and maxes out at 65 in AIDA benching.

All good enough to keep me happy.

Quote:


> Originally Posted by *SAFX*
> 
> Anyone test 15.8 beta drivers?


I got this straight up on my install. Only have 2 games installed, but all fine.
They now have "AMD Gaming Evolved" powered by Raptr, which I am happy to see, as I liked this kind of optimizing GUI with Nvidia.


----------



## Dagamus NM

Good temps then. I cannot stand raptr, glad you like it though.


----------



## bobbavet

Quote:


> Originally Posted by *Dagamus NM*
> 
> Good temps then. I cannot stand raptr, glad you like it though.


It's not as "sleek" as the Nvidia GUI, and it's a pain that it starts at logon. I just like the driver checking and the 1-click optimizations. I hate trawling through settings in games.


----------



## wermad

I've been tossing around the idea of going back to 5x1 eyefinity and I found the ideal monitor (Dell P2416D, IPS, 2560x1440). I've done this before, but using 24" 1200p monitors instead, so it's not uncharted territory, including the use of MST hubs. Wife won't let me spend any monies, so I may sell my uber overkill case and loop and go back to stock air. Just curious what your opinions are on the best air-cooling case to run two stock cards and an AIO for the cpu? I wanna keep my ATX board due to the extra spacing. I'm probably 0.01% sure I wanna do this, so don't assume I'm doing it now.


----------



## Mega Man

AHAHAHAHAHAHAHAHAHA

just remember i called it and have dibs !~

to be honest keep the case and just save up

that is my recommendation

did you see my case pics ?


----------



## wermad

I would definitely keep it, but it's an asset I can easily liquidate to make some monies for five of these monitors. I can do three if I sell my 4k monitor locally plus a few more items I have lying around, but I would still have to wait for the other two. The monitors are fairly recent, so waiting may get them cheaper (saw a BNIB on ebay for $250). Ever since I saw the Corsair 350Demon build, I've wanted to run these cards stock. Right now with ambient temps hitting 90-95°F, the cards easily hit the high fifties °C with the fans at low rpm. 4k is fun, but I miss my old 5x1; my goal was to go from 1200 to 1440 5x1 portrait. I'll need a hub, and I saw dual-usb-powered ones are available now.
You got your TX already?!?! Sweet. Nah, I haven't been keeping up w/ ocn for the last few weeks (gaming and recovering). Link to log or post?


----------



## Mega Man

hehehe only showing you teasers !~

if you want my opinion i think that next gen possibly with fury x4 4k eyefinity is/will be possible

dont invest in old tech


Spoiler: Warning: Spoiler!


----------



## wermad

I think 3x1 4k is way too much, even for Fury X (especially since 3/4-way scaling sucks). The 5x1 array is a very nice experience, very immersive and overwhelming in a good way.

Edit: I need to break down the whole rig. Some of the fans are clicking or slowing/stopping. Found it's the modmytoyz hubs causing it (the last fan on a few of the hubs). Might go w/ NZXT hubs or something different.


----------



## Mega Man

like i said fury x2 is coming

as to 3/4way sucking, i have some bad news for you


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> like i said fury x2 is coming
> 
> as to 3/4way sucking, i have some bad news for you


The few quad and three-way reviews show both sides suck this gen. Then again, they're only pushing 4k, and AMD does better the higher the resolution. Where are you seeing better scaling beyond two cores?

Just some #s for perspective:

1920x1080: 2.07M
2560x1080: 2.76M
2560x1440: 3.68M
2560x1600: 4.09M
3440x1440: 4.95M
1920x1080x3: 6.22M
1920x1200x3: 6.91M
3840x2160: 8.29M
2560x1080x3: 8.29M
4096x2160: 8.84M
1920x1080x5: 10.36M
2560x1440x3: 11.05M
5120x2160: 11.05M
1920x1200x5: 11.52M
2560x1600x3: 12.28M
1920x1080x6: 12.44M
1920x1200x6: 13.82M
3440x1440x3: 14.86M
2560x1440x5: 18.43M
2560x1600x5: 20.48M
2560x1440x6: 22.11M
2560x1600x6: 24.57M
3840x2160x3: 24.88M
4096x2160x3: 26.54M
5120x2160x3: 33.14M
7680x4320 (8k): 33.17M
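Those figures are just width × height × panel count, in millions of pixels. A quick illustrative sketch (Python; names are my own, not from the thread) reproduces a few of them:

```python
# Reproduce a few of the megapixel figures from the list above:
# total pixels in a given eyefinity layout, expressed in millions.
def megapixels(width: int, height: int, panels: int = 1) -> float:
    """Total pixels pushed per frame, in millions."""
    return width * height * panels / 1e6

print(megapixels(1920, 1080))     # 2.0736  -> listed as "2.07M"
print(megapixels(2560, 1440, 5))  # 18.432  -> "18.43M" (5x1 eyefinity)
print(megapixels(3840, 2160, 3))  # 24.8832 -> "24.88M" (3x1 4k)
```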

Btw, got through all of BF4 on a stock cpu! Only issue was the crappy loading movies and cut scenes.


----------



## Mega Man

maybe ... like i said tight lipped


----------



## drm8627

guys check this out its a 390x2.

http://www.overclock.net/t/1572452/hexus-powercolor-devil-13-dual-core-r9-390-16gb-announced/40#post_24387582


----------



## fat4l

Quote:


> Originally Posted by *drm8627*
> 
> guys check this out its a 390x2.
> 
> http://www.overclock.net/t/1572452/hexus-powercolor-devil-13-dual-core-r9-390-16gb-announced/40#post_24387582


Well, it's 2x 290 instead of 2x 290X....
295X2 = more horsepower

Also guys, I can get an Asus Ares 3... one of 500.

I like it, but ahhhh, not sure if it's worth it.

The custom pcb and stuff is sooo nice. If only I could get some crazy clocks with it...


----------



## Mega Man

I love Asus, but imo all of the dual custom cards from them are fail.

I esp hate the inputs on the card. When will GPU manufacturers learn: 6 mini DPs please!~


----------



## MIGhunter

Is there an OCing guide somewhere? Google is failing me.


----------



## d875j

Does someone have a modded custom BIOS? I need one for my XFX R9 295X2: undervolted and with a pretty nice clock. Also hoping to see better temps. I'll get a water block for it later.


----------



## SAFX

can't seem to find decent benchmarks comparing 295x2 and 390x, anyone?


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> well its 2x 290 instead of 2x 290X....
> 295x2= more horse power
> 
> Also guys I can get Asus Ares 3 ...one of 500
> 
> 
> 
> 
> 
> 
> 
> 
> I like it but .............ahhhh not sure if its worth it.
> 
> custom pcb and stuff is sooo nice. If I only could get some crazy clocks with it ....


I can only imagine that AMD told them to stay away from making a 295X2 beater since the Fury X2 is on the horizon.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> I can only imagine that AMD told them to stay away from making a 295X2 beater since the Fury X2 is on the horizon.


So Asus Ares IV is coming ?


----------



## PCModderMike

Couple of final shots before I tear the stock cooler off. Definitely not going to miss it.


----------



## Intelligents

Quote:


> Originally Posted by *PCModderMike*
> 
> Couple of final shots before I tear the stock cooler off. Definitely not going to miss it.


I'm a week or two away from getting mine under water as well. The stock cooler is quite gorgeous on these. Too bad it doesn't do a better job.

Post some pics on the other side!


----------



## PCModderMike

Quote:


> Originally Posted by *Intelligents*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> Couple of final shots before I tear the stock cooler off. Definitely not going to miss it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm a week or two away from getting mine under water as well. The stock cooler is quite gorgeous on these. Too bad it doesn't do a better job.
> 
> Post some pics on the other side!
Click to expand...

Will do!


----------



## Mega Man

too bad amd just didnt get rid of air everything ( including the vrm fan ) and sell with just a waterblock with fittings sticking out, ( single slot of course ) !~


----------



## PCModderMike

Quote:


> Originally Posted by *Mega Man*
> 
> too bad amd just didnt get rid of air everything ( including the vrm fan ) and sell with just a waterblock with fittings sticking out, ( single slot of course ) !~


Single slot FTW!


----------



## Mega Man

single slot pc master race !~


----------



## jg900ss

WERMAD!

Wanted to close the loop on a question I had posted about my 295x2 crashing and power. Before going the RMA route, I gave the card to a friend to test after struggling to keep it "lit" for more than a few minutes. He had some issues also, though he had no way to test the DP ports, only DVI. After many crashes on his end too, we looked at his Silverstone 1200 and saw the single-rail setup actually has 4 OCPs, effectively making it 4 rails, and thus not able to handle the minimum 28 amps per cable. After loaning him another Silverstone 1200 that clearly marked a single rail of 100 amp capacity, all of a sudden everything worked fine.

It became clear that my AX860i, on a machine with X99S, 5930K, and 32GB RAM power requirements, was just not enough to guarantee that the single rail would have the amps needed to keep the 295x2 working. I got the card back from him and stuck it in a new machine with an ASRock 990FX Fatal1ty board, an FX-8350, 16GB RAM, and an EVGA 1600 G2. The EVGA is a beast; in a recent test it seems to have met almost all the platinum measurements handily. It's huge, but oozes quality. So, a DP connection to my BenQ XL2730Z for FreeSync, and off it goes, not a care in the world.

So clearly this was all about power, and about not taking the claims by the PSU brands as fact. Too bad, because the AX860i is a great PSU, platinum rated; it is now matched with a Sapphire Tri-X Fury OC, which along with the X99S and 5930K seems to be a better all-around system. For the games I am playing most of the time, the Fury OC is more than enough at 1440p for max settings and over 100FPS. I even purchased a Corsair HX1200i PSU just to stick in the closet in case I need the BIG single rail in the future, for who knows what.

Anyway, thanks for listening. I just thought it would be instructive to report back on the issues about power and rails as they relate to the 295x2. Cheers!
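A back-of-envelope sketch of the currents involved (Python; the ~500 W board draw is my assumption and varies by load and review). It shows why AMD asks for 28 A per cable of headroom even though the average draw is lower, and why an OCP group sized below that can ride through the average yet trip on a transient:

```python
# Rough 12V current for a 295x2, split across its two 8-pin cables.
# CARD_WATTS is an assumed average board draw; transient peaks run higher,
# which is why AMD specifies a minimum of 28 A available per cable.
CARD_WATTS = 500
RAIL_VOLTS = 12
N_CABLES = 2

amps_total = CARD_WATTS / RAIL_VOLTS     # ~41.7 A on the 12V rail
amps_per_cable = amps_total / N_CABLES   # ~20.8 A average per 8-pin

print(round(amps_total, 1), round(amps_per_cable, 1))  # 41.7 20.8
```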


----------



## wermad

I actually found out the EVGA NEX 1500 has multi-rail. Seems like the OCP multi-rail option comes up more frequently now. I would recommend a V1000 or a G/P 1000-1300. I know for sure the V1000s don't have options to split the OCP protection and thus cut down your amps per "virtual rail".

Did some gaming, but it's so freak'en hot down here it's just not fun....


----------



## SAFX

help me out, I'm a bit confused on AMD's 300 series cards, so the R9 390x is available for purchase, is that a "Fury" card?
What is the top-of-the-line AMD card at the moment?


----------



## drm8627

Quote:


> Originally Posted by *SAFX*
> 
> help me out, I'm a bit confused on AMD's 300 series cards, so the R9 390x is available for purchase, is that a "Fury" card?
> What is the top-of-the-line AMD card at the moment?


fury x and 295x2.


----------



## wermad

@Safx

FuryX/Fury/FuryNano: Fiji (newest core)

290/290X/295x2/390/390X: hawaii (previous core gen)


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> help me out, I'm a bit confused on AMD's 300 series cards, so the R9 390x is available for purchase, is that a "Fury" card?
> What is the top-of-the-line AMD card at the moment?


390 and 390x are a tweaked version of the 290 and 290x with 8GB of GDDR5 on them.

the top tier cards atm are the Fury X (AIO cooled), Fury (Air cooled) and the Nano (ITX air cooled)


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> @Safx
> 
> FuryX/Fury/FuryNano: Fiji (newest core)
> 
> 290/290X/295x2/390/390X: hawaii (previous core gen)


Are all available?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> @Safx
> 
> FuryX/Fury/FuryNano: Fiji (newest core)
> 
> 290/290X/295x2/390/390X: hawaii (previous core gen)
> 
> 
> 
> Are all available?
Click to expand...

The Fury cards had a stock shortage for a while but i think they will be picking back up again soon


----------



## SAFX

@wermad, Sgt Bilko

Thanks, does the 295x2 keep up with Fury cards?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> @wemad, Sgt Bilko
> 
> Thanks, does the 295x2 keep up with Fury cards?


The 295x2 is still faster than every other Graphics card out there atm.....provided you have a working crossfire profile of course


----------



## drm8627

Is the 295x2 affected by how many pcie lanes your cpu has, since it's a single-card dual gpu?


----------



## wermad

8x 3.0 is the recommended minimum. You can run two with a 5820K (28 lanes) at 8x/8x without any issues.
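For perspective, here's a small sketch (Python; my own illustration, not from the thread) of theoretical one-way link bandwidth from the per-lane transfer rates and encoding overhead in the PCIe specs. It shows why 8x 3.0 is comfortable and why 16x 2.0 ends up as the rough equivalent on older boards; real-world throughput is somewhat lower:

```python
# Theoretical one-way PCIe bandwidth in GB/s for a given generation
# and lane count. Gen 1/2 use 8b/10b encoding, gen 3 uses 128b/130b.
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    rates = {
        1: (2.5, 8 / 10),     # (GT/s per lane, encoding efficiency)
        2: (5.0, 8 / 10),
        3: (8.0, 128 / 130),
    }
    gt_per_s, efficiency = rates[gen]
    return gt_per_s * efficiency * lanes / 8  # bits -> bytes

print(f"3.0 x8 : {pcie_bandwidth_gbs(3, 8):.2f} GB/s")   # ~7.88
print(f"2.0 x16: {pcie_bandwidth_gbs(2, 16):.2f} GB/s")  # 8.00
print(f"3.0 x4 : {pcie_bandwidth_gbs(3, 4):.2f} GB/s")   # ~3.94
```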


----------



## MIGhunter

Quote:


> Originally Posted by *Sgt Bilko*
> 
> .....provided you have a working crossfire profile of course


and how do we do that...


----------



## drm8627

Quote:


> Originally Posted by *MIGhunter*
> 
> and how do we do that...


It's up to the game devs to make sure they're included; if they're not, there are often workarounds for it.


----------



## drm8627

Quote:


> Originally Posted by *wermad*
> 
> 8x 3.0 is the recommended minimum. You can run two with a 5820K (28 lanes) at 8x/8x without any issues.


one for a 4790k should be fine though right?


----------



## MIGhunter

Quote:


> Originally Posted by *drm8627*
> 
> its up to the game devs to make sure theyre included, if theyre not there are often workarounds for it


What kind of workarounds? Are they game-specific or general? I'd like to find one for Ark. My FPS in that game is terrible for a 295x2.


----------



## drm8627

Quote:


> Originally Posted by *MIGhunter*
> 
> What kind of work arounds? Are they game specific or are they general? I'd like to find one for Ark. My FPS in that game is terrible for a 295x2.


There are usually ways to enable crossfire, typically game-specific. Most I have found involve going into the game's files and adding a line of code enabling crossfire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *drm8627*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MIGhunter*
> 
> What kind of work arounds? Are they game specific or are they general? I'd like to find one for Ark. My FPS in that game is terrible for a 295x2.
> 
> 
> 
> usually there are ways to enable crossfie, usually game specific, most I have found usually involve going into the games file and adding a line of code enabling crossfire.
Click to expand...

Go here:

I've got my 390x in atm, but on yours, when you click Add and find the .exe file for whatever app it is, you'll get a crossfire options menu down the bottom. Might take some messing about, but sometimes you can get good results.


----------



## fat4l

Nice. My Asus Ares III is coming...

Will be selling my 295X2 soon....


----------



## wermad

Quote:


> Originally Posted by *drm8627*
> 
> one for a 4790k should be fine though right?


I'm running a *stock* 4690K


----------



## Ironjer

Here we go: 5960X overclocked to 4.125GHz, GPU at stock.


----------



## PCModderMike

Very nice.


----------



## PCModderMike

Warning, graphic nudity contained in this post.


Spoiler: Warning: NSFW


----------



## xer0h0ur

*sees NSFW tag*
*still opens it at work*


----------



## wermad

I can never achieve such purrty pic taking...


----------



## PCModderMike

Quote:


> Originally Posted by *xer0h0ur*
> 
> *sees NSFW tag*
> *still opens it at work*




Quote:


> Originally Posted by *wermad*
> 
> I can never achieve such purrty pic taking...


With the crazy builds you do, and how frequently you change them, I would *love* to get your stuff in my studio for some pics.


----------



## MIGhunter

Is it worth it to run 2 of these cards on a board that is 8x/8x if both slots are used? I bought an Intel Core i7-5820K and an ASRock Fatal1ty X99X Killer LGA 2011-v3. Planning on watercooling, and just not sure if I should get 2 cards or stick with 1.


----------



## wermad

Quote:


> Originally Posted by *MIGhunter*
> 
> is it worth it to run 2 of these cards on a board that is 8/8 if both lanes are used? I bought a Intel Core i7-5820K and a ASRock Fatal1ty X99X Killer LGA 2011-v3. Planning on watercooling and just not sure if I should get 2 cards or just stick with 1


8x 3.0 is perfectly fine. HardOCP did their 295x2 quad review running 8x/8x 3.0. I wouldn't do 4x though (even 4x 3.0). If you have a 2.0 board, I would recommend 16x 2.0.

Quote:


> Originally Posted by *PCModderMike*
> 
> With the crazy builds you do, and how frequently you change them, I would *love* to get your stuff in my studio for some pics.


They're subtle tbh


----------



## MIGhunter

Quote:


> Originally Posted by *wermad*
> 
> 8x 3.0 is perfectly fine. hardocp did their 295x2 quad review running 8x/8x 3.0. I wouldn't do 4x though (even 3.0). If you have a 2.0 board, I would recommend 16x 2.0.


Quote:


> 3 x PCI Express 3.0 x16 Slots (PCIE1 @ x16 mode; PCIE3 @ x16 mode; PCIE5 @ x8 mode)
> *If you install CPU with 28 lanes, PCIE1/PCIE3/PCIE4 will work at x16/x8/x4 and 3-Way SLI is not supported. To support 3-Way SLI, please install CPU with 40 lanes.


Relooking at my MB and CPU, it looks like it will run slot 1 at x16 and slot 3 at x8. So if I install 2 (295x2)s, does that mean they run at x16 and x8? And if so, is there a problem with that?


----------



## silencespr

My XFX R9 295x2 still crashes after about 30 minutes of playing BF4 at 4k... do i really need a custom loop to get this thing under control ?


----------



## PCModderMike

Quote:


> Originally Posted by *silencespr*
> 
> My XFX R9 295x2 still crashes after about 30 minutes of playing BF4 at 4k... do i really need a custom loop to get this thing under control ?


Shouldn't have to. You believe the crashing is from the card overheating?


----------



## silencespr

Quote:


> Originally Posted by *PCModderMike*
> 
> Shouldn't have to. You believe the crashing is from the card overheating?


Yeah, because when I cool off the radiator with an ice bag I can play way longer, but if I just have it in the system it crashes after 30-40 mins.


----------



## joeh4384

Quote:


> Originally Posted by *silencespr*
> 
> yeah because when i cool off the radiator with ice bag i can play way longer, but if i just have it in the system it crashes after 30-40 mins.


What temps do you see while gaming? If the card is overheating, typically in my experience the core clocks drop and if it is the VRMs overheating they drop to 300mhz for moments. I have never had a temp issue completely crash my system. Do you have an issue with power?


----------



## wermad

Quote:


> Originally Posted by *MIGhunter*
> 
> Relooking at my MB and CPU, it looks like it will run lane 1 at 16 and 3 at 8. So, if I install 2(295x2) does that mean it they run at 16 and 8? and if so, is there a problem with that?


If the cpu/mb combo allows it, check the manual, or just insert the cards (if you have them on the stock cooler). I chose this board because of the spacing; I have a bridge I could use, and in case I ran stock it allows more breathing room.



It's got a PLX to double the lanes, but you can still run it at 8x if you wish. The card has its own PLX that duplicates the lanes for both GPUs rather than splitting them.

Quote:


> Originally Posted by *silencespr*
> 
> My XFX R9 295x2 still crashes after about 30 minutes of playing BF4 at 4k... do i really need a custom loop to get this thing under control ?


Thermal cutoff for the card is 75°C, but before that it starts to throttle down your gpu's speed; it shouldn't crash tbh. What are your temps without the ice on the rad?


----------



## silencespr

Quote:


> Originally Posted by *joeh4384*
> 
> What temps do you see while gaming? If the card is overheating, typically in my experience the core clocks drop and if it is the VRMs overheating they drop to 300mhz for moments. I have never had a temp issue completely crash my system. Do you have an issue with power?


Quote:


> Originally Posted by *wermad*
> 
> If the cpu/mb combo allows it, check the manual or just insert the cards (if you have them on the stock cooler). I chose this board because of the spacing; i have a bridge I could use and in case I ran stock to allow more breathing room.
> 
> 
> 
> Its got a plx to double the DC's lanes but you can still run it 8x if you wish. The card has its own plx that duplicates the lane for both rather then splitting the lanes.
> Thermal cut off for the card is 75°C but it starts to throttle down your gpu's speed, it shouldn't crash tbh. What's your temps without the ice on the rad?


I'm always at 70-71, and most likely during the crash it reaches past 75 when a big explosion happens. I have a 1350W Platimax power supply; I use two separate rails to handle the power hunger of the card.


----------



## MIGhunter

Quote:


> Originally Posted by *silencespr*
> 
> im always 70-71 and most likely during the crash it reaches past 75 when a big explosion happens. I have a 1350 power supply from platimax i use two separate rails to handle the power hunger of the card.


I thought 2 separate rails were bad?


----------



## Mega Man

No, there are single-rail and multi-rail units; both have pluses and both have minuses.

Neither is good nor bad.


----------



## MIGhunter

Quote:


> Originally Posted by *Mega Man*
> 
> no there is a single rail and multi rail units, both have pluses and both have minuses
> 
> neither are good nor bad


I haven't bought a PSU in a while. I know when I bought my last one, multi-rail was considered very bad because they didn't regulate as well: one rail might be really good while another would struggle, and this was causing issues in builds. I'm sure the tech is better now.


----------



## Mega Man

FFXI: what server were you on?


----------



## MIGhunter

Quote:


> Originally Posted by *Mega Man*
> 
> ffxi what server were you on ?


Leviathan, you?


----------



## Mega Man

I quit years ago; Bismarck. The first time I quit I had a Novio (did not keep it, as I felt it would be wasted). Wasted 9 years on that game.


----------



## MIGhunter

Quote:


> Originally Posted by *Mega Man*
> 
> I quit years ago Bismarck. The first time I quit I had a novio ( did not keep it as I felt it would be wasted ). Wasted 9 years to that game


I played from the PS2 release until the earthquake in Japan shut the server down. At that point I played WoW and then Tera. Eventually went back to XI but didn't play super long after that.


----------



## BootPirate

Very nice.


----------



## fat4l

So my Asus Ares 3 is finally here.
Just a quick test that everything works.

(My 295X2 is still in the rig... it looks sooo poor and tiny vs the Ares III.)


----------



## ColeriaX

Pretty nice looking. Let's see some Ares III solo results.


----------



## TooManyAlpacas

I just wanted to share this experience with other owners of this card. For a long time I have hated the silver on the card, due to my case being the H440 Designed by Razer. The card just did not fit in, so recently I decided to take care of this with none other than Plasti Dip. Surprisingly, the results, at least in my eyes, are amazing. I finally have a beast of a card that matches my build.


----------



## predator06

Hi, I will be receiving my R9 295X2 soon.

I have a question: is it possible to manage the fan on the radiator? For example with Afterburner?


----------



## Dagamus NM

You would need to connect it to your motherboard or a fan controller, using a mini-to-regular fan adapter. While you have the cover off, you can change the TIM and thermal pads.


----------



## fat4l

Quote:


> Originally Posted by *predator06*
> 
> Hi, i will receive my R9 295X2.
> 
> I have a question : Is it possible to manage the fan on the radiator ? For exemple with afterburner ?


Nope.
Quote:


> Originally Posted by *ColeriaX*
> 
> Pretty nice looking. Lets see some Ares III solo results


From what I can tell so far, power delivery on this card is awesome. It's so much more stable and precise than on the 295X2.
You can use Trixx and run +200mV no problem. That's something you can't do with a 295X2 cuz of the "blackscreen" issue.
Using Asus GPU Tweak you can do +150mV; Afterburner, only +100mV.
I'm looking forward to 1200+MHz on this card.

I still need to change the thermal pads on the card and re-paste it with Grizzly Kryonaut, cuz the OEM paste and pads are ****...


----------



## Dagamus NM

What magic thing happens at 1200+MHz? If you keep gaining FPS or synthetic bench points then it makes sense, but my money is on you having better performance at a mark below that value.

While the Ares III should volt and clock higher than the 295x2, there is only so much to get out of these Hawaii cores.

I found my synthetic scores peaked at a 30MHz overclock. It ran just fine at higher clocks without artifacts but min FPS, max FPS, and overall FPS all dropped at the higher clocks.

Please prove me wrong though.


----------



## Medusa666

The 92mm fan on my 295X2 shroud has started to make some weird rattling noise during heavier loads, in idle or low it goes away.

Is this reason enough for RMA or?


----------



## ColeriaX

Synthetic scores for my 295x2s only increased with mhz and volts. Weird.


----------



## Mega Man

Quote:


> Originally Posted by *Medusa666*
> 
> The 92mm fan on my 295X2 shroud has started to make some weird rattling noise during heavier loads, in idle or low it goes away.
> 
> Is this reason enough for RMA or?


You will have to ask the manufacturer. I would get a video of it prior to asking, however (it is not necessary, but proof makes it easier imo).


----------



## fat4l

Well, what amazes me the most is that 3 memory modules are completely without any kind of cooling. They don't have thermal pads and they have no contact with the backplate. It's cuz of that Ares cutout; still, they could have cooled them partially...
Also, the paste was spread horribly. Didn't rly expect that from Asus and the Ares series....


----------



## fat4l

Some screens of the naked Ares 3.

You can see the pads missing for 3 modules cuz of the Ares cut-out.


----------



## F4ze0ne

Hi. Has anyone replaced the stock fan on the 295x2 rad?

My fan is rattling from being mounted horizontal and I want to replace it with an FDB fan so it's quieter.

I have a 4-pin PWM fan here, but I just realized that it won't work.









Do 3-pin fans work properly on this even though the stock fan is 2-pin?


----------



## ColeriaX

Got a new toy in the mail to help cool these beasts down


----------



## Medusa666

Quote:


> Originally Posted by *F4ze0ne*
> 
> Hi. Has anyone replaced the stock fan on the 295x2 rad?
> 
> My fan is rattling from being mounted horizontal and I want to replace it with an FDB fan so it's quieter.
> 
> I have a 4-pin PWM fan here, but I just realized that it won't work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do 3-pin fans work properly on this even though the stock fan is 2-pin?


Yes it does.


----------



## Mega Man

Quote:


> Originally Posted by *ColeriaX*
> 
> Got a new toy in the mail to help cool these beasts down


Congrats. Debating between this or 3 Monsta 480s per section for my TX10. Thinking about going with 480s, but I'll need like 15-20 of them; thankfully this build will not happen for a year or so.
Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F4ze0ne*
> 
> Hi. Has anyone replaced the stock fan on the 295x2 rad?
> 
> My fan is rattling from being mounted horizontal and I want to replace it with an FDB fan so it's quieter.
> 
> I have a 4-pin PWM fan here, but I just realized that it won't work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do 3-pin fans work properly on this even though the stock fan is 2-pin?
> 
> 
> 
> Yes it does.
Click to expand...

Fans are really simple:
2-pin = 12V and ground
3-pin (generally) = 12V, ground, RPM
4-pin = 12V, ground, RPM, PWM signal

(This is of course for PC fans, not other fans.)
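That pin rundown can be restated in a tiny sketch (the `PINOUTS` table and helper below are my own illustration, just encoding the post): as long as a header supplies 12V and ground, any fan spins; extra leads simply go unused, which is why a 3-pin fan works on the card's 2-pin header.

```python
# Pinout summary for standard PC fan connectors, as described above.
PINOUTS = {
    2: ("12V", "GND"),                # power only; speed set by voltage
    3: ("12V", "GND", "RPM"),         # adds a tach wire for RPM readout
    4: ("12V", "GND", "RPM", "PWM"),  # adds a PWM speed-control wire
}

def features_lost(fan_pins: int, header_pins: int) -> set:
    """Pins the fan has that the header can't use. The fan still spins,
    since every header type supplies 12V and GND."""
    return set(PINOUTS[fan_pins]) - set(PINOUTS[header_pins])

# A 3-pin fan on a 2-pin header: spins fine, RPM readout is lost.
print(features_lost(3, 2))  # {'RPM'}
# A 4-pin fan on a 3-pin voltage-controlled header: no PWM control.
print(features_lost(4, 3))  # {'PWM'}
```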


----------



## wermad

Current rad setup:
6x Monsta 480s
2x Monsta 560s


----------



## ColeriaX

You certainly are mad heh. I need a bigger case apparently!


----------



## wermad

It's uber overkill, but it can be done with this monster case: TX10-D + pedestal.


----------



## Mega Man

You know, wermad, I know the case is big for 2 PCs. But think about using it for 1. It is like having a 2-story 54' semi...


----------



## F4ze0ne

Quote:


> Originally Posted by *Medusa666*
> 
> Yes it does.


Quote:


> Originally Posted by *Mega Man*
> 
> fans are really simple 2 pin = 12v and ground
> 3 pin ( generally ) 12v, ground, rpm
> 4pin fan connector 12v, ground, rpm, pwm signal
> 
> ( this is of course PC fans and not other fans


Thanks guys.


----------



## wermad

^^^

PWM can be wired differently from product to product, so double check your manual as colors for the speed and pulse lines can be confusing.


----------



## F4ze0ne

All the wires are black and it came with no manual.









Here is the fan I bought.

http://www.newegg.com/Product/Product.aspx?Item=N82E16835352022


----------



## ColeriaX

Finally cracked the 50k GPU barrier in FS! Not sure if there's anything left for this rig to do. Too bad we don't have access to some DX12 titles; I'd love to see what our 295s can do with proper drivers and optimization.


----------



## MIGhunter

Quote:


> Originally Posted by *ColeriaX*
> 
> Finally cracked the 50k GPU barrier in FS! Not sure if theres anything left for this rig to do
> 
> 
> 
> 
> 
> 
> 
> . Too bad we dont have access to some dx12 titles, id love to see what our 295s can do with proper drivers and optimization.
> 
> 
> Spoiler: Warning: Spoiler!


Are you running 4 295x2s or is that reading 2 295x2s as 4 GPUs?


----------



## ColeriaX

Quote:


> Originally Posted by *MIGhunter*
> 
> Are you running 4 295x2s or is that reading 2 295x2s as 4 GPUs?


It reads 4x GPUs for 2x 295x2. I wish I could have four, though. However, that would require another visit from the electrician and another PSU. On that note, I just consolidated my 2 PSUs, a 1300 G2 and an AX 850, into one EVGA 1600 T2. Here's a GPU pic for reference.



Edit: Spell check


----------



## MIGhunter

Quote:


> Originally Posted by *ColeriaX*
> 
> It reads 4x GPUs for 2x 295x2. I wish i could have four though
> 
> 
> 
> 
> 
> 
> 
> . However that would require another visit from the electrician and another PSU. On that note i just consolidated my 2 PSUs, 1300 G2 and AX 850 into one EVGA 1600 T2. Heres a GPU pic for reference.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Edit: Spell check


You think you need 1600watts? I am working on my new build and debating on what PSU to get. I was thinking of the 1200 watt version of your PSU. Where is your build?


----------



## ColeriaX

Quote:


> Originally Posted by *MIGhunter*
> 
> You think you need 1600watts? I am working on my new build and debating on what PSU to get. I was thinking of the 1200 watt version of your PSU. Where is your build?


Sorry, didn't realize my rig wasn't in my sig. Updated for your viewing pleasure. BTW, with just the SuperNOVA G2 1300W I was getting constant shutdowns along with breaker trips (not always at the same time). Had a 20 Amp dedicated circuit installed to combat the breaker problem, and just recently switched from the 2-PSU combo to the 1600W.







.

Edit: Here's a link to a review of 295x2 quadfire done by HardOCP. They estimated roughly 1400 watts were necessary and ended up using 2 PSUs.

http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review/13#.VgUSh_lVhBc


----------



## MIGhunter

Quote:


> Originally Posted by *ColeriaX*
> 
> Sorry, didn't realize my rig wasn't in my sig. Updated for your viewing pleasure. BTW with just the SuperNOVA G2 1300W I was getting shutdowns constantly along w/ breaker failures (not synonymous with one another). Had 20 Amp dedicated circuit installed to combat the breaker problem and just recently switched from the 2 PSU combo to the 1600w
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Edit: Here's a link to a review for 295x2 Quadfire done by HardOCP. They estimated roughly 1400 Watts were necessary and ended up using 2 PSU's.
> 
> http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review/13#.VgUSh_lVhBc


Thanks, good to know.

Here is what I have:
i7 5820k
ASRock Fatal1ty X99X Killer
Fractal Design Define S with Window Side Panel
SAMSUNG 850 EVO 2.5" 500GB
SAPPHIRE Radeon R9 295x2

I still need the memory, PSU and watercooling. I'm really debating on adding a 2nd 295x2, which also makes me wish I'd gone with your processor for the 40 lanes instead of the 28 I have on mine.


----------



## ColeriaX

Yeah, if I had to do it over again I probably wouldn't have gotten another; little to no benefit for the games that I play. It does look nice though, so there's always that.


----------



## Dagamus NM

Yes, you want 1600W for two. While there is not a lot to be gained from adding the second card in games, OpenCL applications will get demolished.

That said, I did not enjoy having two on air. My loop is all set up other than needing to stick another pump in the mix. I have two 420mm copper Aquacomputer rads linked side to side; one has the pump/res integrated. 12 Cougar 140mm fans in push/pull. Both GPUs, CPU, RIVE blocks, and two memory blocks on the loop. The only part that seems to even get warm is the fitting coming out of the terminal block after the second GPU. Before, I couldn't even think of overclocking; heck, I dealt with thermal throttling constantly. Now mine are quite happy. Stable power and stable temperature. I can actually just use it for what I built it for. If I get bored, I go tweak one of the other computers.


----------



## Ironjer

I got [email protected] and [email protected] with a Corsair AX1500i working well; the max consumption I saw was 1400 watts.

You can test the whole system with AIDA64 (GPU and CPU).


----------



## wermad

V1000 is a great choice for a single-card setup.

The most I pulled was an average of 1200W at the wall running 3DMark 11 all stock (current sig rig). I know Metro 2033 should pull more, as it killed my last G1600. Hawaii is known to suck a lot more wattage once you turn up the dials. A 20 amp breaker is highly recommended for a dual-card setup, as 15 amps will easily trip.


----------



## Alex132

Using a V1000 here. 10/10, would recommend. Doesn't even get hot or ramp up the fan when pushing the card to the edge: +200mV, 1280/1500, with a 1.52V 5GHz 2500K.


----------



## wermad

Dat SB







, still impressive







.


----------



## ColeriaX

Quote:


> Originally Posted by *Alex132*
> 
> Using V1000 here. 10/10 would recommend. Doesn't even get hot or ramp up the fan when pushing the card to the edge +200mv 1280/1500 w/1.52v 5ghz 2500k.


What program you using to push volts?


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Dat SB
> 
> 
> 
> 
> 
> 
> 
> , still impressive
> 
> 
> 
> 
> 
> 
> 
> .


It's actually rather bad for a 2500K. TheBlademaster01 has a 2600K which could hit like 5.4GHz at 1.48V or something.

Mine has just worn down. I used to do 1.4875V for 5GHz back when I got the chip, but around 4 years later at that voltage and clock constantly, I need 1.52V to hit the same 5GHz.









Quote:


> Originally Posted by *ColeriaX*
> 
> What program you using to push volts?


Sapphire Trixx. I found it was better than MSI AB.


----------



## N0o0B

Hello,

I just set up a quadfire 295x2 build







, I will be making an official join-the-club post with photos as soon as get a few things sorted out.

I am having trouble trying to get the cards to bench a high score with Unigine Valley (~4600 is the max I could get with quadfire), which I know is low.

Link: http://www.bit-tech.net/hardware/graphics/2014/04/08/amd-radeon-r9-295x2-review/8

In a nutshell:

quadfire 295x2 @ 1060/1475
i5 6600k @ 4.6
32gb DDR4 @ 2400
win 8.1 x64
latest beta drivers

GPU-Z: all but one GPU shows 0% load (it says CrossFire is enabled, though...)
Afterburner: load spread across the cards, but no fan speed

Input, anyone?


----------



## Mega Man

There is stuff you have to do with Valley to get CFX to work (IIRC you have to make a CFX profile to force it).


----------



## N0o0B

OK, I now have CF working. I still get the same score, though?

I am not sure what it is, but I don't think it's hardware-related. These cards are insane; I just ran Skyrim @ 4K and had 230+ fps indoors (vanilla).

What's up with Valley!?


----------



## wermad

Running through an oldie but a goodie, Lost Planet 2. Had to disable xfire in DX11, otherwise it crashes. DX9 is pretty meh and won't run as smooth on my CPU as DX11. Still rocking stock DC. Starting to open the case up for a bit of cleaning. Tossing around the idea of tapping my rads to M4.


----------



## Mega Man

You regretted it last time, don't forget!

That game is on my list to do.


----------



## wermad

Two of the rads were tapped to 6-32, and I was planning to buy more and better screws, but the eBay seller went MIA. The alpha rads were pretty good, but the EK ones were the bad ones. Might have better luck with the slightly more flexible SP120s.


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> Sapphire Trixx. I found it was better than MSI AB.


How did you go about getting the voltage slider to work in Trixx with your XFX 295x2? I downloaded Trixx and I get nothing but power, core, and memory.


----------



## Alex132

Quote:


> Originally Posted by *Dagamus NM*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Sapphire Trixx. I found it was better than MSI AB.
> 
> 
> 
> How did you go about getting the voltage slider to work in trixx with your xfx 295x2? I downloaded trixx and I get nothing but power, core, and memory.
Click to expand...

Didn't do anything, just opened it and it works lol


----------



## Dagamus NM

Mine does not look like that. I see it is 4.9. Let me see what I have.

OK, so for the record, version 5.0 is lame. 4.9.1 starts at 1000mV and goes up to 1300.

What is the base voltage on these cards anyhow?


----------



## N0o0B

*EDIT: GOD I SUCK AT PHOTOGRAPHY!!*

Ok, here's the official request!

I was going for a covert quadfire build, so I went with the NZXT s340 and crammed everything in there, in a nice loving way!

Part list:

PCPartPicker part list / Price breakdown by merchant

*CPU:* Intel Core i5-6600K 3.5GHz Quad-Core Processor ($248.95 @ Amazon)
*CPU Cooler:* Deepcool CAPTAIN 240 91.1 CFM Liquid CPU Cooler ($109.99 @ Newegg)
*Motherboard:* MSI Z170A GAMING M7 ATX LGA1151 Motherboard ($212.98 @ Newegg)
*Memory:* Team Dark 16GB (2 x 8GB) DDR4-2666 Memory ($120.99 @ Newegg)
*Memory:* Team Dark 16GB (2 x 8GB) DDR4-2666 Memory ($120.99 @ Newegg)
*Storage:* Samsung 850 EVO-Series 500GB 2.5" Solid State Drive ($172.79 @ Amazon)
*Video Card:* Asus Radeon R9 295X2 8GB Video Card (2-Way CrossFire) ($951.76 @ Amazon)
*Video Card:* Asus Radeon R9 295X2 8GB Video Card (2-Way CrossFire) ($951.76 @ Amazon)
*Case:* NZXT S340 (Black) ATX Mid Tower Case ($68.99 @ SuperBiiz)
*Power Supply:* LEPA G Series 1600W 80+ Gold Certified Fully-Modular ATX Power Supply ($214.99 @ Newegg)
*Monitor:* Samsung U28D590D 60Hz 28.0" Monitor ($549.99 @ Best Buy)
*Total:* $3724.18
_Prices include shipping, taxes, and discounts when available_
_Generated by PCPartPicker 2015-09-29 06:06 EDT-0400_

Pics:


----------



## ColeriaX

Hey, welcome to the club. Go flash a custom BIOS, overclock, and post your results here. More and more people with quadfire!


----------



## Ironjer

Now I'm running my quadfire 295s with a FreeSync 4K Samsung. Running beautifully with Crysis 3 and BF4 locked at 60fps, but the stock cooler sucks: with all the cores at 100% it goes up to 74C. I will get push/pull Corsair SP120s; I need to improve those temps.


----------



## Dagamus NM

Quote:


> Originally Posted by *ColeriaX*
> 
> Hey, welcome to the club. Go flash some custom bios, overclock and post your results here. More and more people with Quadfire


What BIOS are you running on your XFX? Your Heaven bench scores are awesome with these cards. Mine just run out of gas, it seems. I put my voltage at 1250mV and got stuck in a flashing-screen loop that reminded me of old NES games. Managed to reset everything, so I am trying to sort out where to go next.

Is the stock voltage 1.000V? That is where Trixx starts. AB just had the +/- scale and it ran +100mV no problem. I honestly have not had time to run many voltages, but I really just need to know if +100mV in Afterburner is equal to +100mV in Trixx. Do they both share the same reference point? Everything else is identical, so I assume the same holds for voltage.
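If Trixx really does show an absolute value starting from a 1000mV stock baseline while Afterburner shows an offset, the two scales differ only by that baseline. A sketch under that assumption (I can't confirm both tools use the same reference point on every card/BIOS, which is exactly the open question here):

```python
STOCK_MV = 1000  # stock core voltage assumed from the discussion above

def ab_offset_to_trixx(offset_mv: int) -> int:
    """Convert an Afterburner-style +offset into a Trixx-style absolute mV."""
    return STOCK_MV + offset_mv

def trixx_to_ab_offset(absolute_mv: int) -> int:
    """The reverse conversion: absolute mV back to an offset."""
    return absolute_mv - STOCK_MV

print(ab_offset_to_trixx(100))   # 1100 -> +100mV in AB ~ 1100mV in Trixx
print(trixx_to_ab_offset(1250))  # 250  -> a 1250mV run is roughly +250mV
```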


----------



## ColeriaX

Quote:


> Originally Posted by *Ironjer*
> 
> Now i'm running my quad fire 295 with FreeSync 4K Samsung running beautiful with Crysis 3 and BF4 60fps locked but the stock cooler sucks when all the core at 100% up to 74c i will get push pull SP120 corsair i need to improve that temps


Before I went custom loop, 2 SP120s in push/pull mounted in the front with barbs on the bottom never throttled.


----------



## ColeriaX

Quote:


> Originally Posted by *Dagamus NM*
> 
> What bios are you running on your xfx? Your heaven bench scores are awesome with these cards. Mine just run out of gas it seems. I put my voltage at 1250mV and got stuck in a flashing screen loop that reminded me of old NES games. Managed to reset everything, so I am trying to sort where to go next.
> 
> Is the stock voltage 1.000V? This is where trixx starts at. AB just had the +/- scale and it ran +100mV no problem. I honestly have not time to run many voltages but I really just need to know if +100mV in afterburner is equal to +100mV in trixx. Do they both share the same reference point. Everything else is identical so O assume the same holds for voltage.


Sapphire custom BIOS on both of mine. Stock voltage is 1000. With one card I could push voltage and clocks near 1250, but in quadfire I can barely give it any extra volts/clock speed. Oh well. BTW, I'd tell you to give up Afterburner for these cards, but that's just a preference of mine to use Trixx; it seems to regulate voltage better. Also, I've found higher memory clocks won't always give you higher scores. My game clocks when playing are much lower than the bench clocks. Any other questions, feel free to ask. Sorry about any spelling or grammar mistakes; on my phone.


----------



## MIGhunter

Quote:


> Originally Posted by *N0o0B*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *EDIT: GOD I SUCK AT PHOTOGRAPHY!!**
> 
> Ok, here's the official request!
> 
> I was going for a covert quadfire build, so I went with the NZXT s340 and crammed everything in there, in a nice loving way!
> 
> Part list:
> 
> PCPartPicker part list / Price breakdown by merchant
> 
> *CPU:* Intel Core i5-6600K 3.5GHz Quad-Core Processor ($248.95 @ Amazon)
> *CPU Cooler:* Deepcool CAPTAIN 240 91.1 CFM Liquid CPU Cooler ($109.99 @ Newegg)
> *Motherboard:* MSI Z170A GAMING M7 ATX LGA1151 Motherboard ($212.98 @ Newegg)
> *Memory:* Team Dark 16GB (2 x 8GB) DDR4-2666 Memory ($120.99 @ Newegg)
> *Memory:* Team Dark 16GB (2 x 8GB) DDR4-2666 Memory ($120.99 @ Newegg)
> *Storage:* Samsung 850 EVO-Series 500GB 2.5" Solid State Drive ($172.79 @ Amazon)
> *Video Card:* Asus Radeon R9 295X2 8GB Video Card (2-Way CrossFire) ($951.76 @ Amazon)
> *Video Card:* Asus Radeon R9 295X2 8GB Video Card (2-Way CrossFire) ($951.76 @ Amazon)
> *Case:* NZXT S340 (Black) ATX Mid Tower Case ($68.99 @ SuperBiiz)
> *Power Supply:* LEPA G Series 1600W 80+ Gold Certified Fully-Modular ATX Power Supply ($214.99 @ Newegg)
> *Monitor:* Samsung U28D590D 60Hz 28.0" Monitor ($549.99 @ Best Buy)
> *Total:* $3724.18
> _Prices include shipping, taxes, and discounts when available_
> _Generated by PCPartPicker 2015-09-29 06:06 EDT-0400_
> 
> Pics:


How are you liking that Team Dark memory and that LEPA PSU? I'm finishing up the parts on my build and the last 2 things I need to add are memory and the PSU. I haven't decided on which I want to buy. I haven't heard of either brand before.


----------



## Mega Man

LEPA is a good brand; think of it as the high-end brand of Enermax, because it is. The PSU was epic back in the day, but TBH you would be better off with an EVGA G2/P2 or a Leadex.


----------



## MIGhunter

Quote:


> Originally Posted by *Mega Man*
> 
> Lepa is a good brand think of it as the high end brand of enermax. Cause it is. The psu was epic back in the day but tbh you would be better off with a evga g2/p2 or leadex


Thanks. While we are at it, since you seem to be knowledgeable: http://www.overclock.net/t/1566455/new-build-which-way-to-go/10#post_24462091

[Edit]-Also, do you think Eco mode is worth the extra $50?


----------



## Mega Man

Personal preference.

How silent do you want your rig to be?


----------



## wdpir32k3

Quote:


> Originally Posted by *N0o0B*
> 
> *EDIT: GOD I SUCK AT PHOTOGRAPHY!!**
> 
> Ok, here's the official request!
> 
> I was going for a covert quadfire build, so I went with the NZXT s340 and crammed everything in there, in a nice loving way!
> 
> Part list:
> 
> PCPartPicker part list / Price breakdown by merchant
> 
> *CPU:* Intel Core i5-6600K 3.5GHz Quad-Core Processor ($248.95 @ Amazon)
> *CPU Cooler:* Deepcool CAPTAIN 240 91.1 CFM Liquid CPU Cooler ($109.99 @ Newegg)
> *Motherboard:* MSI Z170A GAMING M7 ATX LGA1151 Motherboard ($212.98 @ Newegg)
> *Memory:* Team Dark 16GB (2 x 8GB) DDR4-2666 Memory ($120.99 @ Newegg)
> *Memory:* Team Dark 16GB (2 x 8GB) DDR4-2666 Memory ($120.99 @ Newegg)
> *Storage:* Samsung 850 EVO-Series 500GB 2.5" Solid State Drive ($172.79 @ Amazon)
> *Video Card:* Asus Radeon R9 295X2 8GB Video Card (2-Way CrossFire) ($951.76 @ Amazon)
> *Video Card:* Asus Radeon R9 295X2 8GB Video Card (2-Way CrossFire) ($951.76 @ Amazon)
> *Case:* NZXT S340 (Black) ATX Mid Tower Case ($68.99 @ SuperBiiz)
> *Power Supply:* LEPA G Series 1600W 80+ Gold Certified Fully-Modular ATX Power Supply ($214.99 @ Newegg)
> *Monitor:* Samsung U28D590D 60Hz 28.0" Monitor ($549.99 @ Best Buy)
> *Total:* $3724.18
> _Prices include shipping, taxes, and discounts when available_
> _Generated by PCPartPicker 2015-09-29 06:06 EDT-0400_
> 
> Pics:


God, man, I love your PC. Could you make a video? I really want to see those video cards going to town.


----------



## Dagamus NM

Quote:


> Originally Posted by *ColeriaX*
> 
> Sapphire custom bios on both of mine. Stock voltage is 1000. With one card i could push voltage and clocks near 1250 but in quadfire i can barely give it any extra volts/clock speed. Oh well. Btw id tell you to give up afterburner for these cards buts thats just a peeference of mine to hse trixx. Seems to regulate voltage better. Also higher memory clocks wont always give you higher scores ive found. My game clocks when playing are much less than the bench clocks. Any other questions feel free to ask. Sorry about any spelling or grammar mistakes on phone.


Maybe the custom BIOS is the trick. I ran at 1250mV this morning, bumped up my cores by 10MHz, and saw a decrease in score.

Mind sharing that bios?
Quote:


> Originally Posted by *MIGhunter*
> 
> [Edit]-Also, do you think Eco mode is worth the extra $50?


I don't think it is.


----------



## ColeriaX

Quote:


> Originally Posted by *Dagamus NM*
> 
> Maybe the custom bios is the trick, I ran at 1250mV this morning. Bumped up my cores by 10MHz and saw a decrease in score.
> 
> Mind sharing that bios?
> I don't think it is.


Here's the link:

https://www.techpowerup.com/vgabios/157762/Sapphire.R9295X2.4096.140414.rom


----------



## Dagamus NM

Quote:


> Originally Posted by *ColeriaX*
> 
> Heres the link
> 
> https://www.techpowerup.com/vgabios/157762/Sapphire.R9295X2.4096.140414.rom


Thank you, sir. I will have to wait to install it. CenturyLink suspended my service for a bill that is due October 10th. WTH??


----------



## ColeriaX

Quote:


> Originally Posted by *Dagamus NM*
> 
> Thank you sir. I will have to wait to install it. Centurylink suspended my service for a bill that is due October 10th. WTH??


Tether your phone? It's a rather small file.


----------



## Dagamus NM

X79 doesn't have built-in WiFi. My X99 builds do, but my 295x2's are not on them. My USB WiFi adapters are at work.

I am tethered to my phone via my iPad now. Thanks though.


----------



## MIGhunter

Quote:


> Originally Posted by *ColeriaX*
> 
> Heres the link
> 
> https://www.techpowerup.com/vgabios/157762/Sapphire.R9295X2.4096.140414.rom


What's the trick to updating the BIOS? Is it worth it? I honestly haven't tried to OC mine yet. It's sitting in my computer as an interim until I build my new system next month.


----------



## N0o0B

Quote:


> Originally Posted by *ColeriaX*
> 
> Hey, welcome to the club. Go flash some custom bios, overclock and post your results here. More and more people with Quadfire


Custom bios, you say....ha? *scratches head*








Quote:


> Originally Posted by *MIGhunter*
> 
> How are you liking that Team Dark memory and that LEPA PSU? I'm finishing up the parts on my build and the last 2 things I need to add are memory and the PSU. I haven't decided on which I want to buy. I haven't heard of either brand before.


The PSU is great, but it's a bit tricky to set up, as you have to use 2 rails for each card. I also like the ram's performance.
Quote:


> Originally Posted by *wdpir32k3*
> 
> God man I love your pc could you make a video I really want to see those videos cards going to town


Thanks! it'll probably be a while before I do any videos though


----------



## N0o0B

Double post


----------



## fat4l

OK, finally got to testing my Ares III.
I can tell you right away that the AQ waterblock on my 295X2 is much better than this custom EK block.
Temp-wise, at stock, temps are about ~4C higher than on my 295X2.
At +200mV I'm getting 57C; at stock, about 41C, in Unigine Heaven 4.0, 1440p maxed out.
What I'm more worried about is VRM temps...
With +200mV I was getting 97C on VRM#1 of GPU2. VRM#2 of GPU2 was @72C.
VRMs of GPU1 were @81C and @56C.

I'm using Phobya XT thermal pads + Thermal Grizzly Kryonaut.


----------



## wermad

I believe the A3 has extra components that lend to more heat. Even with a custom block you're still contending with two cores and a bit more heat vs a reference card. VRMs can take that amount of heat, and some can go as high as 120°C. My old 480s @ max (unmodded BIOS) stable clocks had VRMs kissing 100C frequently. These bastards ran ~60-65C each while benching.

I don't personally use it, but a lot of ppl swear by Fujipoly as making a big difference if you're willing to drop the coin for it.


----------



## bobbavet

Gday Guys.

Reporting in. My 295x2 is still running sweet, and I have now matched it up to one of those Philips 40" 4K monitors.

Gotta say, it's bloody brilliant. Can only fault the monitor with some ghosting of black text when you scroll and a "sleep mode" that can cut in on start-ups and be a pain in the butt at times.

Someone said they fixed this issue with a BIOS update of their GPU, so I'm wondering if there has ever been an official BIOS update from any of the vendors?

I have seen custom BIOSes for OC, but am looking for an official BIOS that may fix compatibility issues.

I averted a major disaster this morning. My pump was noisy and had lost some liquid. I couldn't see any signs of a leak, so I turned off and tilted my PC for a while, and sure enough I had pooling in the front floor.









On closer inspection I found this.



It appears the O-ring had blown or was distorted when tightening. I thought I had checked my build, but upon inspecting my build photos, it was there all the time.









The O-ring must have been fine initially, but finally gave way after 3 weeks. It did have crimping from not sitting in the O-ring seat properly.

So I drained and fixed, but had start fails and then no start.







Broke out the wife's hairdryer and did the board. Still no go.
Got a partial start and a crackle from the PSU. Then blow-dried the PSU.

All Good.







Phew....lols


----------



## MIGhunter

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> i believe the A3 has extra components that lend to more heat. Even with a custom block you're still contending with two cores and a bit more heat vs a reference cad. Vrm can take that amount of heat and some can goes as high as 120°C. My old 480s @ max (unmodded bios) stable clocks had vrm kissing 100c frquently. These bastards ran ~60-65c each while benching.
> 
> I don't personally use it, but a lot of ppl swear by fuji poly as making a big difference if you're willing to drop the coin for it.


Hmm, never heard of that TIM before. Is it really good? I'm working on my build now, and I was debating between using the TIM that comes with the EK waterblock vs. something else.
Quote:


> Originally Posted by *bobbavet*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Gday Guys.
> 
> Reporting in. My 295x2 is still running sweet and I have now matched it up too one of those Philips 40" 4k monitors.
> 
> Gotta say it bloody brilliant. Can only fault the monitor with some ghosting of black text when you scroll and a "sleep mode" that can cut in on start ups and be a pain in the butt at times.
> 
> Someone said they fixed this issue with a bios update of their GPU. So wondering if there ever has been an official bios update from any of the vendors?
> 
> I have seen custom bios for OC but am looking for an official bios that may fix compatibility issues.
> 
> I averted a major disaster this morning. My pump was noisy and had lost some liquid. Couldn't see any signs of a leak. I turned off an tilted my PC fo a while and sure enough I had pooling in the front floor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On closer inspection I found this.
> 
> 
> 
> It appears the Oring had blown or was distorted out when tightening. I thought I had checked my build, but upon inspecting my build photos it was there all the time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Oring must have been fine initially, but finally gave way after 3 weeks. It did have crimping from not sitting in the oring seat properly.
> 
> So I drained and fixed, but had start fails and then no start.
> 
> 
> 
> 
> 
> 
> 
> Broke out the wife hairdryer and did the board. Still no go.
> Got a partial start and a crackle from the PSU. Then blow dryried the PSU.
> 
> All Good.
> 
> 
> 
> 
> 
> 
> 
> Phew....lols


Close call. That's what scares me most about watercooling. Working on my 1st WC build now and I'm in stitches!


----------



## fat4l

Quote:


> Originally Posted by *MIGhunter*
> 
> hmm, never heard of that TIM before. Is it really good? I'm working on my build now and I was debating between using the TIM that comes with the EK waterblock vs something else.


Fujipoly makes thermal pads. They are the best you can get; however, they're not available in the EU/UK.
Maybe I will get them from the USA.
The best thermal paste, however, is Thermal Grizzly Kryonaut.


----------



## Mega Man

AFAIK it is Coollab Ultra.

The best thermal paste, I mean.


----------



## MIGhunter

Quote:


> Originally Posted by *fat4l*
> 
> Fujipoly is thermal pads. They are the best u can get. However not available in Eu/UK.
> Maybe I will get them from the USA.
> The best thermal paste however is thermal grizzly kryonaut.


Quote:


> Originally Posted by *Mega Man*
> 
> afaik it is Coollab Ultra
> 
> the best thermal paste i mean


Are they GPU-specific or good for CPUs as well? The TIM, that is.


----------



## Mega Man

You generally don't use pads on CPUs or GPUs; you generally use pastes.

You can use Coollab Ultra on CPUs, but IMO it isn't worth it.


----------



## wermad

No Gelid Extreme?


----------



## Ironjer

Hi guys, I need help over here. My dual 295X2s are getting hot quickly (74C) and throttling. I want to upgrade the fans to push/pull; my options are the SP120 High Performance or the SP120 Quiet Edition (remember, push/pull). I need to maintain temps below 74C at full load at stock clocks. Any experience?

Thanks!


----------



## axiumone

Quote:


> Originally Posted by *Ironjer*
> 
> Hi guys i need help over here. My Dual 295X2 is getting hot quickly 74c and throttle. I want upgrade the fan to push/pull my options SP120 High Performance or SP120 Quiet Edition remember push/pull? I need maintain those temps below 74 c full load at stock clock. Any experience?
> 
> thanks!


how are your rads positioned?


----------



## Ironjer

Quote:


> Originally Posted by *axiumone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ironjer*
> 
> Hi guys i need help over here. My Dual 295X2 is getting hot quickly 74c and throttle. I want upgrade the fan to push/pull my options SP120 High Performance or SP120 Quiet Edition remember push/pull? I need maintain those temps below 74 c full load at stock clock. Any experience?
> 
> thanks!
> 
> 
> 
> how are your rads positioned?
Click to expand...











Sent from my iPhone using Tapatalk


----------



## wermad

Both cards hitting 75C?

You might wanna set up the rads as intake (typically front to back).


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> afaik it is Coollab Ultra
> 
> the best thermal paste i mean


Yes, but I'm talking about non-conductive, easily spreadable, and removable paste. And yes, Thermal Grizzly Kryonaut is better than Gelid GC Extreme.


----------



## fat4l

OK, I will be buying Fujipoly, but just for the VRMs.
Can anyone experienced enough tell me exactly which ones are the VRMs, please? (They say it's using a 16-phase design.)

Here is a pic of the card; I marked exactly what was cooled by thermal pads. I just need to know which ones are VRM1 and VRM2 for both cores, please. Mark them for me, please.
Full img: http://postimg.org/image/paqsq9eqr/full/

Clean card pic: http://postimg.org/image/c5bak5kv7/full/

Thank you +++++++


----------



## Ironjer

Quote:


> Originally Posted by *wermad*
> 
> Both cards hitting 75c?
> 
> You might wanna setup the rads intake (typically front to back).


GPU 1 hits 70-71 max, but GPUs 2, 3 & 4 rise up to 74.


----------



## Ironjer

If I flip the second radiator, it will blow hot air onto the card.

I think my case is limited (cosmos 2)


----------



## axiumone

It's definitely rad placement. From experience, you can't have the rads for these cards as intake blowing directly on the cards. It's a recipe for overheating.

Push/pull isn't going to help in this config, as you'll just be blowing the same hot air over your cards.


----------



## Alex132

You will also want to do push/pull with different fans on the 295X2 radiators. It helps a lot with temps / noise.


----------



## Alex132

And do something like this;


----------



## N0o0B

I have two fans on each rad with the entire case running negative pressure.


----------



## Alex132

Quote:


> Originally Posted by *N0o0B*
> 
> I have two fans on each rad with the entire case running negative pressure.


My case has 0 intake fans. 800D lol. Well... apart from for the HDDs








I just never use the side-panel. I haven't put it on in like 2 years.

Only pics I really have of it at the moment:


----------



## Ironjer

Quote:


> Originally Posted by *Alex132*
> 
> And do something like this;


That is exactly my airflow configuration. I want to replace the original fans with 4 Corsair SP120s in push/pull.


----------



## Ironjer

Quote:


> Originally Posted by *axiumone*
> 
> It's definitely rad placement. From experience, you can't have the rads for these cards as intake blowing directly on the cards. It's a recipe for overheating.
> 
> Push/pull isn't going to help in this config, as you'll just be blowing the same hot air over your cards.


I'm not blowing hot air over the cards. Fresh air comes from 2 x 120mm lateral fans.


----------



## Ironjer

Remember, I have 2 x 120mm lateral fans as intake in the side panel.


----------



## Alex132

Quote:


> Originally Posted by *Ironjer*
> 
> that is exactly my airflow configuration. i want replace the original fans for 4 Corsair SP120 push/pull


I'd suggest (in order):

EK Vardar fans > Gentle Typhoon fans > Noiseblocker fans > Noctua fans > Cougar fans = beQuiet! fans > rest


----------



## Ironjer

Quote:


> Originally Posted by *Alex132*
> 
> I'd suggest (in order):
> 
> EK Vardar fans > Gentle Typhoon fans > Noiseblocker fans > Noctua fans > Cougar fans = beQuiet! fans > rest


I have no option for those; only the SP120 QE or SP120 PE.

I am guided by this


----------



## fat4l

Quote:


> Originally Posted by *Ironjer*
> 
> I have not option for it. Only SP120 QE or SP120 PE
> 
> i am guided by this


The SP120 is a good fan as well; better than Noctua.
I have the PE version.

Btw, where are you from that you can't get the EK Vardar? Send EK an email to see if they ship to your country.


----------



## Ironjer

Quote:


> Originally Posted by *fat4l*
> 
> SP120 is a good fan as well. Better than noctua
> I have the PE version.
> 
> Btw where are u from that u cant get EK Vardar? Send EK a email to see if they ship to your country.


Exactly. The Corsair I can find in a local store.


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> And do something like this;


Ironjer, the setup above is correct; however, you might consider adding another lateral fan at the level of the cards so that airflow is directed at the VRM fan. I ghetto-rigged a 200mm fan at that spot and it helped until I was able to set up my loop.


----------



## Ironjer

Already done (Cooler Master Cosmos 2): 2 lateral fans (Corsair quiet), and the setup above is my current setup.

I am playing at 4K; with all 4 GPUs at 100% load, it rises to 74°C quickly.


----------



## wermad

Can you place the rads where the lateral fans in the bottom chamber are? I know I've seen a Monsta 240 fit down there in some builds, so it's possible you can squeeze them in there. That would give very good separation and exhaust to prevent hot air blowing onto your cards. And it's typical for the card to heat up a bit.

Btw, if you haven't, try some different (and good) TIM.

----------



## fat4l

I would suggest push/pull and a repaste; use Gelid GC Extreme or Thermal Grizzly Kryonaut (Grizzly is better).


----------



## Ironjer

Quote:


> Originally Posted by *wermad*
> 
> Can you place the rads where the lateral fans in the bottom chamber are? i know I've seen a monsta 240 fit down there in some builds, so its possible you can squeeze them there. That would allow very good seperation and exhaust to prevent hot air blowing to your cards. And its typical for the card to heat up a bit.
> 
> Btw, if you haven't, try to some different (and good) tim.
> 
> edit:


Negative, my whole liquid cooling system is sealed. I can't re-route the tubes.


----------



## Mega Man

You should fix that problem


----------



## kayan

Quick question, folks. I RMA'd my 295, but they are insisting it isn't a problem with my card. The issue only shows up when both GPUs are in use. I've since replaced my wife's 290X with a GTX 960 (her PC is on 14 hours a day, and it ran hot), so I'm using the 290X as my backup.

My question: could I tri-fire the 290X with my 295 and have the 290X as my main card, so crossfire is disabled by default rather than permanently enabled like on my 295? If so, how would I accomplish this (290X in the first slot and 295 in the second, I assume)?

Also, what would be the best PSU to purchase for this setup, from the psu list in this thread?


----------



## SAFX

Here's what I did, steady temps of 65-67 after playing GTA V for an hour, the trick is to take in cool air, stream it above the 295x2, then out the case:


----------



## SAFX

Anyone on Windows 10? how's the driver situation?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Anyone on Windows 10? how's the driver situation?


Fantastic!


----------



## kayan

Quote:


> Originally Posted by *SAFX*
> 
> Anyone on Windows 10? how's the driver situation?


Drivers are rock solid and stable.


----------



## Mega Man

But I thought only nvidia could say that?


----------



## wermad

Quote:


> Originally Posted by *Ironjer*
> 
> Negative, all my liquid cooled system is sealed. i can't drive the tubes.


So the stock tubes don't reach that far? I'm sure these guys are pretty lengthy.
Quote:


> Originally Posted by *kayan*
> 
> Quick question folks. I rma'd my 295 but they are insisting it isn't a problem with my card. This issue only shows up when both GPUs are in use. I've since replaced my wife's 290x with a gtx960 (her PC is on for 14 hours a day, and it was hot), anyway I'm using the 290x as my backup.
> 
> My question is could I tri-fire the 290x with my 295 and have the 290x as my main card, so xfire is disabled by default, rather than perma enabled like my 295? If so, how would I accomplish this (290x in first slot and 295 in second, I assume)?
> 
> Also, what would be the best PSU to purchase for this setup, from the psu list in this thread?


You want ~250-300W for the 290X, but you won't have enough amps due to the hungry 295X2. Try it out w/ your V1000; I ran triple 290s (factory OC'd) on a V1000 and did have some shutting down. As for a new PSU: 1200-1350W with over 100 amps (i.e. G/P1300 et al). Tbh, you may wanna look for a unit that can run dual 295X2s, as it's tempting once you go tri-fire to get a second 295X2.
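As a rough sanity check on those amperage numbers (a sketch, not a PSU recommendation — the `rail_amps` helper and the assumption that the full rating is delivered on +12 V are mine):

```python
# Rough +12 V amperage estimate for PSU sizing (P = V * I, so I = P / V).
# Assumes the unit delivers essentially its full rating on the +12 V rail,
# which is typical for modern single-rail PSUs but not guaranteed.
def rail_amps(watts, rail_volts=12.0):
    """Current available on a rail for a given wattage."""
    return watts / rail_volts

for w in (1000, 1200, 1350):
    print(f"{w} W -> {rail_amps(w):.1f} A on +12 V")
```

A 1000 W unit's ~83 A would explain the shutdowns with three factory-OC'd 290s, while the 1200-1350 W class clears the 100 A mark with headroom.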


----------



## fat4l

Here are some pics of the Ares 3 vs the 295X2.
The 295X2 is a bit longer, but the Ares is much "bigger" overall.



















Now I'm waiting for Fujipoly Extreme Plus 14W/mK in 1.0mm and 0.5mm so I can properly cool the VRMs while running +200mV core and +100mV AUX.
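For anyone weighing a pad swap like this, the conductivity/thickness trade-off is just Fourier's law. A minimal sketch — the 1 cm² patch area is a made-up illustration, not the actual VRM footprint:

```python
# Conductive thermal resistance of a pad: R = t / (k * A), in K/W.
# k is conductivity (W/mK), t is thickness, A is contact area.
def pad_resistance(thickness_mm, k_w_per_mk, area_mm2=100.0):
    t = thickness_mm / 1000.0       # mm  -> m
    a = area_mm2 / 1_000_000.0      # mm^2 -> m^2
    return t / (k_w_per_mk * a)

pads = {
    "Phobya 7 W/mK, 1.0 mm":    pad_resistance(1.0, 7.0),
    "Fujipoly 14 W/mK, 1.0 mm": pad_resistance(1.0, 14.0),
    "Fujipoly 14 W/mK, 0.5 mm": pad_resistance(0.5, 14.0),
}
for name, r in pads.items():
    print(f"{name}: {r:.2f} K/W over 1 cm^2")
```

Doubling the conductivity halves the resistance, and using the thinner 0.5 mm pad where it fits halves it again, which is why pad swaps like this can move VRM temps by double digits.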


----------



## mojobear

Hey y'all, 4-GPU owners out there









Anyone have any luck with Mantle and 4-way/quadfire? My games (BF4 and Dragon Age: Inquisition) won't load at all with Mantle!









Thanks!


----------



## SAFX

Quote:


> Originally Posted by *kayan*
> 
> Drivers are rock solid and stable.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Fantastic!


I knew this would happen!! Now I have to build a new PC, after building a new one 3 months ago, OH THE TRAGEDY


----------



## wermad

I don't mantle.......


----------



## ColeriaX

Quote:


> Now I'm waiting for Fujipoly Extreme Plus 14W/mK, 1.0mm and Fujipoly Extreme Plus 14W/mK, 0,5mm so I can properly cool VRMs while running +200mV cores and +100mV AUX.


How are you modifying AUX voltage along with core volts? What does increasing AUX volts "do" in addition to core volts?
Could you screenshot your overclocking utility?

I've never been able to get Dragon Age to run on Mantle, and from what I saw it performed worse anyway, iirc. But it did always bug me that it didn't even work at all.


----------



## fat4l

Quote:


> Originally Posted by *ColeriaX*
> 
> How are you modifying aux voltage along with core volts? What does increasing aux volts "do" in addition to core volts?
> Could you screen shot your overclocking utility?
> 
> Ive never been able to get dragon age to run on mantle, and from what i saw it performed worse iirc anyways. But it did always bug me that it didnt even work at all.


You have to use Afterburner for AUX voltage + Trixx for core volts.
I will show you screens when I wake up. For now I can tell you AUX voltage is not available on the 295X2, only on the 290X. The Ares 3 consists of 2x 290X, so....








And AUX voltage helps with stability when clocking high....


----------



## SAFX

Does it support active VRM cooling?


----------



## fat4l

Quote:


> Originally Posted by *SAFX*
> 
> Does it support active VRM cooling?


Not sure what you mean.

Anyway...


http://postimg.org/image/hw1z4smar/full


----------



## SAFX

I meant something like this, an active mechanism for cooling the VRMs exposed on the backplate

http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-backplates/16363/aquacomputer-backplate-for-kryographics-hawaii-r9-290x/290-active-xcs


----------



## fat4l

Quote:


> Originally Posted by *SAFX*
> 
> I meant something like this, an active mechanism for cooling the VRMs exposed on the backplate
> 
> http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-backplates/16363/aquacomputer-backplate-for-kryographics-hawaii-r9-290x/290-active-xcs


Oh, I had it on my 295X2.
Not on the Ares 3 tho.


----------



## SAFX

What did you have before?


----------



## fat4l

Quote:


> Originally Posted by *SAFX*
> 
> What did you have before?


I had a 295X2 + water blocks.
Now I have the Ares 3.


----------



## MIGhunter

Quote:


> Originally Posted by *fat4l*
> 
> I had 295X2+water blocks.
> Now I have ares 3


What water blocks did you have? Any chance you have or had a 295X2 EKWB backplate, lol. I've been looking so hard for one and can't find it...


----------



## Mega Man

Ironically, I have one BNIB. I bought 2, but the PPCs website was new and it sent me all sorts of stuff I didn't order (like 3 295X2 backplates). I would need to locate it (the EK one), but PM me if you are interested.


----------



## MIGhunter

Quote:


> Originally Posted by *Mega Man*
> 
> ironically i have one BNIB i bought 2 but the ppcs website was new and it sent me all sorts of stuff i didnt order ( Like 3 295x2 backplates ) i would need to locate it ( EK ) but pm if you are interested


PMing you now


----------



## fat4l

OK, Fujipoly project complete...

*Results:*
(ΔT, Phobya XT-7W/mK vs Fujipoly Extreme Plus-14W/mK+Thermal Grizzly Kryonaut on pads as well)
*GPU1_vrms*
VRM1=Δ21C
VRM2=Δ7C

*GPU2_vrms*
VRM1=Δ34C
VRM2=Δ17C

AWESOME!

Will post screens soon


----------



## Wrecker66

I have temps around 65°C to 70°C after playing BF4 for around half an hour. Is that OK?

I ask because I mounted the rad from the graphics card onto the rad from the processor.


----------



## joeh4384

Quote:


> Originally Posted by *Wrecker66*
> 
> i have temps around 65C to 70C after playing BF4 around half an hour. Is that ok?
> 
> i ask becouse i mount the rad from graphic card onto rad from proc


Those temps seem pretty reasonable.


----------



## MIGhunter

Quote:


> Originally Posted by *Wrecker66*
> 
> i have temps around 65C to 70C after playing BF4 around half an hour. Is that ok?
> 
> i ask becouse i mount the rad from graphic card onto rad from proc


Um, are those fans both in push? I.e., are you pushing air from one rad into the other?


----------



## wermad

Quote:


> Originally Posted by *fat4l*
> 
> ok. fujipoly project complete...
> 
> *Results:*
> (ΔT, Phobya XT-7W/mK vs Fujipoly Extreme Plus-14W/mK+Thermal Grizzly Kryonaut on pads as well)
> *GPU1_vrms*
> VRM1=Δ21C
> VRM2=Δ7C
> 
> *GPU2_vrms*
> VRM1=Δ34C
> VRM2=Δ17C
> 
> AWESOME!
> 
> Will post screens soon


Sweet









Quote:


> Originally Posted by *Wrecker66*
> 
> i have temps around 65C to 70C after playing BF4 around half an hour. Is that ok?
> 
> i ask becouse i mount the rad from graphic card onto rad from proc
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *MIGhunter*
> 
> um, are those fans both in push? i.e. you are pushing one rad into the other?
Click to expand...

Can't tell tbh, but the real question is why he has the 295X2 rad sandwiched with the Enermax 240 AIO CPU cooler rad???









Edit... just realized you're wondering the same thing


----------



## Wrecker66

Because there is no space elsewhere. And yes, they are both pushing. I have no experience with radiators and AIOs. What can go wrong with this config?


----------



## wermad

Move your GPU rad to the top and set both rads to intake from the outside.


----------



## Wrecker66

There is a problem when I try to mount it on top: the rad hits the motherboard.


----------



## wermad

Your temps are pretty solid but time for a new case imho. There's a few itx cases out there that can fit this card with an extra bit of room. If you can't change the case, get a gtx 980 ti or fury x instead (same price range). Good luck


----------



## Wrecker66

I played BF4 for 2 hours and the max temp was 74°C. I think I will first try to swap the Enermax AIO for a Nepton 140XL.
I don't like that the radiator from the GPU is too hot when I touch it. The Enermax is not even close to that warm.


----------



## Alex132

Quote:


> Originally Posted by *Wrecker66*
> 
> i played BF4 for 2 hours and max temp was 74C. i think that i will try first to swamp enermax aio for nepton140xl.
> i don't like that when i touch the radiatior from gpu is too hot. the enermax is not even close that warm.


74°C is basically throttle temp, so monitor the core clock to see if it's throttling.
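One way to do that check is to log core clocks with GPU-Z or Afterburner and scan the log for dips. A hedged sketch — the sample log and the 95%-of-stock threshold are my own illustration; 1018 MHz is the card's rated "up to" clock:

```python
# Flag throttle events in a list of logged core-clock samples (MHz).
STOCK_MHZ = 1018  # R9 295X2 rated boost clock

def throttle_events(samples_mhz, stock=STOCK_MHZ, tolerance=0.95):
    """Return indices where the clock fell below 95% of stock under load."""
    floor = stock * tolerance
    return [i for i, mhz in enumerate(samples_mhz) if mhz < floor]

log = [1018, 1018, 989, 1018, 930, 1018]  # made-up samples under load
print(throttle_events(log))  # -> [4]
```

If the list comes back non-empty while the card sits at 74°C, it's throttling rather than just running warm.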


----------



## Wrecker66

I think that I will just swap coolers tomorrow and I will see what's going to happen.


----------



## wermad

a thick rad like the h80 will work good as well for your cpu (better if its delid) and its 120mm









Totally forgot to ask this: what are your CPU's temps doing (out of curiosity)???


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> a thick rad like the h80 ut60 will work good as well for your cpu (better if its delid) and its 120mm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Totally forgot to post this: what's your cpu's temps doing (out of curiousity)???


Fixed


----------



## wermad

Not on the BI bandwagon?


----------



## Mega Man

Bi? Nope, happily married with a kid


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> Bi? Nope happily married with a kid




BI = Black Ice radiators (specifically, the new nemesis).


----------



## Mega Man

Not used them yet. Maybe they will go into my TX.... I dunno


----------



## Wrecker66

Quote:


> Originally Posted by *wermad*
> 
> a thick rad like the h80 will work good as well for your cpu (better if its delid) and its 120mm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Totally forgot to post this: what's your cpu's temps doing (out of curiousity)???


Around 60°C under load, but at stock speeds. i7-3770K.


----------



## BootPirate

I've been having trouble with my audio and video. I think it may have something to do with the card. It seems to stutter, quite badly at times, and I have no idea what is causing it.
I have searched high and low for a solution online to no avail. I'm asking for your help; maybe together we can figure out what has been plaguing my system.
The problem is noticeable when I stream video or audio about 75% of the time. It is most noticeable when I play a video from my computer: about 95% of my video files play with that horrid stuttered slow-mo echo.
Games: this one is strange. Some games exhibit the same symptoms as the video files, some games are fine, and one game in particular is fine up to a point. Then I get severe audio/video lag, but not the traditional lag; it's just like the video-file stuttering but increased 10-fold. When I alt-tab out of and back into the game, it runs perfectly fine with no further interruptions.
It's worth mentioning that when I tried to capture this with Fraps, it only caught the video lag, absolutely no audio lag.

I made this video to show what I have described.



It even seems to be affecting my video editing software in a small way. Certain things in the application are not behaving as they should, such as when I change the opacity of a chosen clip, or if I motion-stabilize a shot it renders as black/blank. If three shots are stacked on top of one another, the topmost will not appear until export, and even then it will flicker. This can be seen at the end of the video.

Specs:
Windows 7 x64
Intel i7-4790 (3.6GHz)
RAM 16Gb (upgraded the ram today just in case it was that. Didn't help)
XFX - R9 295x2
Res - 4800 x 1200
Realtek High Definition Audio
PSU - 850w


----------



## rdr09

Quote:


> Originally Posted by *BootPirate*
> 
> I've been having trouble with my audio and video. I think it may have something to do with the card. It seems to stutter, quiet badly at times and I have no idea what is causing it.
> I have searched high and low for a solution online to no avail. I'm asking for you help, maybe together we can figure out what has been plaguing my system.
> The problem is noticeable when I stream video or audio 75% of the time. It is most notable when I play a video from my computer. About 95% of my video files would play with that horrid stuttered slow-mo echo.
> Games: this one is strange, some games seem to exhibit the same symptoms as when the video files are played, some games are fine, and one game in particular is fine up to a point. Then I receive sever audio/video lag, but not the tradition lag. It's just like the video file stuttering by increased 10 fold. When I alt-tab out and back into the game, the game runs perfectly fine with no further interruptions.
> It's worth mentioning that when I tried to capture this with fraps, it only caught the video lag, absolutely no audio lag.
> 
> I made this video to show what I have described.
> 
> 
> 
> It even seems to be effecting my video editing software in a small way. Certain things in the application are not behaving as they should. Such as when I change the opacity of a chosen clip, or if I motion stabilize a shot it renders as black/blank. If three shots are stacked on top of one another the top most will not appear until export, and even then it will flicker. This can be seen at the end of the video.
> 
> Specs:
> Windows 7 x64
> Intel i7-4790 (3.6GHz)
> RAM 8Gb
> XFX - R9 295x2
> Res - 4800 x 1200
> Realtek High Definition Audio
> PSU - 850w


Could be a conflict between Realtek and AMD HD Audio. Right-click your speaker icon > Playback devices and make sure Realtek is the only one selected.


----------



## BootPirate

Just checked. It's set to realtek.


----------



## rdr09

Quote:


> Originally Posted by *BootPirate*
> 
> Just checked. It's set to realtek.


Maybe you need to update the Realtek driver. Not sure. Or roll back.


----------



## StillClock1

Quote:


> Originally Posted by *blarty*
> 
> Anyone had any problems with the VRM fan? Playing the Witcher 2 with everything dialled up, after a couple of minutes I hear a clicking noise, come out of game and 15-20 seconds later it's gone, go into something like Valley or 3dMark and soon enough, there it is again.
> 
> It's not all the time while in game though so I don't believe its a wire or something else touching the fan, but it does only kick in under load, but even then it's not there 100% of the time.
> 
> I'm pretty sure it's the VRM fan, have considered it might be my bequiet! PSU as well, only because it was bought at the same time. Has anyone else encountered this issue, considering just popping the shroud and tightening the screws on the VRM fan and see if that makes a difference, but before I do that, thought I might ask if anyone else had any ideas to throw in.
> 
> Cheers


Blarty, did you finds a solution to the intermittent vrm fan clicking? I'm having the same issue


----------



## BootPirate

Quote:


> Originally Posted by *rdr09*
> 
> maybe you need to update realtek driver. not sure. or roll back,


Downloaded the new audio driver a minute ago, the problem persists. I'm thinking I'm going to have to reinstall windows, I really hope I don't have to RMA my card for some reason :/


----------



## rdr09

Quote:


> Originally Posted by *BootPirate*
> 
> Downloaded the new audio driver a minute ago, the problem persists. I'm thinking I'm going to have to reinstall windows, I really hope I don't have to RMA my card for some reason :/


Is the Realtek the motherboard's onboard sound? What happens if you disable onboard sound in the BIOS? I still think it is conflicting drivers.


----------



## BootPirate

Quote:


> Originally Posted by *rdr09*
> 
> is the realtek from the motherboard sound? what happens if you disable onbaord sound in bios? i still think it is conflicting drivers.


Yes, it's on board.
I'll try that now.


----------



## BootPirate

No audio now (obviously) and the video stutter is still present.


----------



## rdr09

Quote:


> Originally Posted by *BootPirate*
> 
> No audio now (obviously) and the video stutter it still present.


Not long ago I read of members with multiple GPUs and displays having issues like yours. Keep your onboard sound disabled and reinstall the driver. If you are using more than one monitor, test your system with a single display.

edit: you can seek help here too . . .

http://www.overclock.net/t/591413/official-amd-eyefinity-owners-club/3860


----------



## BootPirate

I'll give it a try.
Thanks for the link, I'll see what they have to say as well.


----------



## MIGhunter

Quote:


> Originally Posted by *BootPirate*
> 
> I'll give it a try.
> Thanks for the link, I'll see what they have to say as well.


Also, you might go into Device Manager. I had an issue with the Win10 updates and my keyboard: the updates installed the driver like 50 times and they all conflicted with each other. I manually deleted them all and then installed just 1 driver, and it fixed the issue. Might be worth a try for your issue?


----------



## Davron

I finally got both of my R9 295X2s in a single box. I'm using an Asus ROG Maximus VI Gene with 4 sticks of EVGA DDR3-2400 and a 4790K. My power supply is a Corsair AX1500i. For some reason, I can only boot my computer with one monitor attached. If I have a second, or heaven forbid all 5 monitors, the power button flashes on and the machine turns off immediately after spinning the fans for half a second. I then have to flip the switch to cut power to the power supply before it will start again, and then it will only start with a single monitor. After it has started booting, I can connect all the rest of the monitors, and while it may take a reboot in Windows to get everything working, it finally does work, though I have gotten a few odd driver crashes. Can anyone help me figure out what may be wrong?


----------



## z4StormS

I have an R9 295X2 and the Samsung 28" FreeSync display, I'm on Windows 10 with the 15.7.1 drivers, and FreeSync isn't working :-( The FreeSync demo will not let me turn FreeSync on. But it says FreeSync is on, and I turned FreeSync on in my display settings and in CCC. Still no FreeSync. I've also tried 3 different cables. No luck. If anyone knows a fix for this problem, please feel free to share.


----------



## fat4l

Btw guys, did you know that HIS iTurbo 1.6.6 allows even +400mV? That's a lot of volts lol


----------



## Alex132

Quote:


> Originally Posted by *fat4l*
> 
> Btw guys, do u know that HIS iTurbo 1.6.6 allows even +400mV ? that's a lot of volts lol


RIP 295X2


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> Btw guys, do u know that HIS iTurbo 1.6.6 allows even +400mV ? that's a lot of volts lol


Oh boy... my 295X2 starts to lose display signal around +250mV iirc... +400mV would just be like popcorn going off in your case


----------



## StillClock1

Hey Guys,

I'm not quite ready to spend on the watercooling part, but I'm doing some aesthetic improvements to my case.

Is it possible to put a backplate of sorts on top of the GPU without removing the stock cooler? The top looks a little messy.

I want to be wary of metal on metal and potentially causing a short.

Thanks


----------



## Mega Man

It is. You can get a backplate and new screws. I know people have done it with the EK one, but gl finding one.

With other brands I am sure you can too, as they all use the same holes.


----------



## StillClock1

Would this one work? http://www.frozencpu.com/products/24798/ex-blc-1763/EK_R9-295X2_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-295X2_Backplate_-_Black.html?tl=g57c599s2295&id=VCnvbNsT

I'm just concerned about those silver things near the GPU http://gadget-help.com/wp-content/uploads/2014/04/sm.back.800.jpg

Maybe I'm concerned for nothing (please correct me if I am wrong), but wouldn't putting this backplate on instead of the current one be exactly the same as what I would do if I were putting on a waterblock? I'd also add the thermal pads just to be safe.


----------



## Mega Man

1. FCPU is dead (fyi).
2. With most backplates (this one included) you remove that bracket and screw the block directly to the PCB. Since this one is supposed to be used with a water block, idk if the OEM cooler will work without the bracket, although I don't see why not.
3. The OEM screws are not the same thread as the EK block screws, so idk how hard they will be for you to find.


----------



## StillClock1

Very glad you told me that.

1. Good call on FCPU; I remember seeing stuff on that, but it just didn't stick.
2. If you have a moment, if you scroll to 2:50 of this tutorial 



, it looks like the OEM PCB should work without any more parts than just the EK backplate.
3. Does their Backplate not come with screws, I found a manual that implies that they do? (assuming I can find another seller of the EK Backplate) https://shop.ekwb.com/EK-IM/EK-IM-3831109869093.pdf

Holy crap you weren't kidding, I'm going to have to look very hard to find one.

Thanks for your time.


----------



## StillClock1

Have you heard of an XSPC backplate working? NCIX has some: http://m.ncix.com/products/sku/98433

I just wonder if some of those screws are occupied by part of the current shroud, probably not a lot of people do this half-way like I'm proposing.


----------



## wermad

Any 295X2 backplate will work with any block, as long as you make sure you get the right hardware. It just takes some figuring out on the screws you'll need to make it work. Ebay has screws for cheap.


----------



## xer0h0ur

Quote:


> Originally Posted by *wermad*
> 
> Any 295x2 backplate will work with any block albeit you make sure you get the right hardware. It just takes some figuring it on the screws you'll need to make it work. *Ebay has screws for cheap.*


( ͡° ʖ ͡°)


----------



## wermad

Mcmaster.com can get pricey for a dozen or so screws.


----------



## Roaches

I hope you're joking; I shop there often, and their fasteners are dirt cheap for the quantity per pack. It really depends on thread size and material/finish as to when prices start going up. They ship via UPS by default, which defeats the purpose of buying from them price-wise on small orders, unless you're buying multiple items, which makes it well worth it.

Also since most popular waterblocks are sourced from Europe, its not hard to find matching DIN/ISO thread screws for them.


----------



## wermad

Quote:


> Originally Posted by *Roaches*
> 
> I hope you're joking, I shop there often, their fasteners are dirt cheap for the quantity per pack; really depends on thread size and material/finish is when prices starts going up. They ship from UPS by default which defeat the purpose of buying from them price wise on small orders, unless you're buying multiple items which makes it well worth it.
> 
> Also since most popular waterblocks are sourced from Europe, its not hard to find matching DIN/ISO thread screws for them.


Uhm, no, I'm not. Spending $10 on a pack of 100 can get expensive when you only need 25-50 or some low quantity. Plus, they never show shipping until you get it (after being charged all in all). I was surprised by a few shipping charges once ($7 shipping on a $5 order, ?). I have a ton of spare screws from them because I initially just kept buying from them for small-quantity needs. I gave eBay a shot after folks recommended that over spending more @ McMaster-Carr. I love their site and they pretty much have everything, but it can get expensive for something that is usually inexpensive and low quantity.

I use eBay for small quantities (usually less than 100) and I end up paying a few dollars to US shippers. If you don't mind a wait, the Chinese sellers will sell you large quantities for a great price. I will admit, the McMaster screws have endured better than the eBay ones tbh (that's because I change hardware, or used to, quite often). When it came time to mate a backplate to a different make of block, eBay screws were cheap and readily available. There's a few Vegas sellers that sell at pretty good prices and ship fast (well, for me in SoCal).

For the blocks, they usually come with M3 or M2.5. DD (rip) used to use a very small size, which is surprising given how piggish their blocks were. Best to check the instructions for both, and then you can figure out how to mate plates with blocks of a different make. I've mated EVGA, EK, XSPC, and custom (DIY) plates with different blocks from EK, XSPC, Danger Den, Koolance, HK, AC, etc. It's not too challenging and you can make it work. For those looking for active cooling on a backplate w/ a block, you run into issues, but remember the plate will do very little cooling compared to the block; I would recommend it for air coolers though. The EVGA plates are notorious for this (as well as some stock OEM plates). EK plates tend to be very straightforward and usually use four or six outer-perimeter holes or screws on the blocks, with countersunk-head screws (M3 or M2.5). Again, best to find the instructions for both and that will give you an idea of how they can be paired (though having both on hand will help even further).


----------



## StillClock1

Quote:


> Originally Posted by *wermad*
> 
> Uhm, no, I'm not. Spending $10 on a pack of 100 can get expensive when you need only 25-50 or some low quantity. Plus, they never show shipping until after you get charged. I was surprised by a few shipping charges once ($7 shipping on a $5 order?). I have a ton of spare screws from them because I initially just kept buying from them for small-quantity needs. I gave eBay a shot after folks recommended that over spending more at McMaster-Carr. I love their site and they pretty much have everything, but it can get expensive for something that is usually inexpensive and low quantity.
> 
> I use eBay for small quantities (usually less than 100) and I end up paying a few dollars to US shippers. If you don't mind a wait, the Chinese sellers will sell you large quantities for a great price. I will admit, the McMaster screws have endured better than the eBay ones tbh (that's because I change hardware, or used to, quite often). When it came time to mate a backplate to a different-make block, eBay screws were cheap and readily available. There are a few Vegas sellers that sell at pretty good prices and ship fast (well, for me in SoCal).
> 
> For the blocks, they usually come with M3 or M2.5 screws. DD (RIP) used to use a very small size, which is surprising given how bulky their blocks were. Best to check the instructions for both; then you can figure out how to mate plates with different-make blocks. I've mated EVGA, EK, XSPC, and custom (DIY) plates with blocks from EK, XSPC, Danger Den, Koolance, HK, AC, etc. It's not too challenging and you can make it work. For those looking for active cooling on a backplate with a block, you run into issues, but remember the plate will do very little cooling compared to the block. I would recommend it for air coolers though. The EVGA plates are notorious for this (as well as some stock OEM plates). EK plates tend to be very straightforward and usually use four or six outer-perimeter holes or screws on the blocks. They usually use countersunk-head screws (M3 or M2.5). Again, best to find the instructions for both; that will give you an idea of how they can be paired (though having both on hand will help even further).


Thanks Wermad, not the first time you have helped me with this card.

Sounds like you're talking about mixing a backplate and a waterblock from two different manufacturers. I was just asking about putting a backplate on the stock cooler for aesthetics. Sounds like it should work, provided I find the right screws (which I would have thought come with the backplate, but I'll find out).

New question: my VRM fan sometimes makes a clicking noise. Any ideas how to fix that?


----------



## axiumone

I remember when I took my shroud apart, the cables that route to the VRM fan can come out of their housing a little bit from under the fan. Take a look and make sure that they are flush. If they aren't, you should be able to use a flathead screwdriver to push them back into the channel. Loose wires _may_ cause a clicking noise when the fan hits them.


----------



## lordstryker

Hey everyone. Forgive me if I'm violating any policies but I could really use some help.

I recently picked up an R9 295x2 on eBay for my new build PC. I was a little leery about eBay, but the seller had outstanding reviews over several years, so I took a chance.

I plugged it in and everything seems fine on bootup, except my PC isn't recognizing the card as a dual GPU. I got the latest Catalyst beta drivers, but when I go to the main menu under "Gaming", the AMD CrossFire option doesn't even appear. I got GPU-Z and, again, it only recognizes a single GPU.

I'm worried I got a bad card, but is it possible for one GPU die to be dead while the other runs fine with no problems at all? I gotta think there's some driver or BIOS setting causing the issue, but I'm stumped.

I have a new Gigabyte GA-Z170X-Gaming 3 motherboard. I've looked around the BIOS, but nothing jumps out at me that would cause CrossFire not to work.

Any ideas anyone could come up with would be greatly appreciated.


----------



## StillClock1

Quote:


> Originally Posted by *axiumone*
> 
> I remember when I took my shroud apart, the cables that route to the VRM fan can come out of their housing a little bit from under the fan. Take a look and make sure that they are flush. If they aren't, you should be able to use a flathead screwdriver to push them back into the channel. Loose wires _may_ cause a clicking noise when the fan hits them.


Axiumone - you're the man, I'll check that tonight. The picture is super helpful.


----------



## StillClock1

Quote:


> Originally Posted by *wermad*
> 
> Uhm, no, I'm not. Spending $10 on a pack of 100 can get expensive when you need only 25-50 or some low quantity. Plus, they never show shipping until you get (after being charged all in all). I was surprised by a few shipping charges once ($7 for a $5 order, ?). I have a ton of spare screws from them because I initially just kept buying from them for small quantity need. I gave ebay a shot after folks recommended that over spending more @ mcmastercarr. I love their site and they pretty much have everything, but it can get expensive for something that is usually inexpensive and low quantity.
> 
> I use ebay for small quantities (usually less then 100) and I end up paying a few dollars for a US shippers. If you don't mind a wait, the chinese sellers will sell you large quantities for a great price. I will admit, the mcmaster screws have endured better then ebay ones tbh (that's because i change hardware, or used to, quite often). When it came time to mate a backplate to a different make block, ebay screws were cheap and readily available. There's a few Vegas sellers that sell at pretty good prices and ship fast (well, for me in SoCal).
> 
> For the blocks, they usually come in w/ M3 or M2.5. DD (rip) used to use a very small size which is very surprising on how piggish their blocks were. Best to check the instructions for both and then you can figure how to mate plates with different make blocks. I've mated EVGA, EK, XSPC, custom (diy) plates with different blocks from ek, xspc, danger den, koolance, hk, ac, etc.. Its not too challenging but you can make it work. For those looking for active cooling on a backplate w/ block, you run into issues but remember the plate will do very little cooling compared to the block. I would recommend it for air coolers though. The EVGA plates are notorious for this (as well as some stock oem plates). EK plates tend to be very straight forward and usually use four or six outer perimeter holes or screws on the blocks. They usually use counter-sunk head screws (m3 or m2.5). Again, best to find the instructions for both and that will give you an idea of how they can be paired (though having both on hand will help even further).


I just got an XSPC backplate from NCIX, supplies are really thin and EKs are unfindable. I'll let you know how it goes.


----------



## NBrock

Quote:


> Originally Posted by *lordstryker*
> 
> Hey everyone. Forgive me if I'm violating any policies but I could really use some help.
> 
> I recently picked up an R9 295x2 on eBay for my new build PC. I was a little leery about eBay, but the seller had outstanding reviews over several years, so I took a chance.
> 
> I plugged it in and everything seems fine on bootup, except my PC isn't recognizing the card as a dual GPU. I got the latest Catalyst beta drivers, but when I go to the main menu under "Gaming", the AMD CrossFire option doesn't even appear. I got GPU-Z and, again, it only recognizes a single GPU.
> 
> I'm worried I got a bad card, but is it possible for one GPU die to be dead while the other runs fine with no problems at all? I gotta think there's some driver or BIOS setting causing the issue, but I'm stumped.
> 
> I have a new Gigabyte GA-Z170X-Gaming 3 motherboard. I've looked around the BIOS, but nothing jumps out at me that would cause CrossFire not to work.
> 
> Any ideas anyone could come up with would be greatly appreciated.


CrossFire on these cards is on by default. You can't turn it off unless you use a custom profile for a game/program. If you want to verify it is working, use something like GPU-Z: have that running, do some bench tests that support CrossFire, and check the load on each GPU.


----------



## lordstryker

Quote:


> Originally Posted by *NBrock*
> 
> CrossFire on these cards is on by default. You can't turn it off unless you use a custom profile for a game/program. If you want to verify it is working, use something like GPU-Z: have that running, do some bench tests that support CrossFire, and check the load on each GPU.


As I said, I loaded up GPU-Z and it only shows a single GPU with CrossFire disabled. Not sure what could be going on. When I click the "Lookup" button in GPU-Z it brings me right to the R9 295x2 page, so it is correctly identifying the card as an R9 295x2. Just with no CrossFire and a single GPU.


----------



## Sgt Bilko

Quote:


> Originally Posted by *lordstryker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NBrock*
> 
> CrossFire on these cards is on by default. You can't turn it off unless you use a custom profile for a game/program. If you want to verify it is working, use something like GPU-Z: have that running, do some bench tests that support CrossFire, and check the load on each GPU.
> 
> 
> 
> As I said, I loaded up GPU-Z and it only shows a single GPU with CrossFire disabled. Not sure what could be going on. When I click the "Lookup" button in GPU-Z it brings me right to the R9 295x2 page, so it is correctly identifying the card as an R9 295x2. Just with no CrossFire and a single GPU.
Click to expand...

Use DDU: http://www.guru3d.com/files-details/display-driver-uninstaller-download.html

Uninstall your current drivers then install the most current WHQL driver (15.7.1 iirc)

I've had this issue on my 390x, I'm willing to bet if you plugged the card into another PCIe slot it will work fine


----------



## lordstryker

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Use DDU: http://www.guru3d.com/files-details/display-driver-uninstaller-download.html
> 
> Uninstall your current drivers then install the most current WHQL driver (15.7.1 iirc)
> 
> I've had this issue on my 390x, I'm willing to bet if you plugged the card into another PCIe slot it will work fine


Tried this and still no-go. (Well, haven't tried a different slot, but I want to use my x16 one.) I did use DDU, uninstalled everything, then re-installed the 15.7.1 drivers. Same thing: GPU-Z still shows a single GPU with CrossFire disabled.


----------



## wermad

Do you have a second system or a friend that can help you with this? Just pop it in their system and see what happens. Or if you have a Fry's or Micro Center nearby, pick up a cheap Z170 board and see what happens. My suspicion lies with the card; even with no drivers, my system can see both cores. How long ago did you buy the card (eBay buyer protection)?



Quote:


> Originally Posted by *Hoff248*
> 
> Thanks Wermad, not the first time you have helped me with this card.
> 
> Sounds like you're talking about mixing a backplate and a waterblock from two different manufacturers. I was just asking about putting a backplate on the stock cooler for aesthetics. Sounds like it should work, provided I find the right screws (which I would have thought come with the backplate, but I'll find out).
> 
> New question: I find my VRM fan makes a clicking noise, only sometimes, any ideas how to fix that?


Np, and the same goes for fitting a different backplate to the stock 295x2. Though I would caution: the stock reference backplate does some cooling, while most waterblock backplates are purely for looks. You may have a hard time finding screws for the stock cooler that can reach, but it's possible all in all.

For the clicking, it may be a fan issue if the cable fix axiumone suggested does not work. I had occasional clicking on PSU fans.


----------



## lordstryker

Quote:


> Originally Posted by *wermad*
> 
> Do you have a second system or a friend that can help you with this? Just pop it in their system and see what happens. Or if you have a Fry's or Micro Center nearby, pick up a cheap Z170 board and see what happens. My suspicion lies with the card; even with no drivers, my system can see both cores. How long ago did you buy the card (eBay buyer protection)?


Bought it less than a week ago, so I can certainly explore that route if I got a bum card. I would not be happy, to say the least. It seems strange, though, because the card otherwise works fine: no overheating, no glitches, anything like that. I do have another board. It's not a Z170, it's in my old PC, but it should still work in case there's something up with the BIOS. I guess that's my next step.


----------



## fat4l

Try a different slot. Some motherboards have a problem where some slots can't accept dual-GPU cards.


----------



## lordstryker

Quote:


> Originally Posted by *wermad*
> 
> Do you have a second system or a friend that can help you with this? Just pop it in their system and see what happens. Or if you have a Fry's or Micro Center nearby, pick up a cheap Z170 board and see what happens. My suspicion lies with the card; even with no drivers, my system can see both cores. How long ago did you buy the card (eBay buyer protection)?
> 
> 
> Np, and the same goes for fitting a different backplate to the stock 295x2. Though I would caution: the stock reference backplate does some cooling, while most waterblock backplates are purely for looks. You may have a hard time finding screws for the stock cooler that can reach, but it's possible all in all.
> 
> For the clicking, it may be a fan issue if the cable fix axiumone suggested does not work. I had occasional clicking on PSU fans.


Quote:


> Originally Posted by *fat4l*
> 
> Try a different slot. Some motherboards have a problem where some slots can't accept dual-GPU cards.


Nope. It's a brand-new Gigabyte Z170 Gaming 3 board. Even still, same thing. I then went back to my old Broadwell board and tried the 295x2 in there. Same thing: it only recognized a single GPU.

I decided to send it back on eBay. The seller accepted my return and I have my packing slip printed and ready. I'll find myself another 295x2 that actually works.


----------



## wermad

Sucks







but glad you found an answer with a cooperative seller.


----------



## lordstryker

Quote:


> Originally Posted by *wermad*
> 
> Sucks
> 
> 
> 
> 
> 
> 
> 
> but glad you found an answer with a cooperative seller.


Thanks. Assuming no more hassles and the return goes smoothly and I get a refund. But so far seems straightforward.


----------



## wermad

Np







. Sent you a pm (private message) with more helpful info







.

Did some cleaning:


----------



## vieuxchnock

*How do we flash a 295X2? I tried to flash the Sapphire BIOS, but now one GPU is 1030/1300 and the other 1018/1250. I don't understand.

GPU 1 is 1030/1300

GPU 2 is 1018/1250*


----------



## fat4l

You should have two .rom files: one for GPU1 (master) and one for GPU2 (slave).
You need to flash both.


----------



## StillClock1

I'm interested in doing a flash of my card with a uefi compatible bios, and I want to make sure I have a backup bios on my card.

I've flipped the switch and taken screenshots of the GPU-Z reading of each, and they're identical. Either I have two copies of the same BIOS, or it's just one BIOS and the switch is meaningless. If it's the latter, I'm not going to take the risk of bricking my card. If it's the former and I have an extra BIOS to fall back on, I'll give it a shot.

Is there any way to know for sure?

BiosComparison.docx 101k .docx file


----------



## Mega Man

It is the same BIOS file, so you won't brick your card. You have two BIOS chips, but from the OEM they come with the same BIOS on them.


----------



## vieuxchnock

Quote:


> Originally Posted by *fat4l*
> 
> You should have two .rom files: one for GPU1 (master) and one for GPU2 (slave).
> You need to flash both


Do we flash on switch 1 with "Winflash -f -p 0 bios.rom" and then on switch 2 with the same command?


----------



## Sgt Bilko

Quote:


> Originally Posted by *vieuxchnock*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> You should have two .rom files: one for GPU1 (master) and one for GPU2 (slave).
> You need to flash both
> 
> 
> 
> Do we flash on switch 1 with "Winflash -f -p 0 bios.rom" and then on switch 2 with the same command?
Click to expand...

Same command, but "Winflash -f -p 1 slave.rom" instead of "Winflash -f -p 0 master.rom".

That's how I did mine.

I used atiflash though, and it was a while ago, so my memory might be faulty in that regard.









Flicking the switch makes the card use the secondary BIOS, which I've kept at stock clocks for testing later on.


----------



## vieuxchnock

But the roms are the same, aren't they?


----------



## Sgt Bilko

Quote:


> Originally Posted by *vieuxchnock*
> 
> But the roms are the same, aren't they?


One is for the master GPU (GPU1) and the other is for the slave GPU (GPU2).

The 295x2 acts just like two individual cards in CrossFire, therefore you need to designate which GPU you are flashing the BIOS to (GPU 0 and GPU 1).


----------



## vieuxchnock

So, if I understand my command should be:

Atiflash -f -p 0 bios.rom
Atiflash -f -p 1 bios.rom

right?

And I do that without moving the switch?


----------



## Mega Man

Correct.

The card has (switch 0) master.rom, slave.rom
And
(switch 1) master.rom, slave.rom


----------



## vieuxchnock

*Thanks, I'll try that. Do I need to restart between the two flashes?*


----------



## Sgt Bilko

Quote:


> Originally Posted by *vieuxchnock*
> 
> So, if I understand my command should be:
> 
> Atiflash -f -p 0 bios.rom
> Atiflash -f -p 1 bios.rom
> 
> right?
> 
> And I do that without moving the switch?


That is correct.

As Mega said, moving the switch would change to a different set of BIOSes.
Quote:


> Originally Posted by *vieuxchnock*
> 
> *Thanks, I'll try that. Do I need to restart between the two flashes?*


No, just flash the master, then the slave, then restart your PC.
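Putting the thread's advice together, here is a hedged sketch of the sequence, assuming atiflash is on the PATH and the card's GPUs enumerate as adapters 0 (master) and 1 (slave). It is written as a dry run that only prints the commands; drop the `echo` to actually flash:

```shell
# Dry-run sketch of the 295X2 flashing sequence discussed above.
# Remove "echo" to really flash (requires admin rights and the actual atiflash).
flash() { echo atiflash -f -p "$1" "$2"; }

flash 0 master.rom   # GPU 0 (master)
flash 1 slave.rom    # GPU 1 (slave)
# Do NOT flip the BIOS switch between the two flashes; reboot once both finish.
```

Adapter indices can differ per system, so check with `atiflash -i` before flashing for real.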


----------



## fat4l

No, don't restart between flashes; do both in the same session.
Use only one switch position; don't flash both positions.
Position one: 1 set of BIOSes for the 2 GPUs
Position two: 1 set of BIOSes for the 2 GPUs


----------



## vieuxchnock

*Latest news: I flashed both BIOSes at switch #1 and my card works OK now; both GPUs have the same stock speed, 1018/1250. But I had to flash it in Windows, because in DOS atiflash did not see my card ("no adapter found").

And there is nothing I can do on switch #2: the PC boots, but there is no screen after the Win 10 logo, and I can't do anything in DOS.*


----------



## StillClock1

Quote:


> Originally Posted by *vieuxchnock*
> 
> *Latest news: I flashed both BIOSes at switch #1 and my card works OK now; both GPUs have the same stock speed, 1018/1250. But I had to flash it in Windows, because in DOS atiflash did not see my card ("no adapter found").
> 
> And there is nothing I can do on switch #2: the PC boots, but there is no screen after the Win 10 logo, and I can't do anything in DOS.*


Not sure if I understand correctly. It sounds like it worked on switch 1 but switch 2 is useless. Is there a chance there is no BIOS on switch 2?

Also, are you saying that using atiflash from Windows is the only way to do it?


----------



## vieuxchnock

*You're right. When I go into DOS, atiflash -i returns "no adapter found" on switch 1 or 2, the same thing.
And on switch 2 I have no image when I boot into Windows: after the logo it's a black screen with my mouse arrow, which can be moved around.*


----------



## Mega Man

Please tell me you backed up the stock BIOS? I would reflash the second position.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Please tell me you backed up the stock BIOS? I would reflash the second position.


^ What he said.

To re-flash it, just load it up with the switch in the 1 position; then, just before you hit Enter to flash, change the switch to the 2 position so it flashes onto the second set instead of the first.


----------



## SAFX

Quote:


> Originally Posted by *axiumone*
> 
> I remember when I took my shroud apart, the cables that route to the VRM fan can come out of their housing a little bit from under the fan. Take a look and make sure that they are flush. If they aren't, you should be able to use a flathead screwdriver to push them back into the channel. Loose wires _may_ cause a clicking noise when the fan hits them.


NICE! How did you mod that? Any improvements?


----------



## axiumone

Quote:


> Originally Posted by *SAFX*
> 
> NICE! How did you mod that? Any improvements?


That's just the stock photo showing the location of the cable. When I had my 295x2 cards, I took the shroud off in order to gain control of the vrm fan. You can see it on video here - 




I don't have any recorded data as far as the difference the mod made, but I remember it was significant.


----------



## xer0h0ur

If I remember correctly there is still a way of flashing the BIOS if you fubar both switch positions. I don't remember the process off hand though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I remember correctly there is still a way of flashing the BIOS if you fubar both switch positions. I don't remember the process off hand though.


It's done by using another GPU and running the commands:

Atiflash -f -p 1 Master.rom
Atiflash -f -p 2 Slave.rom

I've never had to do it, thankfully, but I'm pretty sure that's it.
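If both switch positions are bad, the recovery described above relies on a second, working GPU taking adapter 0, pushing the 295X2's two GPUs to adapters 1 and 2. A dry-run sketch under that assumption (adapter numbers can differ per system, so verify with `atiflash -i` first):

```shell
# Dry-run sketch of recovering a 295X2 with both BIOS sets bricked.
# A known-good GPU in the first slot becomes adapter 0, so the 295X2's
# master/slave GPUs enumerate as adapters 1 and 2. Remove "echo" to flash.
flash() { echo atiflash -f -p "$1" "$2"; }

flash 1 Master.rom   # 295X2 master GPU, now adapter 1
flash 2 Slave.rom    # 295X2 slave GPU, now adapter 2
```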


----------



## ff0000T34M

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's by using another GPU and making the commands:
> 
> Atiflash -f -p 1 Master.rom
> Atiflash -f -p 2 Slave.rom
> 
> I've never had to do it thankfully but I'm pretty sure that's it


Someone on here mentioned you can boot on the good BIOS as the primary GPU, then flip the BIOS switch live to the dead BIOS and flash. I think it was rd09.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ff0000T34M*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's by using another GPU and making the commands:
> 
> Atiflash -f -p 1 Master.rom
> Atiflash -f -p 2 Slave.rom
> 
> I've never had to do it thankfully but I'm pretty sure that's it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Someone on here mentioned you can boot on the good BIOS as the primary GPU, then flip the BIOS switch live to the dead BIOS and flash. I think it was rd09.
Click to expand...

Yes that's correct, but this is if you screw up the bios on both switch positions


----------



## ff0000T34M

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yes that's correct, but this is if you screw up the bios on both switch positions


Ah, sorry, I misunderstood. My speed-reading skills failed me. Or was it my basic reading skills?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ff0000T34M*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Yes that's correct, but this is if you screw up the bios on both switch positions
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ah, sorry i misunderstood. My speed reading skills failed me or was it reading basic skills?
Click to expand...

It'd be speed reading; I do it myself a lot.


----------



## SAFX

Quote:


> Originally Posted by *axiumone*
> 
> That's just the stock photo showing the location of the cable. When I had my 295x2 cards, I took the shroud off in order to gain control of the vrm fan. You can see it on video here -
> 
> 
> 
> 
> I don't have any recorded data as far the difference that the mod had made, but I remember it was significant enough.


I thought it was modded!







The Gigabyte logo threw me off.


----------



## StillClock1

Quote:


> Originally Posted by *vieuxchnock*
> 
> *You're right. When I go into DOS, atiflash -i returns "no adapter found" on switch 1 or 2, the same thing.
> And on switch 2 I have no image when I boot into Windows: after the logo it's a black screen with my mouse arrow, which can be moved around.*


FYI - I'm trying to get Sapphire to flash my card for me, as the more I read about it the more sure I am that I will brick it.

Here is what I just received from Althon Micro (Sapphire's outsourced service). My opinion is that they should do this for me and confirm that it works, since they should have done it before shipping it in the first place. I doubt I'll get very far with that argument, though.

"Please use the instruction on this zip file. Once you create the pen drive, dump the bios file into the pen drive with the flash utility, and run the atiflash in DOS mode, type in command;

Atiflash -p 0 C67301MU.101 -f

Atiflash -p 0 C67301SU.101 -f

UEFIBIOSFlash.zip 3239k .zip file

And that will flash the card, once its done it will give you a message telling its successfully flashed."

EDIT: I wonder if the two "0"s they have in those Atiflash lines would have had me flash both the master and slave bios files to the same GPU, like I have read so many times on various forums. Then again, I'm clearly not the authority on this.

EDIT 2: I spoke to the guy on the phone (very nice guy) and he told me that my card BIOSes (4/28/14) and everything on the R9 290/390 series cards are "Hybrid". That means that if the Windows installation is UEFI, the GPU BIOS will pick that up and become compatible. He also told me that this is all a moot point for Windows 7, as it does not offer a fast-boot option.

I may have been fed a load of bull, but he seemed really nice about it so I am inclined to believe him. Anyway, you all know what I know so maybe this all will help someone.


----------



## Mega Man

It is bull. FYI, all my cards are non-UEFI. I am going to try some newer BIOSes soon; the last time I tried was in Feb or March of this year. All my PCs are GPT/UEFI-only, however I cannot choose UEFI-only on my desktops because of the video cards.


----------



## StillClock1

Quote:


> Originally Posted by *Mega Man*
> 
> It is bull. FYI, all my cards are non-UEFI. I am going to try some newer BIOSes soon; the last time I tried was in Feb or March of this year. All my PCs are GPT/UEFI-only, however I cannot choose UEFI-only on my desktops because of the video cards.


Are you on Windows 7? Is that part of what he told me untrue?


----------



## Mega Man

I have tried on 7,8, and now 10


----------



## StillClock1

Quote:


> Originally Posted by *Mega Man*
> 
> I have tried on 7,8, and now 10


Do you know of anyone who has successfully done a UEFI installation of Windows 7 with the R9 295x2? If so, is the boot materially faster?

I'm trying to decide how much more trouble this is worth. They want me to try to flash it all first, and if it goes wrong I can send it in. Still, that's something like 4-6 weeks without the card.

EDIT: Certainly feel free to try the BIOS files I posted; it would be great to hear whether they work for you.


----------



## Mega Man

You are making a mountain out of a molehill (in respect to flashing); it is really easy. The key is to only flash one side of the switch, and only that one.

Pick one, then flash. If you mess it up, you flip the switch and reboot.

Just back up the BIOSes you flash.


----------



## StillClock1

Quote:


> Originally Posted by *Mega Man*
> 
> You are making a mountain out of a mole hill. (In respect to flashing) it is really easy. The key is to only flash one side of the switch. And only that one.
> 
> Pick one. Then flash. If you mess it up you flip the switch and reboot.
> 
> Just back up the ones you flash


Fair enough, you're probably right. I've just read so many posts of people messing it up despite following instructions, though I suppose the people who flash it correctly don't post anything.

Still, it sounds like it doesn't make sense on Windows 7 as I won't see an improvement in boot speed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Hoff248*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> You are making a mountain out of a mole hill. (In respect to flashing) it is really easy. The key is to only flash one side of the switch. And only that one.
> 
> Pick one. Then flash. If you mess it up you flip the switch and reboot.
> 
> Just back up the ones you flash
> 
> 
> 
> Fair enough, you're probably right. I've just read so many posts of people messing it up despite following instructions, though I suppose the people who flash it correctly don't post anything.
> 
> Still, it sounds like it doesn't make sense on Windows 7 as I won't see an improvement in boot speed.
Click to expand...

The first time I flashed mine I bricked it. I just flicked the switch and booted on the other BIOS, then flicked it back just as I flashed again.

Fixed it right up.


----------



## Mega Man

Quote:


> Originally Posted by *Hoff248*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> You are making a mountain out of a mole hill. (In respect to flashing) it is really easy. The key is to only flash one side of the switch. And only that one.
> 
> Pick one. Then flash. If you mess it up you flip the switch and reboot.
> 
> Just back up the ones you flash
> 
> 
> 
> Fair enough, you're probably right. I've just read so many posts of people messing it up despite following instructions, though I suppose the people who flash it correctly don't post anything.
> 
> Still, it sounds like it doesn't make sense on Windows 7 as I won't see an improvement in boot speed.
Click to expand...

It is np, I am not trying to harass you; I want to get you to do it. As Sgt said above, it is easy, but it is frightening the first time.


----------



## drm8627

Is there a good place to get this card with a warranty? (I wouldn't mind paying extra for the warranty.)


----------



## wermad

If you contact the seller on used ones, most are willing to help with any warranty needs. Other than that, buy it new if you can find it. Most etailers are price gouging since it's now a rare and EOL item. PowerColor has the new Devil, which is essentially based on the old Devil 290Xx2.


----------



## drm8627

Quote:


> Originally Posted by *wermad*
> 
> If you contact the seller on used ones, most are willing to help with any warranty needs. Other than that, buy it new if you can find it. Most etailers are price gouging since it's now a rare and EOL item. PowerColor has the new Devil, which is essentially based on the old Devil 290Xx2.


Yea, it's not watercooled though, the Devil.


----------



## wermad

You're a few months late. The best time would have been right before the new gen launch and just after; they were going for ~$600-700.

Good luck buying new, and if you have any questions about buying used, some of us have lots of sales/buying experience, so don't hesitate to ask.


----------



## fat4l

The new Devil is 2x290, not 2x290X.


----------



## wermad

Quote:


> Originally Posted by *fat4l*
> 
> The new Devil is 2x290, not 2x290X.


The cores are 390s, essentially reworked 290s, but the board looks based off the Devil 290Xx2. So it's a reworked old Devil. I doubt they would do a brand-new board, as they seemed to have lots of the old Devils still available. By the reviews, the 390 is slightly better than the 290X, so it's still better than the old Devil imho.

So it's not exactly the same as the old one, but a bit better. Drm8627 is looking for a new reference card, but they have appreciated now, and it isn't worth twice the price of a used unit that still has warranty. The new Devil would be a good alternative.


----------



## StillClock1

Quote:


> Originally Posted by *Mega Man*
> 
> It is np I am not trying to harass you. I want to get you to do it. As Sgt said above. It is easy. But it is frightening the first time


Certainly understood, I'll give it a go when all of the parts arrive.

I don't think I've heard an answer about Windows 7: is this going to make any difference (in boot-up time at least) as long as I am still running Windows 7? I'd like to give Win 10 a bit more time to work through the kinks.


----------



## joeh4384

Quote:


> Originally Posted by *vieuxchnock*
> 
> Do we flash on switch 1 with: Winflash -f -p 0 bios.rom and after on switch 2 with thw same command?


Each switch has 2 bios, one is master and one is slave.


----------



## Mega Man

Quote:


> Originally Posted by *Hoff248*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> It is np I am not trying to harass you. I want to get you to do it. As Sgt said above. It is easy. But it is frightening the first time
> 
> 
> 
> Certainly understood, I'll give it a go when all of the parts arrive.
> 
> I don't think I've heard an answer about Windows 7, is this going to make any difference as long as I am still running Windows 7 (in boot up time at least)? I'd like to give Win 10 a bit more time to work through the kinks.
Click to expand...

Depends on the mobo. There are 2 types of fast boot:

One the mobo does
The other Windows does

Either one can be affected. IIRC CPU-Z has the ability to verify UEFI.


----------



## StillClock1

Quote:


> Originally Posted by *Mega Man*
> 
> depends on mobi. There are 2 types of fast boots.
> 
> One the mobo does
> The other windows does
> 
> Either or can be effected iirc cpuz has the ability to verify uefi


The motherboard (ROG Formula VI) supports fast boot; it's just the Windows side I'm trying to find out about.


----------



## m42orion09

Sorry for the poor quality picture.

Diamond R9 295x2, bought open-box refurb from Microcenter for $510. Since it was both of those, I opted for the 2-year warranty.









I've been working on getting my overclocks as high as possible, and so far I have it at +80mV, +50% power, 1125MHz core, 1625 MHz RAM. I'm still trying to push it further, and then I'll try to cut back some power and see where that leads me. This card is keeping me running quite well on three 1080p screens, I've only had a few issues with thermal throttling. Since moving it to the top of my case with a bitfenix spectre pro pwm pulling and the stock fan pushing, it's been staying a bit cooler.

One thing I've been wondering... Are there any perks to the Sapphire OC BIOS other than the stock speeds? I'm not too worried about flashing my BIOS if so, but I'd rather not waste my time if not.


----------



## drm8627

Quote:


> Originally Posted by *m42orion09*
> 
> 
> 
> Sorry for the poor quality picture.
> 
> Diamond R9 295x2, bought open box refurb from Microcenter for $510. Being both of those, I opted for the 2 year warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been working on getting my overclocks as high as possible, and so far I have it at +80mV, +50% power, 1125MHz core, 1625 MHz RAM. I'm still trying to push it further, and then I'll try to cut back some power and see where that leads me. This card is keeping me running quite well on three 1080p screens, I've only had a few issues with thermal throttling. Since moving it to the top of my case with a bitfenix spectre pro pwm pulling and the stock fan pushing, it's been staying a bit cooler.
> 
> One thing I've been wondering... Are there any perks to the Sapphire OC BIOS other than the stock speeds? I'm not too worried about flashing my bios if it so, but I'd rather not waste my time if not.


holy cow thats the best deal ive seen!! thanks. i may get that.


----------



## drm8627

would the nex750g power supply work for this gpu? i am running a 4790k, not overclocked, and 32gb of mushkin RAM, one tb hdd, and one 250gb ssd.

http://www.newegg.com/Product/Product.aspx?Item=N82E16817438006

i would not be overclocking anything


----------



## lordstryker

Quote:


> Originally Posted by *lordstryker*
> 
> Thanks. Assuming no more hassles and the return goes smoothly and I get a refund. But so far seems straightforward.


Reporting back. I am happy to say that my return of my original 295x went smoothly through ebay and I received a full refund in my account. I am also happy to say the replacement I received is in fully functioning order, and crossfire enabled by default, with no hoops to jump through. I'm finally rocking a fully functional r9 295x!

Thanks everyone!


----------



## joeh4384

Quote:


> Originally Posted by *drm8627*
> 
> would the nex750g power supply work for this gpu? i am running a 4790k, not overclocked, and 32gb of mushkin RAM, one tb hdd, and one 250gb ssd.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438006
> 
> i would not be overclocking anything


I think you would have a lot of issues due to not having enough power on the 12V rails.


----------



## wermad

Quote:


> Originally Posted by *lordstryker*
> 
> Reporting back. I am happy to say that my return of my original 295x went smoothly through ebay and I received a full refund in my account. I am also happy to say the replacement I received is in fully functioning order, and crossfire enabled by default, with no hoops to jump through. I'm finally rocking a fully functional r9 295x!
> 
> Thanks everyone!











Quote:


> Originally Posted by *drm8627*
> 
> would the nex750g power supply work for this gpu? i am running a 4790k, not overclocked, and 32gb of mushkin RAM, one tb hdd, and one 250gb ssd.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438006
> 
> i would not be overclocking anything


See this thread's op or my sig links


----------



## kaym911

I just traded my GTX 970 for an R9 295x2. No cash added, just a straight trade; that's the best deal I've gotten right now. Almost double the performance, and I can now actually play 4K games. No more of that 3.5GB memory.


----------



## wermad

Sweet


----------



## joeh4384

Quote:


> Originally Posted by *kaym911*
> 
> I just trade my GTX 970 for R9 295x2 No cash added just straight trade that's the best deal I got right now. Almost double my performance and i can now actually play 4k games no more of that 3.5GB memory


How the hell did you do that?


----------



## Alex132

Saying the 970 is 3.5GB is like saying 2x2GB dual channel memory is 2GB.


----------



## Mega Man

then again saying the 970 is 4gb is like.... a joke


----------



## kaym911

Quote:


> Originally Posted by *joeh4384*
> 
> How the hell did you do that?


He didn't need it because it drew too much power; he wanted something that wasn't so power hungry.


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> then again saying the 970 is 4gb is like.... a joke


No real point in beating a dead horse. It's as much a 4GB card as the 980, in theory. If you don't understand why, then you haven't been reading much bar the sensationalist fluff.


----------



## wermad

That's a very epic trade. Very lucky good sir


----------



## Mega Man

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> then again saying the 970 is 4gb is like.... a joke
> 
> 
> 
> no real point in beating a dead horse. Its as much a 4GB card as the 980 in theory. If you dont understand why, then you havent been reading much bar the sensationalist fluff.

Ya, you're right. That extra 512 MB obviously didn't hurt performance at all.

I mean, the only reason we found out about it was that it hurt perfor........ oh


----------



## kaym911

Quote:


> Originally Posted by *Alex132*
> 
> no real point in beating a dead horse. Its as much a 4GB card as the 980 in theory. If you dont understand why, then you havent been reading much bar the sensationalist fluff.


As a previous 970 owner I am well aware of the memory issue, or rather the slower partition of memory. It was a slap in the face from Nvidia. I'm not a fanboy for either camp, but information should be honest so we as consumers can make the best purchase. But if I offended some of you, I am sorry.


----------



## Mega Man

Look I didn't even have to go anywhere for proof. ...


----------



## wermad

Quote:


> Originally Posted by *kaym911*
> 
> As a previous 970 owner I am well aware of the memory issue or slower partition of memory, it was a slap in the face from nvidia im not a fan boy for any camp but information should be honest so us as the consumers can make the best purchese. *But if I offend some of you I am sorry*


What... I didn't see anything from you. No need to apologize imho.

The 970 is what it is now. It's still a solid card and I've seen it run some pretty intensive games. My nephew did get caught up in all the hoopla and ended up going with a 980 instead, though at 1080p, as I predicted, he didn't see a difference (gonna help him go 4K once he returns from duty).

Quote:


> Originally Posted by *Mega Man*
> 
> Look I didn't even have to go anywhere for proof. ...


Gonna hit you up in the CL club for some qs on your Godzilla (that you refuse to show moar of).

edit: gonna play some more lost planet 2 and i'll await your reply good sir


----------



## fat4l

Hi








Guys, would you try this application for me please?
This link or This link.

It shows the default voltage for your card that is stored in the BIOS (the one used in 3D).
I would also be interested in seeing the difference between the Sapphire OC BIOS and the stock BIOS, so if you have both of the BIOSes stated above, please run the app on both BIOS switch positions.
Also, please add the ASIC values for your cores. They can be seen in GPU-Z: select one core, right-click on the GPU-Z panel, choose "show ASIC quality", then repeat for the other core.


----------



## kaym911

Oh I didn't upload any pictures awkward lol here you go link to imgur

http://i.imgur.com/u5oHfdO.jpg 295x2

next I'll show you the gtx 970 I had



http://imgur.com/Neupqg6


if you are wondering about the clear side panel, I broke it but I will be making a new one

It's in the second slot because on this board the second PCIe slot runs at x8, which is weird, I know; the top slot won't work with this card for some reason.


----------



## m42orion09

Sadly, my card died on me :/ Not entirely sure what happened; it might be the VRM. I'm exchanging it for a regular refurbished unit tomorrow morning. They dropped the price! It's now around $480. I definitely want to save up and get a second one and a new PSU before they run out.

Since I thought it was my motherboard that was dead, I exchanged it and my CPU for a 4790K and the Gigabyte Z97X Gaming 7. I can't wait to see how that pans out with a new card.


----------



## SAFX

So after 295x2, what's your next card?


----------



## m42orion09

Same card. My apologies, I meant new to me.

It was that or spend ~$150 more for a 980 Ti, and I just loved this card (and my wallet) too much. Yes, I'm getting the R9 295x2 again because it's cheaper hahaha


----------



## fat4l

Quote:


> Originally Posted by *fat4l*
> 
> Hi
> 
> 
> 
> 
> 
> 
> 
> 
> Guys, would you try this application for me please ?
> This link or This link.
> 
> It shows the default voltage for your card that is stored in bios(that is used in 3d).
> I would also be interested in seeing the difference in Sapphire OC bios vs stock bios.
> If anyone could run this app on both bios switches if you have both bioses stated above(I hope u understand).
> Also if you could add the asic values for your cores please. Can be seen in gpu-z, select one core, right click on gpu-z panel, show asic quality, then do this for the other core please.


Anyone? Please. It's the app from the Hawaii BIOS Editing thread and it's 100% harmless.


----------



## PR-Imagery

That's not suspicious at all


----------



## fat4l

Quote:


> Originally Posted by *PR-Imagery*
> 
> That's not suspicious at all


Huh. The application just shows the default voltage set in the BIOS.
Quote:


> Originally Posted by *gupsterg*
> 
> ***** *Under Construction Last update: 14/10/15* *****
> *Useful Links*
> Link:- The Stilt's VID APP


This is the official application lol.

It looks like this:


----------



## gupsterg

Quote:


> Originally Posted by *PR-Imagery*
> 
> That's not suspicious at all


Here's the original post by The Stilt, Link:- http://forums.guru3d.com/showthread.php?p=5119818#post5119818

I and some others have run it without anything "suspicious" happening; even @ENTERPRISE ran it for when I was modding him a ROM. So far he hasn't reported that his bank account details, etc. have been compromised







.

It allows DPM 7 VID for a card to be decoded, which isn't possible with anything else. AFAIK even if you used a multimeter on card you'd be getting VDDC not VID.


----------



## PR-Imagery

/s


----------



## fat4l

When I bought my 295X2 everyone was like, yeah let's OC hard, Sapphire OC BIOS here and there.
And now that we have the ability to really look into the BIOS and edit it, no one seems to care.

The more ppl who run the application and show their results, the better for us all. We will get a better understanding of these cards.
But maybe no one is interested in knowing.

For now I can tell you, the Sapphire OC BIOS only differs in MHz and power limits
(200W/200W/136A vs 202W/202W/137A).
Stock 290X is 208W/208W/200A.

Now if you want to know the default voltage for your card, you need to run the EVV application, which shows you the DPM7 (3D) voltage.
This is a variable voltage and depends on several things (ASIC quality, default clock...).
Variable means different cards can have different voltages.
In general, 295X2s have higher voltages than 290Xs, and even higher than my Ares III. It's usually 1.25v. You will not have this voltage when running a 3D application due to Vdroop; it will be maybe 1.225v or so.
My EVV is:
GPU1: Hawaii=1.19375v / 1030MHz(DPM7) / 0x32D (Lkg)
GPU2: Hawaii=1.225v / 1030MHz(DPM7) / 0x321 (Lkg)

The Lkg value can be translated into ASIC quality:
32D(hex) = 813(decimal)
813/1023 = 79.5% ASIC

321(hex) = 801(decimal)
801/1023 = 78.3% ASIC

The higher the ASIC, the lower the DPM7 voltage.
The lower the default clock, the higher the DPM7 voltage.
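The Lkg-to-ASIC conversion above is just a hex-to-decimal divide; as a quick sketch (a hypothetical helper written for this post, not part of The Stilt's tool), it looks like this in Python:

```python
# Convert the Lkg (leakage) value reported for each Hawaii core into an
# approximate ASIC-quality percentage, using the formula from the post:
#   asic % = decimal(Lkg) / 1023 * 100
def lkg_to_asic(lkg_hex: str) -> float:
    """Return ASIC quality (%) for a hex Lkg string such as '0x32D'."""
    return int(lkg_hex, 16) / 1023 * 100

for gpu, lkg in (("GPU1", "0x32D"), ("GPU2", "0x321")):
    print(f"{gpu}: Lkg {lkg} -> {lkg_to_asic(lkg):.1f}% ASIC")
# GPU1: Lkg 0x32D -> 79.5% ASIC
# GPU2: Lkg 0x321 -> 78.3% ASIC
```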


----------



## rakesh_sharma23

Unable to install the AMD R9 295X2 driver, even with all the latest drivers. OS: Windows 10. System: i5-6600K, Gigabyte Z170X-Gaming 7, 4x 4GB HyperX Predator DDR4-3000.

I have also tried a clean installation of W10 and even W8.1 with one display connected via DVI. Auto-installation of Catalyst 15.7.1 and 15.10 Beta shows no driver installed.

Is the issue with the card, the driver, or the BIOS???


----------



## fat4l

Hm.
Are you really serious about a 750W PSU for a 295X2?


----------



## rakesh_sharma23

No, this is just for installation. I will be running it with an RM1000 PSU.

So even for installation, do I need more power???


----------



## wermad

Rm1000 is fine









Which drivers did you try?


----------



## SAFX

Which gpus (AMD only) currently outperform the 295x2?


----------



## axiumone

Quote:


> Originally Posted by *SAFX*
> 
> Which gpus (AMD only) currently outperform the 295x2?


If crossfire works in every title that you play, the 295x2 is still more powerful than any other gpu. If crossfire doesn't work and you are limited to one onboard gpu, then 390x, fury and fury x should outperform the 295x2.


----------



## kaym911

Quote:


> Originally Posted by *axiumone*
> 
> If crossfire works in every title that you play, the 295x2 is still more powerful than any other gpu. If crossfire doesn't work and you are limited to one onboard gpu, then 390x, fury and fury x should outperform the 295x2.


The 290X isn't that bad compared to the 390X; they are pretty much the same card.


----------



## gatygun

The 390X has better memory modules which clock higher than the 290X's (unless you got lucky with the memory clocks on your 290X), so the 390X will be faster at stock than the 290X.


----------



## fat4l

Quote:


> Originally Posted by *gatygun*
> 
> 390x has better memory modules which clock higher then 290x, unless you got lucky on your 290x with memory clock. so the 390x will be faster on stock then 290x.


I don't think it's about the modules themselves. They use different (better) timings, thus better scores.
You can mod your 290X BIOS to use the better timings as well... The difference is a few % in performance.
Also, the 390X has a higher default voltage (afaik), and some cards have a higher aux voltage too.

anyway...

After my Windows install crashed due to AMD drivers, I moved to Win 10.
I tried the newest beta 15.11 drivers but they are not working well for me. The card is not holding clocks at all. GPU usage is baaaaad.

I reverted back to official 15.7.1 and all is good. Even better than on win 7.
Very pleased.

Clocks are 100% now.
Voltage regulation is awesome.
27500GPU points in FS! yay



I will continue tweaking the card. Hopefully I can do 1250MHz, then I can start tweaking the memory (with gupsterg's help).










EDIT://
it actually looks like "freesync" is causing all the issues...


----------



## remedy1978

I am currently using the 15.7.1 drivers with Windows 10. Whenever my computer goes to sleep, the display does not turn back on after waking back up. I have tried multiple cables and monitors, but nothing seems to work. I disable hybrid sleep and still have the same issue. The only solution is to power off the computer and turn it back on, then the video output comes back.

Before I send the card into Althon Micro, I thought I would check here to see if anyone had any insight. Other than this issue, my Sapphire R295X2 has been rock solid.


----------



## fat4l

Quote:


> Originally Posted by *remedy1978*
> 
> I am currently using the 15.7.1 drivers with Windows 10. Whenever my computer goes to sleep, the display does not turn back on after waking back up. I have tried multiple cables and monitors, but nothing seems to work. I disable hybrid sleep and still have the same issue. The only solution is to power off the computer and turn it back on, then the video output comes back.
> 
> Before I send the card into Althon Micro, I thought I would check here to see if anyone had any insight. Other than this issue, my Sapphire R295X2 has been rock solid.


Do you have Internal PLL Overvoltage enabled in the BIOS?

This might be causing issues with sleep.


----------



## MIGhunter

Just redid my 295x2 ;p


----------



## SAFX

Quote:


> Originally Posted by *MIGhunter*
> 
> Just redid my 295x2 ;p


I love the smell of thermal pads in the morning.....smells like victory









Looks nice!


----------



## SAFX

Quote:


> Originally Posted by *remedy1978*
> 
> I am currently using the 15.7.1 drivers with Windows 10. Whenever my computer goes to sleep, the display does not turn back on after waking back up. I have tried multiple cables and monitors, but nothing seems to work. I disable hybrid sleep and still have the same issue. The only solution is to power off the computer and turn it back on, then the video output comes back.
> 
> Before I send the card into Althon Micro, I thought I would check here to see if anyone had any insight. Other than this issue, my Sapphire R295X2 has been rock solid.


Does it happen all the time or just intermittently?

I've got the same problem, but I think it's my monitor, not the card.
You disabled hybrid mode, good. I picked up one of these after reading something about cheap DP cables not sending correct signals to the monitor to wake up.
http://amzn.com/B00A7R9I22

Unfortunately, I still have the issue, doesn't happen all the time, but I love the monitor so I've learned to live with it. The only fix is unplugging the monitor's power cord, then reconnecting. If you have to power cycle your PC, that sounds like a bigger issue.

Nothing is more annoying than no picture, good luck


----------



## Alex132

I have never trusted my computer to go to sleep, nor the monitor. My motherboard has a known issue where it royally messes things up if it goes into sleep, and my monitor has a known issue of dying if it goes into sleep too much.

And honestly, I prefer it like this. Far fewer issues.


----------



## MIGhunter

Quote:


> Originally Posted by *SAFX*
> 
> I love the smell of thermal pads in the morning.....smells like victory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks nice!












The rest of it is *HERE*


----------



## wermad

Try disabling your C-states in the BIOS and see if that helps with sleep. I used to have this issue with AMD and Nvidia, and it may be your system's power states/management. Right now I'm on stock CPU clocks and all states are left on auto (GB board), but when I OC, I disable them if resuming from sleep is not working. Also, check your Windows power management: some devices may continue to have power and will resume your system and then hang (USB and network are notorious for doing this). If you can't fix it, try a reformat.

Played Lost Planet 2 MP via Windows Live and it was pretty good. Some hardcore guys out there, so I went with campaign MP. Still, I need to master the customization of this game. It's been a long, long, long time since I've delved into MP. Solid game if you can find it on Steam for $10, but be warned, Windows Live is a pain in the behind.


----------



## Ironjer

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hey guys,
> 
> I recently bought myself a Samsung U28E590 Freesync monitor, but somehow freesync does not work. The R9 295X2 and windows 10 are supported right (i'm on CCC 15.7.1)?
> So i enables freesync in monitros OSD and from CCC, but cannot enable freesync in windmill demo (the option is greyed out). I tried BF4 and still screentearing when Vsync if off. Freesync doest even work when i lock FPS on the freesync range (which is 40-64 for Samsung).
> Anyone have solutions?


Same problem here!! Did you solve your problem?


----------



## NBAasDOGG

I solved the problem indeed. I sent the Samsung back and bought the Asus MG279Q.


----------



## Ironjer

Quote:


> Originally Posted by *NBAasDOGG*
> 
> I solved the problems indeed. I send the Samsung back and bought the Asus MG279Q.


Good for you; I can't do that. Samsung has a problem then. Damn!


----------



## fat4l

Quote:


> Originally Posted by *NBAasDOGG*
> 
> I solved the problems indeed. I send the Samsung back and bought the Asus MG279Q.


So it's a monitor issue?


----------



## fat4l

I think we have a problem with crossfire (295X2) + Freesync, guys...

Run Afterburner, then run the 3DMark Firestrike demo. After it finishes, check the clocks in Afterburner. They are not stable, but fluctuating like crazy......

Anyone?


----------



## Alex132

Try using Sapphire TriXX instead of MSI Afterburner.


----------



## xer0h0ur

I've been trusting Afterburner less and less lately. I believe it misreports clocks sometimes. In any event I have cleared up Afterburner showing clock throttling before by removing it without keeping settings then reinstalling it.


----------



## wermad

Has anyone gotten GPU Tweak to work? In my experience, it seems to work fine with Nvidia but not AMD (crashes). I wasn't sure if it was Nvidia-only (much like EVGA's tool).


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> I've been trusting Afterburner less and less lately. I believe it misreports clocks sometimes. In any event I have cleared up Afterburner showing clock throttling before by removing it without keeping settings then reinstalling it.


I don't think that's the case, as I can see the decreased performance/stuttering in games, 3DMark, etc.


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> I dont think thats the case as I can see the decreased performance/shuttering in games, 3dmarks etc..


Okay if you're using Windows 7/8.1 then I would suggest following BradleyW's manual driver removal guide and using DDU afterwards for redundancy. When I had driver based issues with clock throttling this did the trick. If you're on Windows 10 may the good lord bless you because there is nothing I can do for you.


----------



## xer0h0ur

Actually the light bulb just turned on as I just remembered you can force non-stop 3D clocks by setting "Unofficial overclocking mode" to "without powerplay support" in Afterburner. This is however more of a temporary solution than it is a fix.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Actually the light bulb just turned on as I just remembered you can force non-stop 3D clocks by setting "Unofficial overclocking mode" to "without powerplay support" in Afterburner. This is however more of a temporary solution than it is a fix.


Thanks for the input.
It doesn't work for me... clocks are still at 2D levels in 2D...


----------



## xer0h0ur

You may want to try removing afterburner without remembering settings and installing it again because I know everyone I have suggested this to has managed to force non-stop 3D clocks using that setting. Either way the last time this happened I did a manual registry key wipe using BradleyW's guide and that ended my clock throttling I was getting in AC:Unity and the crashing at loading Dying Light.

One would think that a driver remover like DDU which says it removes registry keys would actually remove all of the registry keys from driver installations but the truth is it doesn't. The accumulation of many driver installations leaving behind registry keys is what ultimately introduced oddball performance issues for me.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> You may want to try removing afterburner without remembering settings and installing it again because I know everyone I have suggested this to has managed to force non-stop 3D clocks using that setting. Either way the last time this happened I did a manual registry key wipe using BradleyW's guide and that ended my clock throttling I was getting in AC:Unity and the crashing at loading Dying Light.
> 
> One would think that a driver remover like DDU which says it removes registry keys would actually remove all of the registry keys from driver installations but the truth is it doesn't. The accumulation of many driver installations leaving behind registry keys is what ultimately introduced oddball performance issues for me.


Aha.

I will try it.
I believe this is the thread, if anyone is interested.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> You may want to try removing afterburner without remembering settings and installing it again because I know everyone I have suggested this to has managed to force non-stop 3D clocks using that setting. Either way the last time this happened I did a manual registry key wipe using BradleyW's guide and that ended my clock throttling I was getting in AC:Unity and the crashing at loading Dying Light.
> 
> One would think that a driver remover like DDU which says it removes registry keys would actually remove all of the registry keys from driver installations but the truth is it doesn't. The accumulation of many driver installations leaving behind registry keys is what ultimately introduced oddball performance issues for me.


I found out what is causing my clock dropping.... It's Afterburner!
If you do crossfire + Freesync + install Afterburner = dropping......
If I do a fresh install of the drivers and enable Freesync but don't install Afterburner, my clocks are 100% stable. Once I install AB it becomes broken...

Also, my scores in 3DMark are now even higher, by about ~300 points.

Also, regarding Freesync:
If I enable Freesync and run FS X or FS U, my scores are the same as with Freesync disabled. However, this doesn't apply to "normal" FS. If Freesync is enabled in normal FS, my scores are about ~600 points lower than with it disabled.

Will post screens soon.


----------



## fat4l

Here it is...
This only applies to Firestrike Performance... the score in Ultra and Extreme stays the same with Freesync on and off.

*FREESYNC ON*
*19753*


*FREESYNC OFF*
*20338*


Hmmm









But hey...28000+ points in graphics


----------



## jeffrey okereke

So I'm having a problem with my 295x2:
when I updated the driver for Windows 10 and the new CCC, and I try to change the CrossFire mode for my setup, the drop-down box doesn't appear, as if the second GPU isn't there. I'm just looking for what the problem is, and the solution to the problem if you know it.


----------



## xer0h0ur

Quote:


> Originally Posted by *jeffrey okereke*
> 
> so i'm having a problem with my 295x2;
> when i updated the driver for windows 10 and the new CCC when i'm trying to change the crossfire mode for my setup the drop down box dosen't appear as if the second gpu isn't their, im just looking for what the problem is and the solution to the problem if you know


There appears to be some sort of bug that keeps disabling crossfire under Windows 10. Someone had suggested a fix to it but I can't remember anymore what thread it was in.

Edit: Found it= http://www.overclock.net/t/1579472/amd-catalyst-15-11-beta-driver-for-windows-is-now-available#post_24605632


----------



## kayan

A run of both Firestrike Extreme and Ultra comparing a 295x2 and a Fury X, both at stock clocks. Same exact system in both runs.

Extreme:
http://www.3dmark.com/compare/fs/6523084/fs/6548220
Ultra:
http://www.3dmark.com/compare/fs/6523108/fs/6548653

Everything is at stock, except my CPU, which is at 4.5ghz. Also posted in Fury owner's club.


----------



## X41822N

Are there any AMD graphics adapters on the horizon to take the throne from the 295X2?

Maybe a smaller architecture is in the plans, since the Zen processor is not that far off (with its 14nm process).


----------



## Alex132

Fury X2


----------



## fat4l

Quote:


> Originally Posted by *kayan*
> 
> A run of both Firestrike Extreme and Ultra comparing a 295x2 and a Fury X, both at stock clocks. Same exact system in both runs.
> 
> Extreme:
> http://www.3dmark.com/compare/fs/6523084/fs/6548220
> Ultra:
> http://www.3dmark.com/compare/fs/6523108/fs/6548653
> 
> Everything is at stock, except my CPU, which is at 4.5ghz. Also posted in Fury owner's club.


Interesting








It's still nice to see that even the old 295X2 still has the horsepower to beat top end cards, for less money









Also, I would recommend BIOS "tuning" of the 295X2 to get the extra performance out of it.
You could get a free % gain just from memory timing tuning.
Comparing graphic scores in X:
(furyX->295X2->ares3_tuned)
7668->10309->12820 = 24% boost vs stock 295X2 and 67% boost vs Fury X.

Comparing graphic scores in U:
(furyX->295X2->ares3_tuned)
3935->5127->6326 = 23% boost vs stock 295X2 and 61% boost vs Fury X.

Would be nice to see your overclocked results








+rep
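For anyone who wants to check the percentage figures quoted above, they come straight from the graphics scores listed (a quick arithmetic sketch, nothing more):

```python
# Firestrike Extreme (X) graphics scores quoted in the post.
fury_x, stock_295x2, ares3_tuned = 7668, 10309, 12820

def gain_pct(new: int, base: int) -> float:
    """Percentage gain of `new` over `base`."""
    return (new / base - 1) * 100

print(f"tuned vs stock 295X2: {gain_pct(ares3_tuned, stock_295x2):.0f}%")  # 24%
print(f"tuned vs Fury X:      {gain_pct(ares3_tuned, fury_x):.0f}%")       # 67%
```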


----------



## fat4l

New driver (Crimson), new improved results








1200/1700MHz(1500Strap Timings)










Spoiler: Warning: 15.11 DRIVERS



*P*


*X*


*U*






Spoiler: Warning: CRIMSON DRIVERS



*P*


*X*


*U*




edit:// reuploaded images.
edit2://added comparison


----------



## xer0h0ur

I skipped benchmarking any drivers since the 15.7.1 so I will give this new Crimson edition driver a crack on all three FireStrike tests tonight when I get off work.

Fat, how would you say that monitor of yours is working? I have been eye-raping the prices everywhere considering buying that Freesync monitor.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> I skipped benchmarking any drivers since the 15.7.1 so I will give this new Crimson edition driver a crack on all three FireStrike tests tonight when I get off work.
> 
> Fat, how would you say that monitor of yours is working? I have been eye-raping the prices everywhere considering buying that Freesync monitor.


Well, I would say the monitor is great. The only problem I'm having with it is blackscreening with the card clocked very high and running at more than 60Hz.
So let's say I clock my GPU1 to a 1250MHz core and set my refresh rate to 144Hz (using DP); then the blackscreening is there. It just says "out of range" and goes black.
I will however try to play with the BIOS a bit; maybe altering some DPM7 voltages will help.
I also think this is a 290X core related issue, not a monitor one. When I had my 295X2 in the system, the same thing happened, just at a different MHz.
When I do 60Hz, no problem there.


----------



## xer0h0ur

Doesn't sound like a problem I would run into then. My three GPUs refuse to clock anywhere near that high to begin with. I can't push past 1150MHz without losing stability.


----------



## bobbavet

Gday Guys

Been a while. Still lovin me 295x2. Considering adding another card to help with more candy @ 4k. Any noticeable difference between adding a 290 or a 290x?

Would be happy with either. My main prob would be finding a Koolance block to match up with my current block. I suppose my CM V1000 won't handle it either.

Interested in observations of AMD Crimson with the 295x2.

Cheers Bob


----------



## Dagamus NM

Get another 295x2. Then you can be very happy and never wonder what might have been.


----------



## wermad

Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys
> 
> Been a while. Still lovin me 295x2. Considering adding another card to help with more candy @ 4k. Any noticeable difference between adding a 290 or a 290x?
> 
> Would be happy with either. My main prob would be finding a Koolance block to match up with my current block. I suppose my CM V1000 won't handle it either.
> 
> Interested in observations of AMD Crimson with the 295x2.
> 
> Cheers Bob


http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2


----------



## bobbavet

Thanks Wermad, ya helped me with my current 295 and block.

I thought there were better scaling gains in tri fire over quad. Don't think I could budget another 295x2 given block and PSU, but it would be sweet.


----------



## wermad

Np. I got both of my Koolance blocks when ppcs.com had them for $107 each. I already had a Koolance triple bridge from my previous tri-290 setup; I just made some block plates for the middle unused ports.

Scaling-wise, you really want to push your resolution so the 4th GPU can earn its keep. I keep playing one game which works best with one card disabled, but newer games like BF3 and BF4 run much better with the second card enabled. If cost is a concern, tri-fire is a great choice. It may even work better than quad at times, and if you're not running crazy resolutions you won't miss the 4th card with some minor tweaking. Btw, I don't think the bridge aligns between a Koolance 290/290X block and the 295X2 block. Might have to get creative with the loop.

I'm pulling my system apart soon. Going back to air for now, since I'm completely redoing my loop in acrylic. It's gonna be a while, though the stock cards will be a plus on cold winter days and nights.


----------



## bobbavet

Quote:


> Originally Posted by *wermad*
> 
> . Btw, i don't think the bridge aligns with a koolance 290/290x block and the 295x2. Might have to get creative with the loop..


Yeah, I was hoping they'd at least keep the front port aligned as standard across all their waterblocks, at least for those of the same gen.


----------



## xer0h0ur

I don't know if any of the block manufacturers made their 295X2 block terminals line up with their 290/290X block terminals. It's sort of a pain in the rear.


----------



## wermad

Possibly the one closest to the I/O plate may align. If it does, then just run your loop in series.


----------



## bobbavet

Quote:


> Originally Posted by *wermad*
> 
> possibly the one close to the i/o plate may align.
> 
> 
> 
> 
> 
> if it does, then just run your loop in series.


That's what I mean. I did some perspective correction and resizing of the two waterblocks in Photoshop and they are close. I have sent off a tech-support email asking.

Where did you get that picture of the 290X WB from? It is different from the one on the Koolance site. http://koolance.com/video-card-vga-amd-radeon-r9-290x-water-block-vid-ar290x

Any thoughts on tri-fire with the CM V1000 PSU? Pretty sure 3x 290X in tri would be a problem, but 295X2 + 290?


----------



## wermad

I ran tri 290s on a V1000; though it may have had enough wattage, I don't think it had enough amps. In games that ran hard, such as Crysis 3, the V1000 kept shutting off. These were factory-OC'd cards, btw. One 295X2 ran smoothly with my old V1000. Test it and see what happens. Remember, you need 50A just for the beast and at least 20-25A for the mb/cpu/etc. I would feel safer with a 1200W or 1300W unit with a single 12V rail that can handle the 295X2 along with the rest of your system. Maybe the G1300?

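The amp budgeting above is simple to sanity-check. A hedged sketch: the 50A/25A figures come from the post itself, while the ~83A combined 12V rating assumed for the V1000 is an illustration, not a spec-sheet quote:

```python
def rail_headroom(psu_12v_amps, card_amps=50, system_amps=25):
    """Remaining 12V-rail amperage after the GPU and the rest of the
    system; a negative result means the rail is over budget."""
    return psu_12v_amps - (card_amps + system_amps)

# ~83A combined 12V is an assumed figure for a 1000W unit like the V1000.
# One 295X2 plus a system just fits, but each extra Hawaii GPU (~20A+)
# eats the margin fast, matching the tri-290 shutdowns described above.
print(rail_headroom(83))                      # 8  -> tight but workable
print(rail_headroom(83, card_amps=50 + 20))   # -12 -> over budget
```

This is only arithmetic, of course; transient spikes and per-rail limits on multi-rail units can trip a PSU well before the combined rating is reached.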

----------



## bobbavet

Anyone else been playin' Star Wars Battlefront?

It really puts my GPU and CPU temps up big time. Usually the GPU is mid-60s and the CPU mid-50s; now I'm getting GPU high 70s and CPU high 80s.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Doesn't sound like a problem I would run into then. My three GPUs refuse to clock anywhere nearly that high to begin with. I can't push past 1150MHz without losing stability.


You should look into the Hawaii BIOS editing thread. You can squeeze some nice performance out of the 295X2 and also improve stability so you can clock higher. Feel free to PM me if interested.


----------



## bobbavet

Heard back from Koolance tech support:
Bob,

Yep, the front ports will line up, but the second ones obviously don't.

-Dylan

Koolance Technical Support

On 11/24/2015 4:01 PM, Robert wrote:
> Gday Koolance
>
> I received my refurbished VID-AR295X2 WB. It is excellent!
>
> I have a question regarding your VID-AR290X product for the 290/290x
>
> I am now considering a tri-fire with a 290 or 290x.
>
> Do the front ports of the VID-AR295X2 match up with the front ports of the VID-AR290X so a "bridge" can be used?
>
> Thankyou Bob


----------



## bobbavet

*AMD Radeon R9 295X2 and XFX R9 290X DD TriFire Review*


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> I'm pulling my system soon. Going back to air for now since I'm completely redoing my loop in acrylic. Its gonna be a while, though stock cards will be a plus in the cold winter days and nights.


Yes, those San Diego ambient temps get quite low


----------



## fat4l

Guys, anyone willing to post a screenshot of the EVV program showing your card's default voltage?
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_30
Link:- The Stilt's VID APP
We need more results!


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> Yes, those San Diego ambient temps get quite low


Yeah, I know, right? It's brutal down here; we just survived an 80°F weekend.

Ugh, it's gonna be a big chore, but I have to pull all the rads to remove the chrome/nickel fittings and reconfigure a few other items. Going with all-black fittings, so the chrome ones have to go. I may just switch one card to air, or just run the iGPU.


----------



## fat4l

RAM I/O bus voltage mod done!
GPU-Z is showing 1.047V VDDCI now.


----------



## fat4l

Quote:


> Originally Posted by *Darkstar757*
> Can you explain BIOS tuning? I have a Sapphire 295X2 and I would love to gain some performance.
> 
> Thanks,
> Darkstar757


Hi,

Well, there's a thread called Hawaii BIOS Editing:
www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_30
In the thread there's all the info needed to mod the BIOS for all Hawaii cards.
By doing the mods you can get stable/higher clocks, more voltage, custom power and throttle limits, more performance out of your memory (timing mods), etc.
They require BIOS flashing, of course.

I would start with the EVV program. It shows the default voltage for your cards.
Link:- The Stilt's VID APP

Do you have the normal or the OC edition Sapphire card?

This is an example of how much performance you can get out of your card; for me it makes a difference of ~20%.
http://www.3dmark.com/compare/fs/6593207/fs/6590481


----------



## bobbavet

My Crimson benches.

Big gains at 1080p; not so at higher res in Fire Strike.

Unigine shows great benefits to min and max FPS @ 4K.


----------



## DrR0Ck

Hi folks,

I have an XFX R9 295X2. The fan rattles a bit (mounted horizontally) so I requested a replacement from XFX support. The replacement is clearly a different model fan, but as long as it works without rattling, I'm good.

During boot, the stock fan runs full blast until the driver kicks in, and then it spins down. It spins up when the card is under load. The replacement fan runs full blast constantly and never spins down. I asked XFX support why this would be, but they have not responded.

It's a simple 2-pin fan header: power and ground. What could possibly cause this behavior? I would consider buying a replacement fan, but I'm concerned it would behave the same. Can anyone shed any light on what's happening and what I could do to fix it?

Thanks!


----------



## xer0h0ur

Just as a general warning and observation with regard to the Crimson driver: the 295X2 does not seem to disable crossfire. It doesn't matter if you turn off crossfire globally, within a game profile, or both. Although, I admit, I have only tested this so far with CS:GO, so I can't say with any certainty whether it's affecting all games, all DX9 games, or just CS:GO. All I know is that CS:GO was exhibiting nasty hitching from framerate drops and absurd amounts of frametime variance, which is exactly what used to happen on the Catalyst drivers when crossfire was enabled. So despite disabling crossfire globally and within the CS:GO game profile, Afterburner confirmed both of the 295X2's GPUs were clocking up and sharing the processing load. I was only able to work around the issue by disconnecting my monitor from the 295X2 and connecting it to the 290X, so that when I played games it was GPU3 (the 290X) rendering the game by itself. This was also confirmed through Afterburner: a single GPU clocking up and processing the load.


----------



## F4ze0ne

Quote:


> Originally Posted by *DrR0Ck*
> 
> Hi folks,
> 
> I have an XFX R9 295X2. The fan rattles a bit (mounted horizontally) so I requested a replacement from XFX support. The replacement is clearly a different model fan, but as long as it works without rattling, I'm good.
> 
> During boot, the stock fan runs full blast until the drive kicks in and then spins down. It spins up when the card is under load. The replacement fan runs full blast constantly and never spins down. I asked XFX support why this would be, but they have not responded.
> 
> It's a simple 2 pin fan header. Power and ground. What could possibly cause this behavior? I would consider buying a fan replacement, but I'm concerned they will behave the same. Can anyone shed any light on what's happening, and what I could do to fix it?
> 
> Thanks!


I replaced my fan with a Venturi HP-12. No issues so far.


----------



## ljreyl

Hey Xero. Happy Thanksgiving.

After doing some testing, it seems DX9 games are the ones that can't disable crossfire. Everything else works.

Dying Light - Works
StarCraft 2 - Nope!
Call of Duty Black Ops 1 - Nope
Call of Duty Advanced Warfare - Works

I'm sure they'll fix this next time around. They added FreeSync and frame-rate control for DX9 games, so I can't complain. Now I can fully utilize all of my games on this BEAUTIFUL XR341CK. Ultrawide FTW, if you haven't done so already.


----------



## bobbavet

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just as a general warning and observation with regard to the Crimson driver. The 295X2 does not seem to disable crossfire.


I don't have this problem. I have to disable crossfire to get War Thunder to run properly; it's been like this since the new WT graphics engine.

It makes me sad. WT used to run brilliantly with crossfire. Hope I don't have to wait another two years for it to be solved again.

Not surprised to hear it happening, though. Crimson is as buggy as ever, with different issues on the same cards.

Same **** rolled in glitter.

Oh well, it was nice to dream, AMD. *shrug*


----------



## xer0h0ur

Quote:


> Originally Posted by *ljreyl*
> 
> Hey Xero. Happy Thanksgiving.
> 
> After doing some testing, it seems DX9 games are the ones that can't disable crossfire. Everything else works.
> 
> Dying Light - Works
> StarCraft 2 - Nope!
> Call of Duty Black Ops 1 - Nope
> Call of Duty Advanced Warfare - Works
> 
> I'm sure they'll fix this next time around. They added in freesync and frame rate control for DX9 games so I can't complain. Now I can fully utilize all of my games on this BEAUTIFUL XR341CK. Ultra Wide FTW if you haven't done so already.


Thank you for that. It's good to finally hear back from someone about this; you're the only person who has confirmed or tested anything since I mentioned it. At least it's isolated to DX9 games, and yeah, I sure hope they fix it. If it's not too much of a bother, can you use the bug-report form in the Crimson software to report this as well? I already did, but you know the name of the game: more reports, more attention.

Edit: Forgot to say Happy Thanksgiving as well.


----------



## NBrock

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just as a general warning and observation with regard to the Crimson driver. The 295X2 does not seem to disable crossfire. Doesn't matter if you globally turn off crossfire, if you're doing it within a game profile or both. Although I admit I have only tested this so far with CS:GO so I can't say with any certainty if its affecting all games, DX9 games or just CS:GO. All I know is that CS:GO was exhibiting nasty hitching from framerate drops and ******ed amounts of frametime variance which is exactly what used to happen on the Catalyst drivers if Crossfire was enabled. So despite disabling crossfire globally and within the CS:GO game profile, Afterburner was confirming both of the 295X2's GPUs were clocking up and sharing a processing load. I was only able to work around this issue by disconnecting my monitor from the 295X2 and connecting it to the 290X so that when I played games it was GPU3 (aka the 290X) that was being used by itself to render the game. This was also confirmed through Afterburner, single GPU clocking up and processing a load.


You know, I thought that's what was happening but never really tested it. Battlefront acted funny in crossfire before the new drivers, so I just ran it disabled. After the update it does the exact same thing even if I set crossfire to disabled for the game or globally. I am going to test real quick and see if the second GPU is being put under load.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Thank you for that. Its good to finally hear back from someone about this. You're the only person that has confirmed or tested anything else since I mentioned this. At least its isolated to DX9 games and yeah I sure hope they fix it. If its not too much of a bother can you use the bug report form in the Crimson software to report this as well? I already did but you know the name of the game, more reports more attention.
> 
> Edit: Forgot to say Happy Thanksgiving as well


Done deal. It's good you at least found a workaround.

Do you play any Blizzard games? I kinda need a partner for Archon mode in SC2.


----------



## NBrock

That is the case. In Battle Front it is not disabling crossfire.

I reported it. I am going to test more games when I have a chance.


----------



## DrR0Ck

Is anyone else having terrible performance in Pinball FX2 after updating to the new Crimson driver?


----------



## fat4l

Today I did some Fraps benchmarking of *Crysis 3*, Welcome to the Jungle.
134s benchmark, crossfire 290X (Ares III).
All max details, 1440p, SMAA 2x MGPU.
My results:
*Stock clocks, 1030/1250MHz:*
Min: 59
Max: 108
Avg: 71.127

*Overclocked clocks, 1200/1700MHz(1500Timings):*
Min: 62
Max: 132
Avg: 85.517

Difference: 20.2%

*3DMark Firestrike eXtreme:*
http://www.3dmark.com/compare/fs/6593207/fs/6590481


Difference in Graphics Score: 19.7%

Nice scaling! Start modding your BIOSes, people.

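For anyone double-checking, the quoted percentage falls straight out of the average FPS numbers. A quick sketch, using the values from this post:

```python
def pct_gain(stock_avg, oc_avg):
    """Percentage improvement of the overclocked run over the stock run."""
    return (oc_avg / stock_avg - 1.0) * 100.0

# Crysis 3 "Welcome to the Jungle", 134s run, averages from the post above.
crysis = pct_gain(71.127, 85.517)
print(f"Crysis 3 avg FPS gain: {crysis:.1f}%")  # 20.2%
```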

----------



## DrR0Ck

Quote:


> Originally Posted by *F4ze0ne*
> 
> I replaced my fan with a Venturi HP-12. No issues so far.


Curious: the Venturi has a 4-pin connector. Do you have it plugged into the motherboard, a fan controller, or the card?


----------



## Alex132

Not gonna mod my BIOS in summer, especially with no AC and ~35-40°C ambient temperatures. Might undervolt it, though.

Plus I can achieve higher than that with +100mV, so there's no need. And I wouldn't go above +200mV anyway.


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> Not gonna mod my BIOS in summer, especially with no-AC and ~35-40'c ambient temperatures. Might undervolt it though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Plus I can achieve higher than that with +100mV... so no need. And I wouldn't go above +200mV anyway.


It's about the clock/performance ratio, tweaking the BIOS a bit, etc.

So far I've seen only two people modding their 295X2s. Idk why people aren't more eager to mod them.


----------



## Mega Man

No time


----------



## bobbavet

Quote:


> Originally Posted by *DrR0Ck*
> 
> Is anyone else having terrible performance in Pinball FX2 after updating to the new Crimson driver?


I didn't even know this game existed other than on mobile. I downloaded it, and yes, it is poor, i.e. laggy. Tried everything and no fix. How about you?


----------



## DrR0Ck

Quote:


> Originally Posted by *bobbavet*
> 
> I didn't even know this game existed other than on "mobile". I downloaded and yes it is poor ie laggy. Tried everything and no fix. How about you?


I'll probably roll back to the last CCC driver and see if that fixes it. If you like pinball, FX2 is about as good as it gets on PC. I play a fair amount, and it was smooth as silk until the Crimson driver installed. When I get a chance, I will roll back and report the results.


----------



## F4ze0ne

Quote:


> Originally Posted by *DrR0Ck*
> 
> Curious - The Venturi has a 4- pin connector. Do you have it plugged into the motherboard, a fan controller or the card?


It's plugged into the card. I had to buy an adapter that goes from 4-pin to 3-pin for this fan.


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> Its about clock/performance ratio
> 
> 
> 
> 
> 
> 
> 
> 
> Tweaking the bios a bit etc
> 
> So far I've seen only 2 ppl modding their 295X2. Idk why ppl aren't eager to mod their 295X2s


Honestly, it's a nightmare for me to mod my card. I don't feel the reward is great enough for the risk, plus all the work of removing it from my open loop if I trashed it. I have been taking fewer and fewer risks with overclocking lately, valuing stability over added performance more and more.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Honestly its a nightmare for me to mod my card. I don't feel like the reward is great enough for the risk and all that work to remove it from my open loop if I trashed it. I am taking less and less risks with overclocking as of late. I have been valuing stability over added performance more and more.


Well, modding timings is not risky, in my opinion.
290X cards use straps, with different timings for each.
For example, if you clock your mems to 1500MHz, they will use the timings from the 1376-1500MHz strap. If you then clock your mems to 1510MHz, your performance will decrease, because the mems will use the 1501-1625MHz strap timings.
From my testing and observations, you can easily clock your mems 200-250MHz above the strap you are using.
This means that, for example, if you use the 1500 strap timings (1376-1500) you can possibly clock your mems to 1700MHz+ (if they can clock that high at all).
If your max is 1500MHz, you can possibly use the 1250 strap timings => increased performance.

Another benefit is that you can adjust power limits, or your throttling limit (from 75°C to 85°C, for example), so you don't need any overclocking software.

Another one is adjusting the aux voltage (not 100% sure the 295X2 supports this), which may help improve stability when overclocking the memory.

Then, of course, core voltage and the fan curve (not sure if the 295X2 supports it).

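The strap behaviour described above can be sketched as a small lookup. This is illustrative only: the 125MHz ladder and its exact bounds are assumptions extrapolated from the two ranges quoted in the post (1376-1500 and 1501-1625):

```python
# Assumed Hawaii memory-strap ladder: each strap's timings apply to
# clocks up to its upper bound, in (assumed) 125MHz steps.
STRAP_UPPER_BOUNDS = [1125, 1250, 1375, 1500, 1625, 1750]

def strap_for(mem_clock_mhz):
    """Return the upper bound of the strap whose timings a given
    memory clock will use."""
    for upper in STRAP_UPPER_BOUNDS:
        if mem_clock_mhz <= upper:
            return upper
    return STRAP_UPPER_BOUNDS[-1]

# The pitfall from the post: 1500MHz still sits in the 1376-1500 strap,
# but 1510MHz drops into the looser 1501-1625 strap and loses performance.
print(strap_for(1500))  # 1500
print(strap_for(1510))  # 1625
```

The timing mod the post describes is then just copying the tighter strap's timings into the higher strap's slot in the BIOS, so the faster clock keeps the faster timings.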

----------



## steezebe

Quote:


> Originally Posted by *xer0h0ur*
> 
> Honestly its a nightmare for me to mod my card. I don't feel like the reward is great enough for the risk and all that work to remove it from my open loop if I trashed it. I am taking less and less risks with overclocking as of late. I have been valuing stability over added performance more and more.


Amen. I really hate the red-screen-of-something-went-wrong-or-got-too-hot when I'm two hours into a heavy Fallout session. I cry inside a little bit when that happens...

Right now I'm at 1058MHz core, 1358MHz mem with no power increase, and I'm totally happy with it, mostly because I haven't had a stutter in any game. Good enough, eh?

I was pushing much higher clocks, but for me, as long as I never drop below about 30ish FPS with an ENB, I'm good to go! (FPS players may cringe at the thought.)


----------



## Alex132

Quote:


> Originally Posted by *fat4l*
> 
> *
> Another benefit is, you can adjust power limits or you can adjust your throttling limit(from 75->85C for example)* so you don't need to use any software for oveclocking.


That is a very bad idea; there is a reason AMD put this limiter in place, and it's not just to be draconian.

- - - -

Anyone have a rough idea of what undervolting does to this card? I want to undervolt it for summer; I can't stand gaming with this card for too long.


----------



## Dagamus NM

Quote:


> Originally Posted by *Mega Man*
> 
> No time


Ha, amen to that. I was up late doing my nuclear engineering homework the other night and sacrificed a few extra minutes of sleep to create folders somewhere I would remember them and back up the BIOS from both chips on both cards, in both BIOS-switch positions.

ColeriaX sent me a link to a modded BIOS that got him some pretty awesome synthetic results on the same cards as mine.

I will have to go review how to update the BIOS and probably download some programs to do it.

Anybody know where the step-by-step instructions for flashing the dual-chip card might be located? I know I have seen them, probably in this thread, but my searches turned up too many dead ends, so I figured I would just ask.


----------



## bobbavet

I am taking my first steps into some OC. I am liking this per-game OC feature of Crimson.

At present I'm using my game War Thunder, a single-GPU game atm.
I have had the GPU @ 1260MHz and mem @ 1350MHz at stock power. My temps are around the mid-50s °C, and it is a 30°C day inside.
The only glitch I had over half an hour of play was a couple of frames failing to render, once. Literally a couple of seconds.
Where do I go from here? Do I start bumping up the power for further clocks?

I also noticed that when I ran 3DMark with the OC and crossfire, the GPU never went above the stock 1018MHz.
My temps were in the 60s °C, so I see no reason it would have throttled back to stock.

Cheers guys


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> That is a very bad idea, there is a reason AMD put this limiter in-place - and it's not to just be draconian.
> 
> - - - -
> 
> Anyone have a rough idea of what undervolting does to this card? I want to undervolt it for summer, I can't stand gaming with this card for too long


Well, for example, my Ares has a limit of 85°C from stock.
Also, some people live in hot countries, so for them adjusting this limit might be beneficial (running stock clocks in hot ambients, for example).
So it depends...
Quote:


> Originally Posted by *Dagamus NM*
> 
> Ha, amen to that. I was up late doing my nuclear engineering homework the other night and sacrificed a few extra minutes of sleep to create folders somewhere I would remember them and backup up the bios from both chips on both cards and in both bios switch positions.
> 
> ColeriaX sent me a link to a modded bios that got him some pretty awesome synthetic results for the same cards as I have.
> 
> I will have to go review how to update the bios and probably download some programs to do it.
> 
> Anybody know where the step by step instructions for flashing the dual chip card might be located? I know I have seen it, probably in this thread but my searches came up with too many results that were dead ends so I figured I would just ask.


A modded 295X2 BIOS? Hmmm. I've seen only one person on the OCN forums who modded his 295X2.
What mods were done in that BIOS?

Going to bed. I will give you more "how to" info when I wake up.


----------



## Mega Man

Your one card will actually have "2" GPUs on it; follow the guide for multi-GPU:

http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards/0_100

ONLY FLASH ONE SWITCH. It took a few tries to get mine to work; I kept attempting to flash the main BIOS onto the slave and vice versa.
Below is what it would be on a 295X2, but I may have the master/slave backwards.

Also, you DO NOT want to flash BOTH with the same BIOS: master BIOS to master, slave BIOS to slave!!!!

command ........................................ gpu, and slave or master
atiwinflash -f -p 0 7970XFX.rom . first gpu master
atiwinflash -f -p 1 7970XFX.rom . first gpu slave
atiwinflash -f -p 2 7970XFX.rom . second gpu master
atiwinflash -f -p 3 7970XFX.rom . second gpu slave

IIRC you have to search for the newest version of ATIFlash, and I had to make a boot disk (I use Rufus).


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> your one card will actually have "2" gpus on it, floow the guide for multi gpu
> 
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards/0_100
> 
> ONLY FLASH ONE SWITCH it took a few tries to get mine to work, i keps attempting to flash main on slave and vise versa
> below is what it would be on a 295x2, but i may have the master/slave backwards,
> 
> also you DO NOT want to flash BOTH with the same bios, master bios to master slave bios to slave !!!!
> 
> command ........................................ gpu, and slave or master
> atiwinflash -f -p 0 7970XFX.rom . first gpu master
> atiwinflash -f -p 1 7970XFX.rom . first gpu slave
> atiwinflash -f -p 2 7970XFX.rom . second gpu master
> atiwinflash -f -p 3 7970XFX.rom . second gpu slave
> 
> iirc you have to search for the newest version of ati flash and i had to make a boot disk ( i use rufus )


No. You flash only one "switch position". Basically, leave the card as it is.
Use the method here:
http://forums.overclockers.co.uk/showthread.php?t=18558655

Then, if you booted from USB, you will see
C:
Then you type: atiflash -i
This will show you the "position" of your "cards" (you only have one card with two cores, but you get me, right?).
Something like 0 = master, 1 = slave.

Then you do
atiflash -p -f 0 biosnamemaster.rom
atiflash -p -f 1 biosnameslave.rom
(0 and 1 above are the positions of your cards/cores from atiflash -i)

Then Ctrl+Alt+Del to restart. Done.

Make sure you have both BIOSes, for the master and the slave "card" (you have two cards on one PCB).

In the thread I linked (about the 7990), everything is explained properly.
Any questions, let me know.

Also one tip: if you download the latest ATIFlash, rename it to atiflash.exe.

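The per-core flashing recipe above boils down to one atiflash invocation per adapter index. A tiny sketch that only builds those command strings: the flag order mirrors the post, `flash_commands` and the ROM filenames are illustrative, and real indices should always be confirmed with `atiflash -i` first:

```python
def flash_commands(bios_files):
    """Build one atiflash command per GPU core, in adapter-index order.

    Mirrors the syntax used in the post above; this only constructs
    strings, it does not flash anything.
    """
    return [f"atiflash -p -f {idx} {rom}" for idx, rom in enumerate(bios_files)]

# A 295X2 exposes two cores: master (index 0) and slave (index 1),
# each with its OWN BIOS image -- never the same file for both.
for cmd in flash_commands(["biosnamemaster.rom", "biosnameslave.rom"]):
    print(cmd)
```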

----------



## Mega Man

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> your one card will actually have "2" gpus on it, floow the guide for multi gpu
> 
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards/0_100
> 
> *ONLY FLASH ONE SWITCH* it took a few tries to get mine to work, i keps attempting to flash main on slave and vise versa
> below is what it would be on a 295x2, but i may have the master/slave backwards,
> 
> also you DO NOT want to flash BOTH with the same bios, master bios to master slave bios to slave !!!!
> 
> command ........................................ gpu, and slave or master
> atiwinflash -f -p 0 7970XFX.rom . first gpu master
> atiwinflash -f -p 1 7970XFX.rom . first gpu slave
> atiwinflash -f -p 2 7970XFX.rom . second gpu master
> atiwinflash -f -p 3 7970XFX.rom . second gpu slave
> 
> iirc you have to search for the newest version of ati flash and i had to make a boot disk ( i use rufus )
> 
> 
> 
> no.
> u flash only one "switch position".
> Basically, leave card as it is.
> Use this method here:
> http://forums.overclockers.co.uk/showthread.php?t=18558655
> 
> Then basically, if you booted from usb, u will see
> C:
> Then u type: atiflash -i
> This will show you the "position" of your "cards"(U only have one card with 2 cores, but u get me right)
> Something like 0- master, 1-slave
> 
> Then u do
> atiflash -p -f 0 biosnamemaster.rom
> atiflash -p -f 1 biosnameslave.rom
> (0 and 1 above mean the position of your cards/cores=atiflash -i)
> 
> then ctrl + alt + del to restart. done.
> 
> Make sure u have both bioses, for master and slave "card"(u have 2 cards on 1 pcb)
> 
> In the thread I linked(about 7990), theres everything explained properly.
> Any questions let me know.
> 
> Also one tip. if you download the latest atiflash, rename it to atiflash.exe.
Click to expand...

I just said that: ONLY one switch, but BOTH GPU BIOSes.


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> i just said that. ONLY 1 switch but BOTH gpu bios


That was the confusing part...
Quote:


> Originally Posted by *Mega Man*
> 
> command ........................................ gpu, and slave or master
> atiwinflash -f -p 0 7970XFX.rom . first gpu master
> atiwinflash -f -p 1 7970XFX.rom . first gpu slave
> atiwinflash -f -p 2 7970XFX.rom . second gpu master
> atiwinflash -f -p 3 7970XFX.rom . second gpu slave


----------



## Mega Man

It was a copy-paste from the thread I posted; above it there are directions for using two different BIOSes.


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> Anyone have a rough idea of what undervolting does to this card? I want to undervolt it for summer, I can't stand gaming with this card for too long


This can be done through the BIOS, easy.
I can help you with that, should you want me to.


----------



## Alex132

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Anyone have a rough idea of what undervolting does to this card? I want to undervolt it for summer, I can't stand gaming with this card for too long
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This can be done through bios, eassyyyyy.
> Can help u with that, should u want me to
Click to expand...

I don't want to mod the BIOS; I just want to know if people have tried to undervolt these cards, and how well they do.


----------



## xer0h0ur

The only reason the 295X2's max temp is set to 75°C is the original CLC water cooler; it's not meant to withstand higher temperatures. If you have it waterblocked, raising the max temperature is also pretty much pointless, since you shouldn't be reaching those temps on an open loop.


----------



## Alex132

Quote:


> Originally Posted by *xer0h0ur*
> 
> The only reason the 295X2's max temp is set to 75 is due to the original water cooling CLC. Its not meant to withstand higher temperatures. If you have it waterblocked its also pretty much pointless to raise the max temperature as well since you shouldn't be reaching those temps on an open loop.


Hence you can raise the temps, sure, but if you're on the stock CLC you will damage the pump. And if you're on water and need to raise the temps, you're doing something wrong.

I.e., it's completely pointless to raise the maximum temps, and only potentially damaging.


----------



## Dagamus NM

Quote:


> Originally Posted by *Alex132*
> 
> Hence why you can raise the temps, sure, but if you're on stock CLC you will damage the pump. And if you're on water and need to raise the temps - you're doing something wrong.
> 
> ie; it's completely pointless to raise the maximum temps and only potentially damaging.


That is correct.


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> I don't want to mod the BIOS, I just want to know if people have tried to undervolt these cards - and how well they do.


At idle they can go as low as 0.8V (or so); you have to try this for yourself. I believe, however, that with utilities such as Afterburner you can only lower the volts by 100mV, which will move you to 0.868V.
The same applies in 3D: -100mV should be the max when using Afterburner. iTurbo allows -100mV as well; not sure about TriXX.
These utilities use offset voltage control, meaning that if you decrease the volts by 100mV, they are decreased in both 2D and 3D.
If you want to go lower, you need to mod the BIOS.

I also believe all* 295X2s use 1.25V as the 3D voltage, which is higher than necessary. For example, my Ares III uses 1.19375V and 1.225V for the cores.

You can check this with The Stilt's VID APP, which shows you the voltage your card uses in 3D.


----------



## Alex132

Unless I am doing something wrong, I can't find the voltage-control setting in the new TriXX. It was there in the old one...

Also, my voltage under load is ~1.185V.


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> Unless I am doing something wrong, I can't find the voltage control setting on the new Trixx. It was there on the old one...
> 
> 
> 
> Also my voltage under load is ~1.185v


@trixx, weird. use iTurbo. it's much better.
@voltage, run the program I linked. "Also my voltage under load is ~1.185v" is very inaccurate.

Link:- The Stilt's VID APP
Link:- HIS iTurbo v1.6.6


----------



## Mega Man

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> The only reason the 295X2's max temp is set to 75 is due to the original water cooling CLC. It's not meant to withstand higher temperatures. If you have it waterblocked, it's also pretty much pointless to raise the max temperature, since you shouldn't be reaching those temps on an open loop.
> 
> 
> 
> So you can raise the temps, sure, but if you're on the stock CLC you will damage the pump. And if you're on water and need to raise the temps, you're doing something wrong.
> 
> i.e. it's completely pointless to raise the maximum temps, and only potentially damaging.

thank you for making me laugh - not that you did/said anything wrong, but i have been having arguments with people recently on exactly this: claims that when water cooling, the temps are too high, or that the cost is a joke compared to a clc and it's the gpu's fault. another proof for me, thanks
Quote:


> Originally Posted by *Alex132*
> 
> Unless I am doing something wrong, I can't find the voltage control setting on the new Trixx. It was there on the old one...
> 
> 
> 
> Also my voltage under load is ~1.185v


not the only one, i don't have it either. i had to go back to the old version, which causes my pc to freeze when i press apply


----------



## fat4l

guys....srsly, try iTurbo...You'll love it...


----------



## Mega Man

i dunno, does it allow +300mV?


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> i duuno does it allow for +300mv ???


it allows up to +400mV.
It also has VRM overheating option + AUX(VDDCI) voltage.


----------



## Mega Man

ok ill give it a shot, if you break my gpus though...... you buy em ( joking ) or am i

yea DL link seems to be broken :/

yep, broken

http://www.hisdigital.com/iturbo/download/iTurbo_v1.5.0.exe


----------



## Mega Man

@fat4l can you upload a zip ?

only +100mv on 295x2 and causes bsod on win 10


----------



## fat4l

I already linked it above








https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view
You MUST use the 1.6.6 version


----------



## Mega Man

it gives me an error that it could not create the driver file. looks like i need to do some work; working on it now


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> it gives me a error that it could not create driver file looks like i need to do some work, working on it now


this happens only if you try to run it twice (afaik)


----------



## Mega Man

you're right, it was buried, tyvm! still only allows 100mV though


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> your right it was buried tyvm ! still only allows 100mv though


That's weird. For me it shows +400mV.
Are u looking at vddc?
Do u have vddci slider available too?


----------



## Mega Man

no, i do not have the additional slider. i think this is because the ares 3 is not a 295x2, but two 290xs -

with that said, i can increase the voltage to +100mv on gpu 1a, switch to gpu 1b and do it again (200mv), then back to 1a and again (300mv), but my displays go 100% black, which i think is a hard-coded lock, as i have seen this before on my 79xxs

i would buy 2 ares2s, but the outputs suck; they should have just made it with 6 mini-DPs


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> no i do not have the additional slider, i think this is because the ares 3 is not a 295x2, but 2 290xs -
> 
> with that said i can increase the power to +100mv on gpu 1a, switch to gpu 1b and do it again ( 200mv ), then back to 1 a and again ( 300mv) but my displays go 100% black which is a hard coded lock i think as i have seen this before on my 79xxs
> 
> i would buy 2 ares2s, but the outputs, suck, they should of just made it have 6 minidps


U can also use an older trixx which allows +200mV. Or bios modding


----------



## Mega Man

Or I can use the universal problem fixer. Beer!


----------



## DrR0Ck

Quote:


> Originally Posted by *bobbavet*
> 
> I didn't even know this game existed other than on "mobile". I downloaded and yes it is poor ie laggy. Tried everything and no fix. How about you?


Reverted back to the Catalyst 15.11.1 driver and all works fine. So definitely driver related.


----------



## xer0h0ur

Quote:


> Originally Posted by *Mega Man*
> 
> Or I can use the universal problem fixer. Beer!


Can confirm. Beer works.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> no i do not have the additional slider, i think this is because the ares 3 is not a 295x2, but 2 290xs -
> 
> with that said i can increase the power to +100mv on gpu 1a, switch to gpu 1b and do it again ( 200mv ), then back to 1 a and again ( 300mv) but my displays go 100% black which is a hard coded lock i think as i have seen this before on my 79xxs
> 
> i would buy 2 ares2s, but the outputs, suck, they should of just made it have 6 minidps
> 
> 
> 
> U can also use an older trixx which allows +200mV. Or bios modding

Trixx with the Sapphire OC bios allows up to +300mV on the 295x2, and iTurbo allows up to +400mV on 290x's (which the Ares III is).

Trust me, +300mV is plenty on a 295x2; most cards get display corruption around +250mV or so.

The Ares and Devil both read and act like 290x's in CF; the 295x2 is a different beast


----------



## Mega Man

mine does not corrupt anything. just black screens :/
i love the 290xs/7970s ! only safety is ME !!!!!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> mine does not corrupt anything. just black screens :/
> i love the 290xs/7970s ! only safety is ME !!!!!!


Weird, for most people I've talked to about it (including my own card) the display corrupts and only a reboot can fix it. then again, these have all been using DVI; I assume you'd be using DP, right?


----------



## Mega Man

Correct (and only dp) and only reboot works


----------



## Dagamus NM

Quote:


> Originally Posted by *Mega Man*
> 
> your one card will actually have "2" gpus on it, follow the guide for multi gpu
> 
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards/0_100
> 
> ONLY FLASH ONE SWITCH. it took a few tries to get mine to work; i kept attempting to flash main on slave and vice versa
> below is what it would be on a 295x2, but i may have the master/slave backwards,
> 
> also you DO NOT want to flash BOTH with the same bios: master bios to master, slave bios to slave !!!!
> 
> command ........................................ gpu, and slave or master
> atiwinflash -f -p 0 7970XFX.rom . first gpu master
> atiwinflash -f -p 1 7970XFX.rom . first gpu slave
> atiwinflash -f -p 2 7970XFX.rom . second gpu master
> atiwinflash -f -p 3 7970XFX.rom . second gpu slave
> 
> iirc you have to search for the newest version of ati flash and i had to make a boot disk ( i use rufus )


So you are saying that there should be a different Bios for Master and Slave?

ColeriaX gave me the bios link but it is only a single file. If I use it for both master and slave I suppose I will brick my card?
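For reference, the per-adapter scheme in the quoted guide (even adapter index = master, odd = slave) can be sketched as a dry-run script; the ROM filenames are placeholders and the script only echoes the atiwinflash commands rather than running them:

```shell
#!/bin/sh
# Dry-run sketch of the quoted flashing scheme for a dual-GPU card:
# adapter indices 0/2 are masters and 1/3 are slaves, and each must get
# its matching image. Filenames below are hypothetical placeholders.
MASTER_ROM="295x2_master.rom"
SLAVE_ROM="295x2_slave.rom"

for idx in 0 1 2 3; do
    if [ $((idx % 2)) -eq 0 ]; then
        rom="$MASTER_ROM"   # even adapter index -> master bios
    else
        rom="$SLAVE_ROM"    # odd adapter index -> slave bios
    fi
    echo "atiwinflash -f -p $idx $rom"
done
```

Flashing the same file to every index (the single-file situation described above) would just mean one of the two images is wrong and needs a reflash, per the replies below.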


----------



## Mega Man

yea, i have seen that issue. no, it will not brick your card; you will just have to reflash it


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Correct (and only dp) and only reboot works


That explains the blackscreen then, using DP + high voltage results in black screen when i use it as well but when I'm using DVI the display corrupts instead


----------



## fat4l

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That explains the blackscreen then, using DP + high voltage results in black screen when i use it as well but when I'm using DVI the display corrupts instead


the same here. but why is it happening ?
it's rly weird.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That explains the blackscreen then, using DP + high voltage results in black screen when i use it as well but when I'm using DVI the display corrupts instead
> 
> 
> the same here. but why is it happening ?
> it's rly weird.

Voltage corrupts the display controller, that's all there is to it really

Most extreme overclockers use VGA monitors + cables for this reason.

I'm not 100% sure about this but i think this is right:

VGA, DVI, HDMI, DP

^ That's the order of most to least reliable I/O ports when using higher voltage levels; someone with a bit more knowledge than I might be able to explain the science behind it all, though


----------



## Dagamus NM

I found the post where Cool Mike put the slave bios and was able to update both master and slave on both cards. It is running but I haven't tested anything yet. I had to ditch class to get this done as it is. Luckily the lectures are all posted online. I love technology.

Mainly because I have to go back and rewatch the lectures I have sat through; my instructor goes so fast and assumes a lot of fresh prior knowledge. I know the material, but a lot of it is not fresh.

A one hour lecture takes me about three to get through with pausing and replaying as well as looking things up.

Anyway, now both of my xfx cards have the sapphire OC bios on them. Just need a custom bios that opens these up a bit more. Full custom loop, and temps are at 34-36C idle, with water temps of 27.4C at the pump and 27.5C just after my cards. The 34-36C is from HWiNFO and probably not completely accurate, whereas the water temps are from my Aquacomputer airplex modularity system pump/res/rad combo connected to two 420mm rads with Cougar fans in push/pull, as well as the AC flow meter. I show a flow rate of 102.4LPH, and this is through 7 water blocks (EK Supremacy Evo Elite, RIVE VRM and SB, two memory and two r9 295x2 blocks).


----------



## DMatthewStewart

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Voltage corrupts the display controller, that's all there is to it really
> 
> Most extreme overclockers use VGA monitors + cables for this reason.
> 
> I'm not 100% sure about this but i think this is right:
> 
> VGA, DVI, HDMI, DP
> 
> ^ That's the order of most reliable to least reliable I/O ports when using higher level voltages, someone with a bit more knowledge than i might be able to explain the science behind it all though


I'm not sure where I ever heard it, but I've heard the same thing.
Quote:


> Originally Posted by *Mega Man*
> 
> yea i have seen that issue , no it will not brick your card. you will just have to reflash it


Also, my DP went black screen during an OC session and never recovered. I have that card sitting around. I wonder if a reflash will get DP working again. What do you think? BTW, this is for a 290 not 295x2


----------



## Mega Man

No idea. Worth a shot


----------



## Dagamus NM

After getting everything settled earlier today I ran GPUZ and saw that the exact same number is listed for all four GPUs. When I open the files in notepad it is all just gibberish to me so I really cannot tell if master and slave are the same. ColeriaX is saying that he only updated his master and left the slave alone.

I've updated GPU bios plenty in the past but this master/slave is really making me wonder what is actually necessary for mods on this card.


----------



## Mega Man

if you got them wrong it wouldn't boot. i know because i did it!


----------



## Dagamus NM

I'm not thinking I got them wrong, just wondering why GPU-Z shows all four are the same bios.


----------



## Mega Man

mine do as well


----------



## fat4l

Quote:


> Originally Posted by *Dagamus NM*
> 
> I'm not thinking I got them wrong, just wondering why there GPUz shows all four are the same bios.


When you are flashing the bios in DOS, it will show you an "M" or an "S" at the end of the bios name. GPU-Z doesn't show this.


----------



## Dagamus NM

That explains it. Maybe after the semester is over I will have some time to toy with the bios editor.


----------



## ljreyl

So, little issue I found but doesn't matter much IMO.
Is anyone having problems with powertune sticking in a 295x2/290x trifire setup?
I can change powertune with the 295x2 but the 290x always reverts back to 0.
I've overclocked the 290x to 1030 core to match the sapphire OC bios I have. I don't need powertune really but decided to play with it and found this "small" issue.

Thoughts are appreciated.


----------



## fat4l

Quote:


> Originally Posted by *ljreyl*
> 
> So, little issue I found but doesn't matter much IMO.
> Is anyone having problems with powertune sticking in a 295x2/290x trifire setup?
> I can change powertune with the 295x2 but the 290x always reverts back to 0.
> I've overclocked the 290x to 1030 core to match the sapphire OC bios I have. I don't need powertune really but decided to play with it and found this "small" issue.
> 
> Thoughts are appreciated.


Well you can mod those values in bios so the cards will not need any software tweaks


----------



## xer0h0ur

Probably the weirdest thing that occurs with my cards in terms of performance is that when crossfire is globally disabled I can only overclock the vRAM on my 290X lightly to like 1300MHz but if crossfire is enabled I can bump the clock to 1500MHz without issue. Really makes no sense to me. The Hynix vRAM on my 295X2 can push 1700MHz without problems but that Elpida vRAM gives me black screen if I am connected to the 290X with crossfire disabled and clock it to 1500MHz. *shrug* I just leave it now at default 1250MHz to avoid any problems.


----------



## ljreyl

Quote:


> Originally Posted by *fat4l*
> 
> Well you can mod those values in bios so the cards will not need any software tweaks


Including powertune?
I'm kinda lazy to do more modding and bios flashing.


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Probably the weirdest thing that occurs with my cards in terms of performance is that when crossfire is globally disabled I can only overclock the vRAM on my 290X lightly to like 1300MHz but if crossfire is enabled I can bump the clock to 1500MHz without issue. Really makes no sense to me. The Hynix vRAM on my 295X2 can push 1700MHz without problems but that Elpida vRAM gives me black screen if I am connected to the 290X with crossfire disabled and clock it to 1500MHz. *shrug* I just leave it now at default 1250MHz to avoid any problems.


I believe it's because the VRam of the main GPU is the one you're using. I experienced that same thing too.
I just clock mine at 1450. It's honestly fast enough, even for 3440x1440p gaming.


----------



## xer0h0ur

Its mirroring the vRAM so as far as I know it is using the same amount of vRAM per GPU. So if its 3.5GBs of vRAM usage then on 3 GPUs you have 10.5GB of 12GB in use.

Edit: While crossfired of course.
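A quick sketch of that mirroring arithmetic, using the example figure of 3.5GB in use per GPU (the variable names are illustrative):

```python
# In CrossFire the frame data is mirrored, so each GPU holds its own copy
# and reported "usage" scales with GPU count, even though no single GPU
# ever sees more than its local 4 GB.

GPUS = 3             # 295X2 (two GPUs) + one 290X in trifire
PER_GPU_VRAM = 4.0   # GB physically attached to each Hawaii GPU
USAGE_PER_GPU = 3.5  # GB, the example figure from the post

total_in_use = GPUS * USAGE_PER_GPU      # 10.5 GB mirrored across the GPUs
total_installed = GPUS * PER_GPU_VRAM    # 12.0 GB nominal

print(f"{total_in_use} GB of {total_installed} GB in use")
```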


----------



## ljreyl

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its mirroring the vRAM so as far as I know it is using the same amount of vRAM per GPU. So if its 3.5GBs of vRAM usage then on 3 GPUs you have 10.5GB of 12GB in use.
> 
> Edit: While crossfired of course.


True. Maybe it's because bandwidth is tripled and less stress/load per GPU/Mem allow for higher overclocks? (just putting something out there lol)


----------



## fat4l

Quote:


> Originally Posted by *ljreyl*
> 
> Including powertune?
> I'm kinda lazy to do more modding and bios flashing.


Yep, including power limits.
*TDP
PL
TDC*
These are the 3 limits.

The stock 290X has limits of 208/208/200.
The stock 295X2 has 200/200/136.
The Sapphire OC 295X2 has 202/202/137.

Adding "power limit" in, let's say, Afterburner increases the first two limits, i.e. TDP and PL.
TDC ("PowerTune limit for maximum thermally sustainable current that can be supplied by the VDDC regulator") cannot be changed by Afterburner.
You must be careful with this one.
The 295X2 has 6 phases per "card" (2x6 for the whole card): 4 phases for the gpu, 1 phase for the I/O bus and 1 phase for the memory, so 4+1+1.
Each phase can supply 40A, thus 4x40A = 160A max. The stock limit is 136A (137A for the OC bios). To avoid any throttling you can increase this value to, let's say, 150-155A max and still be in a safe range.
For the first two values, you can just increase them to, let's say, 300-350W (when using +50% in AB with the stock limits, you increase the limits from 200->300W (202->303W) anyway).
This doesn't mean the card will be using this amount of watts. It just means it won't throttle down when you hit those limits.
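The current and wattage arithmetic above can be written out explicitly; this is only a restatement of the numbers in the post, with illustrative variable names:

```python
# PowerTune headroom per the post: each GPU's VRM has four 40 A core
# phases, the stock TDC limit is 136 A, and a ~150-155 A limit still
# leaves margin under the 160 A phase budget.

CORE_PHASES = 4
AMPS_PER_PHASE = 40      # A per phase, per the post
STOCK_TDC = 136          # A, stock 295X2 TDC limit
SUGGESTED_TDC = 150      # A, the post's conservative bump

phase_budget = CORE_PHASES * AMPS_PER_PHASE  # 160 A absolute ceiling
headroom = phase_budget - SUGGESTED_TDC      # margin left at 150 A

# TDP/PL side: +50% in Afterburner on the stock 200 W limit gives 300 W.
STOCK_TDP_W = 200
boosted_tdp = STOCK_TDP_W * 1.5

assert STOCK_TDC < SUGGESTED_TDC < phase_budget
print(phase_budget, headroom, boosted_tdp)  # 160 10 300.0
```

As the post says, raising these limits doesn't make the card draw that much; it only moves the point at which PowerTune throttles.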


----------



## ljreyl

Quote:


> Originally Posted by *fat4l*
> 
> Yep, including power limits.
> *TDP
> PL
> TDC*
> These are the 3 limits.
> 
> Stock 290X has limits of 208/208/200.
> Stock 295X2 has 200/200/136.
> Sapphire OC 295X2 has 202/202/137.
> 
> Adding "power limit" in, lets say afterburner, increases the first two limits, ie TDP, PL.
> TDC("PowerTune limit for maximum thermally sustainable current by VDDC regulator that can be supplied")-cannot be changed by afterburner.
> You must be careful with this one.
> 295X2 has 6 phases per "card" (2x6 for the whole card). It has 4 phases for gpu, 1 phase for I/O bus and 1 phase for memory, so 4+1+1.
> Each phase can supply 40A, thus 4x40A=160A max. The stock limit is 136A(137A for OC bios respectively). To avoid any throttling you can increase this value to lets say, 150-155A max and still be in a safe range.
> For the first two values, you can just increase them to lets say, 300-350W(when using +50% in ab and using stock limits, you increase the limits from 200->300W(202->303W) anyway).
> This doesn't mean the card will be using this amount of watts. It just means it won't throttle down when you hit the limits.


Sent you a PM, I'm interested now lol


----------



## fat4l

Quote:


> Originally Posted by *ljreyl*
> 
> Sent you a PM, I'm interested now lol


Let me know if you need more info


----------



## cmoney408

Quote:


> Originally Posted by *ljreyl*
> 
> I took apart my 295x2, cleaned off the paste, reapplied thermal paste to the GPUs AND the PLX, reinstalled and BOOM


Quote:


> Originally Posted by *Dagamus NM*
> 
> I put it on there. When I pulled it this morning the stock plate definitely made strong contact with the 1.5mm pad over the back of the PLX. Exactly what you have marked in red.


I want to replace the thermal paste with Grizzly Kryonaut. i was thinking since i am in there i might as well replace the thermal pads (and add one to the PLX). i want to do this 1 time and i want to do it right. i have a few questions i hope you can help with.

1) should i even replace the pads, or is it a waste of time?

2) what thickness of pad is stock and/or recommended for optimum cooling? (i see pads come in 0.5mm, 1mm, 1.5mm and 2mm)

3) does it matter what w/mk i go with? grizzly has pads rated at 8 w/mk, but fujipoly has 17 w/mk?

4) how many total mm's do i need? fujipoly has a 60x50mm pack and a 100 x 15mm pack. Grizzly has 120 x 20mm and 100 x 100 mm?

any other tips, advice or brand recommendations are appreciated. it seems like if i need a lot of padding it will be much more cost-effective to go with grizzly, but at the same time, if i am going to replace the pads, i want an actual gain in cooling performance to make it worth it.

thank you,

Carlos


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> I want to replace the thermal paste with Grizzly Kryonaut. i was thinking since i am in there i might as well replace the thermal pads (and add one to the PLX). i want to do this 1 time and i want to do it right. i have a few questions i hope you can help with.
> 
> 1) should i even replace the pads or is it a wast of time.
> 
> 2) what thickness of pad is stock and/or recommended for optimum cooling (i see pads comes in .5mm, 1mm , 1.5mm and 2mm?
> 
> 3) does it matter what w/mk i go with? grizzly has pads rated at 8 w/mk, but fujipoly has 17 w/mk?
> 
> 4) how many total mm's do i need? fujipoly has a 60x50mm pack and a 100 x 15mm pack. Grizzly has 120 x 20mm and 100 x 100 mm?
> 
> any other tips, advice or brands are appreciated. it seems like if i will need a lot of padding it will be much more cost effective to go with grizzly, but at the same time if i am going to replace the pads i want to actually have a gain in cooling performance to make it worth it.
> 
> thank you,
> 
> Carlos


1. vrm pads - replace; mem pads - don't replace (waste of money)
2. not sure what dimensions, but make sure you use the same thickness as the stock ones.
3. IT MATTERS A LOT. Don't buy those cheap ones. I tried 7W/mK ones and they were crap. CRAP. The only ones I would recommend are fujipoly Extreme Plus or fujipoly Ultra Extreme (the best!).
Read these 2 threads:
http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_30
http://www.overclock.net/t/1576426/fujipoly-thermal-pads-vrm-insanely-low-temps-must-have-recommended/0_30
4. not sure. check this thread for some info about it.

Tips: Go grizzly Kryonaut on the cores. Put the paste on the cores, spread it + heat it up with a hair dryer, then put the block on it and screw it down tightly so the amount of paste between the core and cooler is minimal.
I would recommend using the Kryonaut + pad combo on the vrms too (not just pads), as it improves the heat transfer even more (on both sides of the pads).
Don't worry about the mems too much; better cooling (pads) doesn't really help them in OC, and they don't run hot, so no need to cool them better.


----------



## cmoney408

Quote:


> Originally Posted by *fat4l*
> 
> 1. vrm pads- replace, mem pads-dont replace(waste of money)
> 2.not sure what dimennsions but make sure that u use the same thickness as stock ones.
> 3.IT MATTERS A LOT. Don't buy those cheap ones. I tried 7W/mK ones and they were crap. CRAP. The only ones I would recommend is fujipoly Extreme Plus or fujipoly Ultra Extreme(The best!).
> Read these 2 threads:
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_30
> http://www.overclock.net/t/1576426/fujipoly-thermal-pads-vrm-insanely-low-temps-must-have-recommended/0_30
> 4.not sure. check some info in this thread about it.
> 
> Tips:Go grizzly kryonaut on the cores. put the paste on the cores, spread it + heat it up with a hair dryer and put the block on it and screw it tightly so the amount of paste between the core and cooler is minimal.
> I would recommend using kryonaut + pad combo on vrms too(not just pads) as it improves the heat transfer even more(on both sides of pads).
> Don;t worry about mems too much, better cooling(pads) doens't rly help them in OC and they don't run hot so no need to cool them better.


good info, i will stick with fujipoly. i will also try applying kryonaut to both sides of the pads!

from photos i can find. i am guessing the memory pads are the 8 squares around each gpu (so i guess i just need to be careful and not try to disturb the pads since i will be leaving them in place)?

which are vrm pads (the 22 very tiny squares in the center of the card - row of 12 next to a row of 10)?

looks like i just need to figure out the thickness of the stock VRM pads!


----------



## Mega Man

Quote:


> Originally Posted by *cmoney408*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ljreyl*
> 
> I took apart my 295x2, cleaned off the paste, reapplied thermal paste to the GPUs AND the PLX, reinstalled and BOOM
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> I put it on there. When I pulled it this morning the stock plate definitely made strong contact with the 1.5mm pad over the back of the PLX. Exactly what you have marked in red.
> 
> 
> I want to replace the thermal paste with Grizzly Kryonaut. i was thinking since i am in there i might as well replace the thermal pads (and add one to the PLX). i want to do this 1 time and i want to do it right. i have a few questions i hope you can help with.
> 
> 1) should i even replace the pads or is it a wast of time.
> 
> 2) what thickness of pad is stock and/or recommended for optimum cooling (i see pads comes in .5mm, 1mm , 1.5mm and 2mm?
> 
> 3) does it matter what w/mk i go with? grizzly has pads rated at 8 w/mk, but fujipoly has 17 w/mk?
> 
> 4) how many total mm's do i need? fujipoly has a 60x50mm pack and a 100 x 15mm pack. Grizzly has 120 x 20mm and 100 x 100 mm?
> 
> any other tips, advice or brands are appreciated. it seems like if i will need a lot of padding it will be much more cost effective to go with grizzly, but at the same time if i am going to replace the pads i want to actually have a gain in cooling performance to make it worth it.
> 
> thank you,
> 
> Carlos

I don't know the size for the stock cooler.

Don't waste your money. Get name brand, but you don't need 11 or 17 W/mK. They are not worth the extra money unless you are trying for a golden card, and you would not have bought a dual gpu for that. Also, if you are keeping the stock cooling then there's really no purpose. If you take off the cooler, make sure you have some spare pads imo. They can and will tear.

Iirc you can change the tim without removing the heatsink, just the aios.

The 11W/mK are not as bad as the 17 (premium-wise) but still not worth it, especially given the lack of ocability with this card; you generally won't be pushing 1300/1700


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> good info, i will stick with fujipoly. i will also try applying kryonaut to both sides of the pads!
> 
> from photos i can find. i am guessing the memory pads are the 8 squares around each gpu (so i guess i just need to be careful and not try to disturb the pads since i will be leaving them in place)?
> 
> which are vrm pads (the 22 very tiny squares in the center of the card - row of 12 next to a row of 10)?
> 
> looks like i just need to figure out the thickness of the stock VRM pads!


Mems-16 squares(8 on each side) for each gpu.
For vrm location, I would look at https://shop.ekwb.com/EK-IM/EK-IM-3831109869093.pdf and https://shop.ekwb.com/EK-IM/EK-IM-3831109869086.pdf
Not sure if they use the same thickness as the oem but location and dimensions should be the same.


----------



## cmoney408

Quote:


> Originally Posted by *Mega Man*
> 
> Don't waste your money. Get name brand but you don't need 11 or 17 w/mk. If you take off the cooler make sure you have some pads imo. They can and will tear
> 
> Iirc you can change the tim without removing the heatsink. Just the aios.
> 
> The 11w/mk are not as bad as the 17 (premium wise) but still not worth it especially due to lack of ocability with this card you generally won't be pushing 1300/1700


i figure for the vrm i will go all out and get Fujipoly, just to have peace of mind.

but i know what you mean about the pads; i ripped some up when redoing the thermal paste on my PS3.

if i can find out the thickness of the memory pads i might buy a sheet of cheaper pads, just in case i rip or dirty any stock ones. or maybe i might just replace them all for the hell of it.

do you have any recommendations for a brand (or specific model/line) for the memory pads, something cheap (but better than or equal to stock), since it looks like there are 32 total (16 front and 16 back) 1x1 inch squares on this damn card. lol. though, i know i could probably skip replacing the back ones if i can keep the backplate in position while doing all the other upkeep.


----------



## Mega Man

Iirc anything ppcs sells is 6W/mK or better

In my first post I forgot to mention: get the bigger sheet / half sheet or w.e.

Better to have too much imo than too little. But for just the VRM the 15x100 should be ok


----------



## cmoney408

Quote:


> Originally Posted by *Mega Man*
> 
> Iirc anything ppcs sells is 6w/mk or better
> Better to have too much imo than too little. But for just the VRM the 15x100 should be ok


Quote:


> Originally Posted by *fat4l*
> 
> Tips:Go grizzly kryonaut on the cores. put the paste on the cores, spread it + heat it up with a hair dryer and put the block on it and screw it tightly so the amount of paste between the core and cooler is minimal.
> I would recommend using kryonaut + pad combo on vrms too(not just pads) as it improves the heat transfer even more(on both sides of pads).
> Don;t worry about mems too much, better cooling(pads) doens't rly help them in OC and they don't run hot so no need to cool them better.


just a heads up, guys: performance-pcs and grizzly have a promotion where you can get 1 paste and 1 pad for free. you have to review the item and pay up front, then they reimburse you (not including tax/shipping)

http://www.performance-pcs.com/grizzly-cashback

so i ended up getting the largest tube of kryonaut and a large 100x100 .5mm pad to cover all 32 chips (front and back). i should get about $50 back for these 2 items.

i went with 100x15 1.5mm and 100x15 1.0mm Fujipoly SARCON XR-m for the 2 vrm sections shown in this guide:
https://shop.ekwb.com/EK-IM/EK-IM-3831109869086.pdf


----------



## Mega Man

that is for the EK waterblock, not the stock cooler. The EK waterblock is made to EK standards; the stock cooler is made to AMD's...


----------



## cmoney408

yeah, i figured it was the closest i was going to get to finding out what thickness pads i need.


----------



## SAFX

Hey guys, what's the deal with Crimson driver; worth the upgrade? I'm interested in bug fixes for GTA 5, per release notes, may give it a try tonight.


----------



## caste1200

if you want my personal experience:
https://community.amd.com/message/2689998#2689998
I hate crimson


----------



## SAFX

Quote:


> Originally Posted by *caste1200*
> 
> if you want my personal experience:
> https://community.amd.com/message/2689998#2689998
> I hate crimson


Thanks, caste1200, will consider that before upgrading


----------



## Sgt Bilko

Quote:


> Originally Posted by *caste1200*
> 
> if you want my personal experience:
> https://community.amd.com/message/2689998#2689998
> I hate crimson


While I do share your opinion of Crimson's faults, I have to say that threatening to jump ship and go Green shows a certain level of immaturity, and doesn't exactly motivate the company to fix it any sooner.


----------



## cmoney408

Quote:


> Originally Posted by *electro2u*
> 
> That sounds spot on. These are the kind of observations I bet AMD software people could make good use out of to fix these dumb issues (seriously? the VRM fans are 100% until you get the drivers installed and then they are... pretty much not running). It's very quiet on the stock setting once the drivers kick in, but when the system boots the VRM fan on my 295x2 system kicks up to 100% and it's LOUD. That would definitely help with people's throttling issues if it were dynamic...
> 
> Can the fan/s on the rad be controlled even? I can't hear my Noctua NF-P12's on it. I have them both running off the card using a fan splitter but I think they are stuck at low speed possibly.


did you ever run into issues running 2 fans off of the single header on the 295x2? i am considering going push/pull. i am going to see how much a single corsair sp120 helps first. i am also replacing the thermal paste with grizzly kryonaut, the vrm pads with fujipoly, and the ram pads with Grizzly Minus 8 pads.


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> did you ever run into issues running 2 fans off of the singe header on the 295x2? i am considering going push pull. i am going to see how much a single corsair sp120 helps first. i am also replacing the thermal paste with grizzly kryonaut, vrm pads with fujipoly and ram pads with Grizzly minus 8 pads.


There shouldn't be any issues with 2 fans running from a single header. I have seen people doing it without any problems.


----------



## rakesh27

Not a problem running 2 fans off 1 header; I've owned my PowerColor 295x2 for a year, not one problem... I've done the same thing with my EVGA 980 Ti Hybrid, again no problems there...

What's awesome is that I'm running both cards in my rig at the same time; whenever I wanna play a game with a particular card I just move the monitor cable to the corresponding graphics card...

It's wicked, you get the best of AMD and Nvidia, and the graphics drivers work without any issues. People should try this...


----------



## cmoney408

Quote:


> Originally Posted by *electro2u*
> 
> That sounds spot on. These are the kind of observations I bet AMD software people could make good use out of to fix these dumb issues (seriously? the VRM fans are 100% until you get the drivers installed and then they are... pretty much not running). It's very quiet on the stock setting once the drivers kick in, but when the system boots the VRM fan on my 295x2 system kicks up to 100% and it's LOUD. That would definitely help with people's throttling issues if it were dynamic...
> 
> Can the fan/s on the rad be controlled even? I can't hear my Noctua NF-P12's on it. I have them both running off the card using a fan splitter but I think they are stuck at low speed possibly.


thanks for the info! i might try it.

dual cards is cool, but i think most people would rather have $1000-$1500 in cards working together to get more for their money (i would stick with 980ti in sli). plus the space it takes to have 2 individually water cooled gpu's means you either have a crowded PC or have to have a full tower case. well, that's just my viewpoint.

who knows, in the future with DX12 you might be able to use them together.


----------



## Dagamus NM

To each their own. While it is cool to have both working in a single rig the whole pulling out the cable to use the other card just seems nuts to me. I like having a bit of both AMD and nvidia, just like I like having some AMD and intel but that is what having multiple rigs is about.

The builds I have are pretty well split 50/50 on processors and GPUs, actually exactly 50/50 over the past six years.


----------



## cmoney408

really! to each their own. i can't stand having an nvidia and amd build in the same house, so i bought 2 houses next door to each other. when i play an nvidia optimized game i move into my nvidia house. when i'm playing an amd optimized game i move into the amd house. i try to keep both houses identical in all other aspects. why don't more people do this?


----------



## xer0h0ur

Quote:


> Originally Posted by *cmoney408*
> 
> really! to each their own. i cant stand having an nvidea and amd build in the same house. so i bought 2 houses next door to each other. when i play a nvidia optimized game i move into my nvidia house. when i am playing a amd optimized game i moved into the amd house. i try to keep both houses identical in all other aspects. why dont more people do this?


I wasn't able to find two houses in the Hamptons directly next to each other for sale, so I bought my Nvidia house in Dubai and I take my private jet there when I want to game on my Nvidia rig.


----------



## rakesh27

For me personally, everyone was raving about how the 980ti is a great card, so i thought what the hell, i want to go back to the green team and see what everyone was shouting about.

And space: a Corsair 900D case can easily fit these two cards and a separate aio cooler for my cpu without any problems...

I wanted the best of both worlds... so i thought let me try this.

It's pointless having 2 rigs to run 2 separate gpu manufacturers, when nowadays you can run it all in one rig and get the benefits of both.


----------



## cmoney408

some people might say it's pointless to have 2 different gpus to run the same games when a single strong gpu can muscle through even the worst optimized games. but again, it's all preference.


----------



## wermad

The Lucid Hydra failed to deliver on the promise of making both sides work together in harmony. Wishful thinking, I guess. I wouldn't mind four Ti's, but nvidia is notorious for having poor scaling beyond two cards.


----------



## Mega Man

Quote:


> Originally Posted by *rakesh27*
> 
> For me personally, everyone was raving about how the 980ti is a great card, so i thought what the hell i want to go back to the green team to see what everyone was shouting about.
> 
> And space, Corsair 900d case can easily fit these two cards and a seperate aio cooler for my cpu without any problems...
> 
> I wanted the best of both worlds... so i thought let me try this.
> 
> Its pointless having 2 rigs, to run 2 seperate gpu manufactures, where nowadays you could run it all in one rig and get the benefits of both.


Sure there is: if the nvidia PhysX driver detects an amd card it disables PhysX, iirc


----------



## xer0h0ur

I prefer not being forced into upgrading every generation but of course every person picks their pros and cons and rolls with it.


----------



## caste1200

Quote:


> Originally Posted by *Sgt Bilko*
> 
> While I do share your opinion of Crimson's faults I have to say that threatening to jump ship and go Green shows a certain level of immaturity and doesn't exactly motivate the company to fix it any sooner.


It's not a threat, I am doing it, ordered 2 evga 980ti hydro







just sick of AMD always doing stuff like this, I want to go premium now!


----------



## Sgt Bilko

Quote:


> Originally Posted by *caste1200*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> While I do share your opinion of Crimson's faults I have to say that threatening to jump ship and go Green shows a certain level of immaturity and doesn't exactly motivate the company to fix it any sooner.
> 
> 
> 
> It's not a threat, I am doing it, ordered 2 evga 980ti hydro
> 
> 
> 
> 
> 
> 
> 
> just sick of and always doing stuff like this, I want to go premium now!

Then it's not an idle threat is all.

And good for you, 980Ti's are beastly cards, i hope they serve you well


----------



## wermad

New loop still wip. Miss my rig


----------



## SAFX

*TOTAL NIGHTMARE with Crimson*

*Attempt #1*
Tried installing Crimson 15.11/.NET 4.5 on W7/64, performed a standard DDU cleanup first, ran the installer as admin; it crashes while installing the display driver with the message...

_"AMD install manager has stopped working"._

*Attempt #2*
Let's try Crimson BETA; nope! crashed again, same place, same error message!

*Reverting to Catalyst...*
Gave up on Crimson, it's evil. Switched back to Catalyst 15.8 beta (previous drivers), and now I'm screwed because _that_ installer is crashing too!









Questions
1) Anyone having similar issues?
2) Does AMD save crash reports I can read? I found a few under AppData/Local/Crashdumps, but they're in binary.
3) AMDCleanupUtility... I only used this after the first attempt failed (does it help or do more than DDU?)


----------



## SAFX

Quote:


> Originally Posted by *SAFX*
> 
> *TOTAL NIGHTMARE with Crimson*
> 
> *Attempt #1*
> Tried installing Crimson 15.11/.net45 on W7/64, performed standard DDD clean up first, ran installer as admin, crashes while installing display driver with message...
> 
> _"AMD install manager has stopped working"._
> 
> *Attempt #2*
> Let's try Crimson BETA; nope! crashed again, same place, same error message!
> 
> *Reverting to Catalyst...*
> Gave up on Crimson, it's evil. Switched back to Catalyst 15.8beta (previous drivers), and now I'm screwed because _that_ installer is crashing!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Questions
> 1) Anyone having similar issues?
> 2) Does AMD save crash reports I can read? I found a few under AppData/Local/Crashdumps, but they're in binary.
> 3) AMDCleanupUtility... I only used this after the first attempt failed (does it help or do more than DDD?)


OK, false alarm









Forgot I enabled this policy a few weeks back to prevent Windows from installing USB drivers when connecting/removing thumb drives.
All is well after setting this to *Not Configured*


----------



## xer0h0ur

Breh. Cmon breh.

All ribbing aside, the Crimson driver is so bad for 295X2s. You are completely unable to disable crossfire. It doesn't matter if you do it globally in the additional Radeon settings section or in the specific game profile; the setting simply doesn't stick.


----------



## Mega Man

Quote:


> Originally Posted by *caste1200*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> While I do share your opinion of Crimson's faults I have to say that threatening to jump ship and go Green shows a certain level of immaturity and doesn't exactly motivate the company to fix it any sooner.
> 
> 
> 
> It's not a threat, I am doing it, ordered 2 evga 980ti hydro
> 
> 
> 
> 
> 
> 
> 
> just sick of and always doing stuff like this, I want to go premium now!

Quote:


> Originally Posted by *SAFX*
> 
> *TOTAL NIGHTMARE with Crimson*
> 
> *Attempt #1*
> Tried installing Crimson 15.11/.net45 on W7/64, performed standard DDD clean up first, ran installer as admin, crashes while installing display driver with message...
> 
> _"AMD install manager has stopped working"._
> 
> *Attempt #2*
> Let's try Crimson BETA; nope! crashed again, same place, same error message!
> 
> *Reverting to Catalyst...*
> Gave up on Crimson, it's evil. Switched back to Catalyst 15.8beta (previous drivers), and now I'm screwed because _that_ installer is crashing!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Questions
> 1) Anyone having similar issues?
> 2) Does AMD save crash reports I can read? I found a few under AppData/Local/Crashdumps, but they're in binary.
> 3) AMDCleanupUtility... I only used this after the first attempt failed (does it help or do more than DDD?)


i don't understand, i have had zero problems with crimson

as to caste1200: you wanted to go premium but went generic. they are good cards, but 980tis are still meh imo


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Breh. Cmon breh.
> 
> All ribbing aside, the Crimson driver is so bad for 295X2s. You are completely unable to disable crossfire. Doesn't matter if you do it globally in the additional radeon settings section or in the specific game profile. It simply doesn't stick the disabling of crossfire.


Enabled globally, so not a problem for me









I'm liking Crimson so far, *1%* performance boost isn't much, but at least it performs _better_ and not worse compared to previous catalyst drivers.


----------



## xer0h0ur

I didn't say enabling crossfire is a problem. In fact crossfire is forced whether you like it or not. I said it's not possible to disable crossfire in games that don't support it or that play like trash with it enabled.


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Breh. Cmon breh.
> 
> All ribbing aside, the Crimson driver is so bad for 295X2s. You are completely unable to disable crossfire. Doesn't matter if you do it globally in the additional radeon settings section or in the specific game profile. It simply doesn't stick the disabling of crossfire.
> 
> 
> 
> Enabled globally, so not a problem for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm liking Crimson so far, *1%* performance boost isn't much, but at least it performs _better_ and not worse compared to previous catalyst drivers.

The actual display driver is the same one that's in the last beta








Quote:


> Originally Posted by *xer0h0ur*
> 
> I didn't say enabling crossfire is a problem. In fact crossfire is forced whether you like it or not. I said its not possible to disable crossfire in games that don't support it or play like trash with it enabled.


Pretty much this.

It's not an issue on games that work well with CF, obviously, but on titles such as Battlefront, which is pretty broken in crossfire, it means you basically cannot have Crimson installed if you want to play it.


----------



## xer0h0ur

Makes zero sense too. The CCC was not pretty but the manually created "application profiles" worked like a charm at individually disabling crossfire altogether for any given piece of software.


----------



## SAFX

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The actual display driver is the same one thats in the last beta


Thanks for the tip, did not know that








Quote:


> Originally Posted by *xer0h0ur*
> 
> I didn't say enabling crossfire is a problem. In fact crossfire is forced whether you like it or not. I said its not possible to disable crossfire in games that don't support it or play like trash with it enabled.


Gotcha....still not an issue for me until I get a game that puts me in that situation, I guess.


----------



## Mega Man

Agreed. Just don't support trashy programmers.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Agreed. Just don't support trashy programmers.


SLI works just fine from what I've seen and Single GPU on Battlefront runs flawlessly....what I'm annoyed about is not being able to use a 295x2 to play Battlefront unless i go back to CCC (so i can disable Crossfire).

Mind you the game (and everything else) works just fine on my 390x with Crimson


----------



## caste1200

Quote:


> Originally Posted by *Mega Man*
> 
> i dont understand i have had zero problems with crimson
> 
> as to caste1200, you wanted to go premium but went generic, they are good cards, but 980tis are still meh imo


why meh? a single 980ti in some cases outperforms the 295x2..


----------



## wermad

Anandtech has the 295x2 winning all the game benchmarks @ 4k with the exception of Civilization (that game doesn't seem to like crossfire, as the 290x beat the 295x2).

I'm on mobile and can't really get to all the other reviews. Both cards are nice, and if you've made your decision on the gm200 card, you can take further discussion to its respective club/thread tbh.


----------



## SAFX

Found some great deals during Black Friday, figured it's time to build a custom water-cooled rig this winter.
Still on the fence for the gpu; the 295x2 is going strong against nvidia's lineup and seems to be the best bang for the buck, which is amazing given how long ago it was released in gpu terms.

here's what I've got so far:

NZXT H440 black/blue ($115)
Crucial MX200 500GB ($95)
EVGA G2 1000W ($110)

----------



## DMatthewStewart

Quote:


> Originally Posted by *SAFX*
> 
> Found some great deals during Black Friday, figured it's time to build a custom water cooled rig this winter.
> Still on the fence for gpu, 295x2 is going strong against nvidia's lineup, seems to be the best bang for buck, amazing since in gpu terms it's old given its release date.
> 
> here's what I got so far,
> 
> NZXT H440 black/blue ($115)
> Crucial MX200 500Gb ($95)
> EVGA G2 1000w ($110)


BTW, I have the same PSU. I love it. Best one I've owned yet


----------



## remedy1978

I was having issues with my R9 295X2 so I sent it in to Althon Micro. The replacement card cannot maintain consistent clock speeds. Temps are in the 60s, and at 100% load the clock fluctuates from 1018 down to 992-960 and back up to 1018.

This is with Windows 10 64 bit and the beta Crimson drivers.

I increased the power limit 25% but it didn't help. Do I have a dud?


----------



## SAFX

Oh yeah, I've got one powering my current rig too, evga rocks


----------



## xer0h0ur

Quote:


> Originally Posted by *remedy1978*
> 
> I was having issues with my R295X2 so I send it in to Althon Micro. The replacement R295X2 cannot maintain consistent clock speeds. Temps are in the 60's, and with 100% load the clock fluctuates from 1018 to 992-960 and back up to 1018.
> 
> This is with Windows 10 64 bit and the beta Crimson drivers.
> 
> I increased the power limit 25% but it didn't help. Do I have a dud?


I highly suggest you do not use the Crimson drivers to begin with. They are causing clock fluctuation issues on nearly all of AMD's cards with some users. Furthermore you can't disable crossfire on the 295X2 with these drivers. I would highly suggest you go back to the Catalyst 15.11.1 beta instead until AMD sorts out the bugs in the Crimson software. For what its worth, both of these drivers use the exact same video driver. The difference is one uses the new "Crimson Software" and the other uses the old Catalyst Control Center which while being less pretty is at least working perfectly fine.


----------



## remedy1978

Quote:


> Originally Posted by *xer0h0ur*
> 
> I highly suggest you do not use the Crimson drivers to begin with. They are causing clock fluctuation issues on nearly all of AMD's cards with some users. Furthermore you can't disable crossfire on the 295X2 with these drivers. I would highly suggest you go back to the Catalyst 15.11.1 beta instead until AMD sorts out the bugs in the Crimson software. For what its worth, both of these drivers use the exact same video driver. The difference is one uses the new "Crimson Software" and the other uses the old Catalyst Control Center which while being less pretty is at least working perfectly fine.


Could it be my PSU? I have a Cooler Master V1000. I tried the earlier drivers and got the same results.


----------



## wermad

That psu can handle one 295x2


----------



## themasterpiece1

You guys are having problems disabling crossfire and I am having problems getting it enabled. It should be enabled by default, but in Device Manager my computer only sees one GPU, and when I run benchmarks I get the same scores as a single R9 290X.

Basically I bought a dual GPU card but can't even use it as such. I have tried clean installs of Windows 7, 8.1, and 10, with the latest drivers and with 15.7. Always the same result: only one GPU active.

Seems there are quite a few people with the same issue per this thread:
https://community.amd.com/thread/185600


----------



## remedy1978

Quote:


> Originally Posted by *xer0h0ur*
> 
> I highly suggest you do not use the Crimson drivers to begin with. They are causing clock fluctuation issues on nearly all of AMD's cards with some users. Furthermore you can't disable crossfire on the 295X2 with these drivers. I would highly suggest you go back to the Catalyst 15.11.1 beta instead until AMD sorts out the bugs in the Crimson software. For what its worth, both of these drivers use the exact same video driver. The difference is one uses the new "Crimson Software" and the other uses the old Catalyst Control Center which while being less pretty is at least working perfectly fine.


I figured it out. When I have Freesync disabled, my clocks are 100% stable with Crossfire enabled. When I have Crossfire and Freesync enabled, my clocks jump all over the place.

Has anyone else in the forums noticed this behavior?


----------



## wermad

Quote:


> Originally Posted by *themasterpiece1*
> 
> You guys are having problems disabling crossfire and I am having problems getting it enabled. It should be enabled by default but on device manager my computer only sees one GPU and when I run benchmarks I get the same scores as a single R9 290X.
> 
> Basically bought a dual GPU card but can't even use it as such. I have tried clean installs of Windows 7, 8.1, and 10. With latest drivers and 15.7. Always same result, only one GPU being active.
> 
> Seems there are quite a few people with the same issue per this thread:
> https://community.amd.com/thread/185600


Could be a bad card, unfortunately. Of all the "only one core detected" posts I've seen here, it generally turns out to be a bad core, which is why you only get single-core performance. Even with unstable drivers, Device Manager and gpu-z should see two cores. If not, the last option I suggest is switching to the backup bios on the card. If that doesn't work, check whether you have any recourse or warranty to get it swapped/returned. Good luck


----------



## fat4l

Quote:


> Originally Posted by *remedy1978*
> 
> I figured it out. When I have Freesync disabled, my clocks are 100% stable with Crossfire enabled. When I have Crossfire and Freesync enabled, my clocks jump all over the place.
> 
> Has anyone else in the forums noticed this behavior?


Yep. The same here. FreeSync + crossfire = fluctuation


----------



## xer0h0ur

I was about to say the same thing wermad said. Even if crossfire is being stubborn to enable its odd that you can't see the other gpu in the device manager. Does GPU-Z also not report the 2nd GPU?


----------



## remedy1978

Quote:


> Originally Posted by *fat4l*
> 
> Yep. The same here. Free sync + crossfire=fluctuation


Great! Thanks for the input. I tried using "Force Constant Voltage" in AfterBurner. Would this damage the card?


----------



## xer0h0ur

You would have to hex edit Afterburner to push more voltage to even have a legit chance of damaging it in my opinion.


----------



## SAFX

Quote:


> Originally Posted by *xer0h0ur*
> 
> You would have to hex edit Afterburner to push more voltage to even have a legit chance of damaging it in my opinion.


hex edit? ....damn, that's some serious bad-a$$ness,


----------



## xer0h0ur

I can't take any credit on that cool factor lol. The Fury/X boys were the ones that gave the explanation on how to hex edit the voltage offset in Afterburner to crank up the additional voltage. While I haven't personally tried it myself yet, its presumably just as easy for the Hawaii cards.


----------



## F4ze0ne

Quote:


> Originally Posted by *remedy1978*
> 
> I figured it out. When I have Freesync disabled, my clocks are 100% stable with Crossfire enabled. When I have Crossfire and Freesync enabled, my clocks jump all over the place.
> 
> Has anyone else in the forums noticed this behavior?


It's been happening for me since I got my monitor last August.

AMD is aware though, but has no fix at the moment. It's been a known issue since 15.10 drivers.

*[59528] Core clock fluctuations may be experienced when FreeSync™ and FRTC are both enabled on some AMD CrossFire™ systems*


----------



## fat4l

Quote:


> Originally Posted by *F4ze0ne*
> 
> It's been happening for me since I got my monitor last August.
> 
> AMD is aware though, but has no fix at the moment. It's been a known issue since 15.10 drivers.
> 
> *[59528] Core clock fluctuations may be experienced when FreeSync™ and FRTC are both enabled on some AMD CrossFire™ systems*


I don't think we are using FRTC tho.


----------



## remedy1978

Quote:


> Originally Posted by *F4ze0ne*
> 
> It's been happening for me since I got my monitor last August.
> 
> AMD is aware though, but has no fix at the moment. It's been a known issue since 15.10 drivers.
> 
> *[59528] Core clock fluctuations may be experienced when FreeSync™ and FRTC are both enabled on some AMD CrossFire™ systems*


This is happening without FRTC on though.


----------



## F4ze0ne

Quote:


> Originally Posted by *fat4l*
> 
> I dont think we are using RFTC tho.


Quote:


> Originally Posted by *remedy1978*
> 
> This is happening without FRTC on though.


I tried testing with FRTC disabled and it's still fluctuating for me.

I think the known issue could be related to what we're experiencing even though FRTC isn't enabled.


----------



## cmoney408

update to anyone who is curious - THANK YOU TO EVERYONE WHO HELPED WITH INFORMATION!!!!:

so i redid the thermals on my 295x2: Grizzly Kryonaut on the GPUs, Minus 8 pads on the memory (#1 boxes) and Fujipoly on the VRMs (1.5mm on the #3 boxes, 1mm on the #2 boxes)

check out these pictures for reference for the #1, #2 and #3 boxes i referred to above (yes, i know this is for an EK block, but it helped):
https://shop.ekwb.com/EK-IM/EK-IM-3831109869086.pdf
https://shop.ekwb.com/EK-IM/EK-IM-3831109869093.pdf

what exactly i bought (all from performance-pcs):
Fujipoly Sarcon XR-m thermal pads 100x15x1.5mm = $18 (a narrow strip left over)
Fujipoly Sarcon XR-m thermal pads 100x15x1.0mm = $16 (very little left over)
Thermal Grizzly Minus Pad 8 pads 100x100x.5mm = $25 (1/5 of the sheet left over)
Thermal Grizzly Kryonaut 3ml = $25 (a lot left over for future applications)

tested with 10 minutes of the Unigine Heaven benchmark (4K, 2xAA, Ultra quality, Moderate tessellation)
before grizzly pads and paste: gpu 1 = 74c, gpu 2 = 72c
after grizzly pads and paste: gpu 1 = 61c, gpu 2 = 58c

that's without any burn-in time; i'm sure the thermal paste will give better results after a few more hours of settling in.

one tip for the minus 8 pads:

the pads don't stick very well, so using a VERY SMALL amount of kryonaut on the chip first helps hold the pad in place while you put your card back together. otherwise some pads will slide off the card while you are trying to tighten it down (it happened to me a few times, can be very frustrating)

real world numbers: in unigine i went from 31FPS before the paste to 33FPS after, at stock. not too big of a deal. but with the cooler temps i was able to OC my card, bringing me to 36FPS while staying cooler than stock was before i changed the pads. that's where it counts!

fujipoly tips:

i also applied kryonaut very lightly to the VRMs and PLX chip to hold the fujipoly in place and hopefully improve its thermal transfer.

P.S. - grizzly and performancepcs have a promotion where you can get a refund on 1 grizzly paste and 1 grizzly pad item if you post your results/review in a forum.
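For anyone who wants to put those numbers in context, here's a quick back-of-the-envelope sketch (plain Python; the values are just the ones reported in the post above) of the temperature drops and the FPS uplift:

```python
# Heaven temps (deg C) and FPS from the repaste results above
before = {"gpu1": 74, "gpu2": 72}
after = {"gpu1": 61, "gpu2": 58}

# Per-GPU temperature drop
drops = {gpu: before[gpu] - after[gpu] for gpu in before}
print(drops)  # {'gpu1': 13, 'gpu2': 14}

# FPS: 31 stock before, 33 stock after, 36 after overclocking
stock_gain = (33 - 31) / 31 * 100  # gain from the repaste alone
oc_gain = (36 - 31) / 31 * 100     # gain once the headroom is spent on an OC
print(f"repaste: +{stock_gain:.1f}%  repaste + OC: +{oc_gain:.1f}%")
```

So a 13-14c drop buys roughly a 6% uplift at stock and about 16% once the extra thermal headroom is used for the overclock.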


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> update to anyone who is curious - THANK YOU TO EVERYONE WHO HELPED WITH INFORMATION!!!!:
> 
> so i re did the thermals on my 295x2. Grizzly kryonaut on the GPU's, Minus 8 pads on the memory (#1 box's) and Fujipoly on the vram (1.5mm on #3 Box's, 1mm on #2 Box's)
> 
> check out these pictures for reference of #1, #2 and #3 Box's i referred to above (yes i know this is for an EK block but it helped):
> https://shop.ekwb.com/EK-IM/EK-IM-3831109869086.pdf
> https://shop.ekwb.com/EK-IM/EK-IM-3831109869093.pdf
> 
> what exactly i bought (all from performance-pcs):
> Fujipoly Sarcon XR-m thermal pads 100x15x1.5mm = $18 (a narrow strip left over)
> Fujipoly Sarcon XR-m thermal pads 100x15x1.0mm = $16 (very little left over)
> Thermal Grizzly Minus Pad 8 pads 100x100x.5mm = $25 (1/5 of the sheet left over)
> Thermal Grizzly Kryonaut 3ml = $25 (a lot left over for future applications)
> 
> tested with 10 minutes of unigine Heaven benchmarking software (4K, 2xAA, Ultra quality, Moderate Tessellation)
> before grizzy pads and paste: gpu 1 = 74c, gpu 2 = 72c
> after grizzly pads and paste: gpu 1 = 61c, gpu 2 = 58c
> 
> thats without any burn in time, im sure the thermal paste will get better results with a few more hours of setting in.
> 
> one tip for the minus 8 pads:
> 
> the pads dont stick very well. so using a VERY SMALL amount of kryonaut on the chip first helps hold the pad in place while you put your card together. otherwise some pads will slide off the card while you are trying to tighten it back together (it happened to me a few times, can be very frustrating)
> 
> real world numbers. in unigine i went from 31FPS before paste, to 33fps after. thats stock. not too big of a deal. but with the cooler temps i was able to OC my card bringing me to 36FPS while staying cooler then stock before i changed the pads. thats where it counts!
> 
> fujipoly tips:
> 
> i also applied kryonaut very lightly to the Vram and PLX chips to hold the fujipoly in place and to increase it thermal ability (hopefully).
> 
> P.S. - grizzly and performancepcs have a promotion where you can get a refund on 1 grizzly paste and 1 grizzly pad item if you post your results/review in a forum.


Nice. I think you will see the best results on the VRMs; however, there's no way to track VRM temps on this card, so...


----------



## SAFX

Wow, are those temperature drops accurate? seems too good to be true, but if it is, that's amazing.....


----------



## cmoney408

i used Open Hardware Monitor to track the temps. i basically ran unigine for a couple minutes, then clicked the benchmark button, which takes a few minutes, then let it run until 10 minutes had passed. i would quickly alt-shift-tab to Open Hardware Monitor and check the temps asap (they would drop within seconds of switching out of unigine).

you might be able to add a degree, but that degree would have to be added to both the before and after results.

also, just in case anyone wants to know: Corsair Link showed my hx850i pulling a max of 783 watts in and 712 watts out. that's with the card and my 4770k overclocked.

just ran it again, with the 295x2 Overclocked using crimson (+8% clock, +50% power, 1600mhz)

i let it run for 2 minutes (to warm up a bit) then ran benchmark (takes like 6 min), then let it continue to run until 10-11 minute mark.

cpu cores = 45-47c (cores were running at 4100mhz)
GPU1 = 67c
gpu2 = 64c
774w in, 704w out - hx850i platinum rated PSU
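Those in/out readings also let you sanity-check the PSU's efficiency; a rough sketch, assuming (as Corsair Link reports them) "in" is wall draw and "out" is DC power delivered:

```python
# (wall watts in, DC watts out) pairs reported by Corsair Link above
readings = [(783, 712), (774, 704)]

for w_in, w_out in readings:
    eff = w_out / w_in * 100  # conversion efficiency at this load
    print(f"{w_in}W in -> {w_out}W out: {eff:.1f}%")
# Both work out to roughly 91%, which seems plausible for an
# 80 Plus Platinum unit running at ~85% of its 850W rating.
```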


----------



## cmoney408

after running unigine for 12+ minutes. last one, i promise - i just wanted to add a picture.
792w in, 720w out
cpu temps 47-49c
gpu1 68c, gpu2 66c - overclocked with settings above



http://imgur.com/IFUaNiH


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> after running unigine for 12+ minutes. last one i promise. i jsut wanted to add a picture.
> 792w in, 720w out
> cpu temps 47-49c
> gpu1 68c, gpu2 66c - overclocked with settings above
> 
> 
> 
> http://imgur.com/IFUaNiH


that was well worth it








btw, using the EK block manual, did all the thermal pads fit correctly (are the dimensions/thicknesses correct)?


----------



## cmoney408

yeah. it was like $80 for the stuff, but performancepcs is processing a refund for $50 of it (because of the grizzly promotion). so $30 bucks and a couple hours, not too bad!

YES. thank you for the EK info. i used it to plan out what i needed and it all seemed to fit. sizes were accurate.
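Checking the math against the parts list posted earlier in the thread (prices copied from that post; a quick sketch):

```python
# Prices from the thermal-pad/paste shopping list posted above
parts = {
    "Fujipoly XR-m 1.5mm": 18,
    "Fujipoly XR-m 1.0mm": 16,
    "Grizzly Minus Pad 8": 25,
    "Grizzly Kryonaut 3ml": 25,
}
total = sum(parts.values())
net = total - 50  # promo refund on one Grizzly paste + one Grizzly pad item
print(total, net)  # 84 34
```

which lines up with the "like $80" and "$30 bucks" figures quoted.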


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> yeah. it was like $80 for the stuff. buy performancepcs is processing a refund for $50 of it (because of the grizzly promotion). so $30 bucks and a couple hours, not too bad!
> 
> YES. thank you for the EK info. i used it to plan out what i needed and it all seemed to fit. sizes were accurate.


That was pretty cheap then. Well invested money!
+rep


----------



## ebaw95

Hello everybody,

First, sorry for my bad english









I bought a Sapphire 295X2 one year ago and all was fine (noisy







) until I had to reinstall my PC (some viruses).

So I downloaded the Crimson 15.11 drivers, installed them, and when they detected the card the fans became very slow (not noisy at all).

After that, installing any of the drivers on the AMD website did the same thing (the fans become slow once the driver is installed; for each driver change I used DDU to remove the older ones)

For reference :


http://imgur.com/oKaeT

 (first image is the second GPU on the bottom dropdown menu)

I don't want to test my card in a game (don't want to burn it)

Has anyone experienced this problem?


----------



## wermad

Anyone running 4k Eyefinity? I'm tossing around the idea of switching monitors (mainly to fit my new desk). I'm wondering how quads can cope with ~24M pixels







. I know about DeadlyDNA's and Baasha's testing (quad 290X and quad Titan BE's). The LG 27" 4K IPS monitor fits perfectly and I have room for three
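For reference, the "24M pixels" figure is simply three 4K panels added up; a quick sketch:

```python
# Three 3840x2160 panels in a 4K Eyefinity group
panels, width, height = 3, 3840, 2160
total = panels * width * height
print(f"{total:,} pixels (~{total / 1e6:.1f}M)")  # 24,883,200 (~24.9M)
```

That's triple the pixel load of a single 4K screen, hence the quad-GPU question.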


----------



## xer0h0ur

Quote:


> Originally Posted by *ebaw95*
> 
> Hello everybody,
> 
> First, sorry for my bad English.
> 
> I bought a Sapphire 295X2 a year ago and everything was fine (noisy, though) until I wanted to reinstall my PC (some viruses).
> 
> So I downloaded the Crimson 15.11 drivers, installed them, and when they detected the card the fans became very slow (not noisy at all).
> 
> After that, every driver I installed from the AMD website did the same thing (the fan slowed down as soon as the driver was installed; for each driver change I used DDU to remove the older ones).
> 
> For reference:
> 
> http://imgur.com/oKaeT
> 
> (the first image shows the second GPU in the bottom dropdown menu)
> 
> I don't want to test my card in game (don't want to burn it).
> 
> Has anyone experienced this problem?


The 15.11 has bugs with regard to fan control. I don't know why you even downloaded the 15.11 when there is the 15.11.1 and the 15.12 that don't have this problem.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Anyone running 4k Eyefinity? I'm tossing around the idea of switching monitors (mainly to fit my new desk). I'm wondering how quads can cope with 24M pixels. I know about DeadlyDNA's and Baasha's testing (quad 290X and quad Titan BE's). The LG 27" 4k IPS monitor fits perfectly and I have room for three.


A single core on the 295x2 (i.e. a single 290x) was running Battlefront (High preset) on the 27MU67 at around 40-45fps avg with some occasional dips to 35fps iirc, so I'd imagine you could run 4k Eyefinity pretty nicely on quad.


----------



## wermad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Single Core on the 295x2 (Single 290x) was running Battlefront (High Preset) on the 27MU67 around 40-45fps avg with some occasional dips to 35fps iirc so I'd imagine you could run 4k eyefinity pretty nicely on Quad


Thanks. After doing a bit more research, I've zeroed in on the Dell P2715Q. I decided to at least get one Dell for now, as it can easily fit under my desk and I can find this monitor much more easily (vs the LG). I'll have to measure a bit more, as my new desk sits slightly skewed to the right rather than symmetrical to the physical corner of the overall "L" shape.

Alternatively, I can raise the hutch about 3/4"-1" using a bit of wood and tying everything together. This should let me clear the hutch with my current Samsung.

I'm gonna check out the local Fry's Electronics store and see if they have the Dell. For sure the LG is there, but the Dell can be sourced slightly cheaper through ebay, with no state tax or recycle fee.

So far, most of the 4k eyefinity/surround reviews have been universal in saying that you need a lot of hardware. I'm glad my 295x2's all have displayport.


----------



## Dagamus NM

The issue is that you only have 6' of cord length from the back of your gpu to your monitor. The price on that Dell is pretty sweet though: $499, and possibly lower if you can find a good hookup. I recently tried a 10' mDP to DP cable and got nothing but a black screen at 60Hz. I am going to order some of those active mDP to HDMI 2.0 adapters from Club 3D whenever they are available so that I can run my Asus PB287Qs on HDMI instead of DP, in an attempt to have my computer on the side of my desk and not in front of it. It will not fit underneath.

I don't know if you already have 4k monitors and have dealt with this or not, but figured I would mention it. While I would like to add one more PB287Q just to have three, these are not the best monitors for watching a movie from across the room. They're great for doing work and look great in games when sitting right in front, but go too far off at an angle and all of the contrast falls flat. It is a TN panel after all; that Dell however is IPS (lucky... Napoleon Dynamite voice).

The other complaint that I have is how small certain fonts are. I can enlarge most things to be able to read them fine, but there are certain things like pop-up menus in programs (trying to pick a special character in Word) that are super small, and I have to lean in quite close to the screen to see the small differences between characters. Mostly when writing lab reports and inserting the Greek alphabet.

So, aside from wanting a third PB287Q just to have three, I have learned that TN is not so good and the 28" 4K is a bit too small. I would like to string together three 36" 4k monitors. I would need a wider desk, but it would be nice.


----------



## wermad

Quote:


> Originally Posted by *Dagamus NM*
> 
> The issue is that you only have 6' of cord length from the back of your gpu to your monitor. The price on that Dell is pretty sweet though: $499, and possibly lower if you can find a good hookup. I recently tried a 10' mDP to DP cable and got nothing but a black screen at 60Hz. I am going to order some of those active mDP to HDMI 2.0 adapters from Club 3D whenever they are available so that I can run my Asus PB287Qs on HDMI instead of DP, in an attempt to have my computer on the side of my desk and not in front of it. It will not fit underneath.
> 
> I don't know if you already have 4k monitors and have dealt with this or not, but figured I would mention it. While I would like to add one more PB287Q just to have three, these are not the best monitors for watching a movie from across the room. They're great for doing work and look great in games when sitting right in front, but go too far off at an angle and all of the contrast falls flat. It is a TN panel after all; that Dell however is IPS (lucky... Napoleon Dynamite voice).
> 
> The other complaint that I have is how small certain fonts are. I can enlarge most things to be able to read them fine, but there are certain things like pop-up menus in programs (trying to pick a special character in Word) that are super small, and I have to lean in quite close to the screen to see the small differences between characters. Mostly when writing lab reports and inserting the Greek alphabet.
> 
> So, aside from wanting a third PB287Q just to have three, I have learned that TN is not so good and the 28" 4K is a bit too small. I would like to string together three 36" 4k monitors. I would need a wider desk, but it would be nice.


I've had my Sammy 4k 28 for almost a year now (using a 6' Startech DP to mini DP). Windows 7 scaling is pretty decent and I got used to it over time. I do have a few 10' DP cables, and I'll test them once I get my rig up and running. I may jump on Win10, which I hear (along with 8.1) has better scaling for 4k monitors. I have limited space with this new desk, but it's more practical for the other stuff I have, with cabinets, vs my old diy big L desk (made that for 5x1 24" Dells).


----------



## xer0h0ur

Ironically I am getting ready to go backwards to a 1440p screen but likely going IPS instead of TN this time. I am keeping the 4K monitor but will primarily be gaming on the 144Hz 1440p monitor as I have become grossly infatuated with Counter-Strike again and 120/144Hz monitors reign supreme in FPS titles. Will continue to 4K game on everything else though.


----------



## ebaw95

Quote:


> Originally Posted by *xer0h0ur*
> 
> The 15.11 has bugs with regard to fan control. I don't know why you even downloaded the 15.11 when there is the 15.11.1 and the 15.12 that don't have this problem.


Hello xer0h0ur,

Because when I reinstalled my PC, that version was the stable version on the AMD website, and I didn't know there was a problem with the "stable" driver at the time.

But the newer versions didn't resolve the problem for me: while any driver (14.4 to 15.12) is installing, the GPU fan drops to 20%...


----------



## ebaw95

Quote:


> Originally Posted by *ebaw95*
> 
> Hello xer0h0ur,
> 
> Because when I reinstalled my PC, that version was the stable version on the AMD website, and I didn't know there was a problem with the "stable" driver at the time.
> 
> But the newer versions didn't resolve the problem for me: while any driver (14.4 to 15.12) is installing, the GPU fan drops to 20%...


And at 65°C, the GPU fan sits at 30% (very slow; the card didn't run like this before)...


----------



## xer0h0ur

So take a look at the fan setting in the Crimson Software. Also alt-tab out of game or play windowed and verify the same setting hasn't changed while gaming.

At the very least while you're trying to figure out what the hell is going on, you can use Afterburner to set a custom fan curve for your card.


----------



## pgetueza

My last AMD GPU was an MSI 5850 five years ago. Coming from an MSI GTX 980 Ti Gaming (which I'll sell to move to Pascal later in 2016)... what's the best driver for the 295X2? I never had any issues with nvidia. I built a new PC with a Maximus VIII Extreme and 6700K (fresh OS installation), and the Crimson drivers (v15.12) aren't working as expected. I need to set up a profile for BF4, while Battlefront ran fine and BFH freezes my system no matter what. Ha ha, it's kinda frustrating. Anyway, happy holidays to everyone!

*EDIT:* I think the issue has something to do with crossfire. I never had micro-stuttering like hell before with SLi. When I turned crossfire off, I was able to play BFH, but still with stuttering. When BFH loads into the game itself, my system automatically restarts if crossfire is enabled. Same with BF4, while Battlefront works with crossfire with decent performance and less stuttering.


----------



## xer0h0ur

Quote:


> Originally Posted by *pgetueza*
> 
> My last AMD GPU was an MSI 5850 five years ago. Coming from an MSI GTX 980 Ti Gaming (which I'll sell to move to Pascal later in 2016)... what's the best driver for the 295X2? I never had any issues with nvidia. I built a new PC with a Maximus VIII Extreme and 6700K (fresh OS installation), and the Crimson drivers (v15.12) aren't working as expected. I need to set up a profile for BF4, while Battlefront ran fine and BFH freezes my system no matter what. Ha ha, it's kinda frustrating. Anyway, happy holidays to everyone!
> 
> *EDIT:* I think the issue has something to do with crossfire. I never had micro-stuttering like hell before with SLi. When I turned crossfire off, I was able to play BFH, but still with stuttering. When BFH loads into the game itself, my system automatically restarts if crossfire is enabled. Same with BF4, while Battlefront works with crossfire with decent performance and less stuttering.


Neither of the Crimson Software drivers (15.11/15.11.1, or 15.12, which is just 15.11.1 passed through WHQL) WILL DISABLE CROSSFIRE on the 295X2. I suggest you use the Catalyst 15.11.1 driver, as that is the most up to date release (same video driver as 15.11/15.11.1/15.12) that uses the Catalyst Control Center instead of the Crimson Software. Something is severely broken in the Crimson Software in that it's not disabling crossfire when you disable it in the global profile or in the individual game profiles. If you instead use the CCC and create an application profile for the game, then it will in fact disable crossfire on your 295X2 so it's only using one GPU, and that should eliminate any micro stutter.


----------



## wermad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ironically I am getting ready to go backwards to a 1440p screen but likely going IPS instead of TN this time. I am keeping the 4K monitor but will primarily be gaming on the 144Hz 1440p monitor as I have become grossly infatuated with Counter-Strike again and 120/144Hz monitors reign supreme in FPS titles. Will continue to 4K game on everything else though.


I was reading some of the reviews for the 15' DP cables, and most say they work for 4k, though they don't specify whether that's @ 60hz. I really need to test this, and sadly I still have no rig (I may just go air temporarily). If triple 4k is a no-go, I'll probably jump on 5x1 using some Dell 2560x1440 IPS monitors. I've measured, and I barely have enough room for 5x1. Amazon has the Accell MST hub for $45, and I have the adapters and cables from my previous 5x1 setup (24" 1200p Dells). The only drawback to 5x1 is the older games that don't run in eyefinity, where I have to switch the portrait monitor to landscape. With 3x1 4k, I can just assign the center monitor (it would be in landscape, as it wouldn't fit in portrait).

Sigh....my loop is still delayed. Dhl picked up the package just today and with the holiday rush and delays, I'm looking at next week to get my final pieces.


----------



## Dagamus NM

Unless they specify 4K@60Hz I would stay away. That said, if you find some that do work, please share the source. I only need an extra 18" beyond the standard 6' cable, and that has been a no-go so far. I didn't bother testing at 30Hz because who cares.
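For what it's worth, the long-cable problem is mostly signal integrity rather than raw bandwidth; the back-of-the-envelope numbers (using DP 1.2's published HBR2 lane rate, active pixels only) look like this:

```python
# Back-of-the-envelope check: does 4K @ 60Hz, 24-bit color fit in DP 1.2?
h, v, refresh, bpp = 3840, 2160, 60, 24

# Active pixel data only (blanking intervals ignored, so the real figure
# is somewhat higher).
data_rate_gbps = h * v * refresh * bpp / 1e9

# DP 1.2 (HBR2): 4 lanes x 5.4 Gbit/s, minus 8b/10b encoding overhead.
hbr2_payload_gbps = 4 * 5.4 * 8 / 10

print(f"4K60 needs ~{data_rate_gbps:.1f} Gbit/s of payload")
print(f"DP 1.2 carries up to {hbr2_payload_gbps:.2f} Gbit/s")
# ~11.9 vs 17.28 Gbit/s: plenty of headroom on paper, which is why a
# marginal long cable shows up as a black screen at 60Hz rather than a
# clean "unsupported mode" error.
```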


----------



## fat4l

Guys.
I decided to sell my Ares 3.

Not sure what I will be moving to though...

Any ideas?


----------



## wermad

Two Titan X's and a new psu: AX1500, P1600, or Superflower 2kW. Seems like all Titan X owners need these high-end psu's for their single or dual card setups, for some reason (edit: same for the Ti owners).

Personally, a couple of Ti's, though I would skip the overhyped (imho) KPE version.


----------



## Alex132

Quote:


> Originally Posted by *wermad*
> 
> Two Titan X's and a new psu: AX1500, P1600, or Superflower 2kW. Seems like all Titan X owners need these high-end psu's for their single or dual card setups, for some reason (edit: same for the Ti owners).
> 
> Personally, a couple of Ti's, though I would skip the overhyped (imho) KPE version.


Gotta get that epeen PSU to go with your epeen GPUs


----------



## Roxycon

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I decided to sell my Ares 3.
> 
> Not sure what I will be moving to though...
> 
> Any ideas?


I am personally pretty ready for the fiji gemini, my wallet not so much


----------



## xer0h0ur

Quote:


> Originally Posted by *Roxycon*
> 
> I am personally pretty ready for the fiji gemini, my wallet not so much


Well, your wallet has time to get ready. It's delayed. It was supposed to be Jan/Feb, but now there is no mention of when, all due to the VR push not being ready yet. They are trying to release it for the VR crowd.


----------



## F4ze0ne

The Fury X2 had better wipe the floor with the 295x2. It has a lot to live up to.


----------



## fat4l

Quote:


> Originally Posted by *F4ze0ne*
> 
> The Fury X2 had better wipe the floor with the 295x2. It has a lot to live up to.


No more dual cards for me. Don't like them anymore.


----------



## wermad

Get two Ti's. Two cards seem to play very well for both camps. It should be easier to shut off SLI or Crossfire with two discrete cards. The 295x2 isn't very easy to switch off crossfire on, so it kinda gets a bad rep for being difficult to work with.

I love playing Lost Planet 2, and the game immediately crashes with quad-fire. I end up turning off crossfire and just using one card (both cores).

I'm really torn on what to get:

-5x Dell P2416D 2560x1440 IPS
or
-3x Dell P2715Q 4k 3840x2160 IPS

The first is slightly cheaper and should be more manageable in terms of gpu load. It's more complicated, but I have experience with 5x1 setups.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> I've had my Sammy 4k 28 for almost a year now (using a 6' Startech dp to mini dp). Windows 7 scaling is pretty decent and i got used to it over time. I do have a few 10' dp cables and I'll test them once I get my rig up and running. I may jump on win10 which I hear (along with 8.1) have better scaling for 4k monitors. I have limited space with this new desk, but its more practical for the other stuff I have with cabinets vs my old diy big L desk (made this for 5x1 24" dells).


Good luck with all of that. I am on Win10 with the two PB287Qs. 99% of the time it is fine, but when I bring up the menu for equations or special characters in Word they are smaller than I like. Some characters are hard to tell apart from one another.

I have the same cables you are successfully using. Hurry up and test your 10-footers already.
Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I decided to sell my Ares 3.
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure what I will be moving to tho...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any ideas ?


Ti's are really the only place left for you to go at the moment, that or furries. So which Ti's to get? I would wait to sell the Ares until the ROG Matrix Platinum version comes out. I have four 980 Tis (reference); I would drop them in a heartbeat for the Matrix Platinums.

As far as the PSU, go with the P1600. I have four of them and love them. I can't speak to the others; the Superflower 2kW would be cool if it is the same build as the P1600. I wonder if it needs a 5-20P outlet?


----------



## wermad

I'm waiting on DHL to give me an ETA on my package. If it's gonna be soon, I won't bother switching to air. If it's gonna be a week, I'll see if I have time to switch one of the cards to the stock cooler. The cables I got a while back for my 5x1 were off ebay and apparently are OEM cables. I have two 10' cables and a few 6' cables. I bought the Startech mDP to DP 6' cable since I didn't have any dongles, and the adapter felt too flimsy.

Has anyone had any experience with 120hz? There's an Asus WQHD on sale (freesync, TN, 1440) and it's very tempting.

edit: well... took some measurements and 4k 3x1 is a no. The monitors are just too long in landscape, and in portrait they won't fit at all. Guess I'm down to either one 4k screen (the Asus 28 TN looks pretty solid and should fit with the adjustable height, or the Dell 27" IPS) or 5x1 24" WQHD.


----------



## wermad

Sorry for the double post:

I found my cables:

-5x Bizlink 6' (OEM, apparently from HP), purchased from ebay. Really thick cable.
-2x OEM-style 10' cables, purchased on ebay. Third thickest; one was being used on the monitor as I'm posting this.
-1x OEM Samsung 6' cable w/ catch "anchors". Second thickest.
-1x Startech mini-to-full 6' cable. Thinnest of all.

All cables triggered 4k @ 60hz using the onboard gpu of my mb. Since my 10' cables work, I'm leaning toward a 5x1 setup, but then again, a single Asus 4k 28 (TN) is vastly less expensive and will do what my Sammy does for me now. I'll have to think it over.

Edit: keep in mind I was only able to use the igpu of my mb, so I could not test any graphics-intensive applications. Switching to my second system (an ancient but still functional DDR2 AMD OEM setup).


----------



## pgetueza

Quote:


> Originally Posted by *xer0h0ur*
> 
> Either of the Crimson Software drivers (15.11/15.11.1 or 15.12 which is just 15.11.1 that passed WHQL) WILL NOT DISABLE CROSSFIRE on the 295X2. I suggest you use the Catalyst 15.11.1 driver as that is the most up to date (same video driver as the 15.11/15.11.1/15.12) that uses the Catalyst Control Center instead of the Crimson Software. Something is severely broken with the Crimson Software that its not disabling crossfire if you disable it in the global profile or in the individual game profiles. If you instead use the CCC and create an application profile for the game then it will in fact disable crossfire on your 295X2 so its only using one GPU and it should eliminate any micro stutter.


Before I tried the latest Catalyst drivers, I began experiencing driver crashes while loading any game or D3D apps. Shortly after, I began experiencing BSODs with the message 'THREAD_STUCK_IN_DEVICE_DRIVER'. I guess this has something to do with the Crimson drivers, especially since it persisted after I reinstalled them. I had already used DDU in safe mode to remove any trace of drivers but still experienced it. What I did next was reset my PC (Win 10), install drivers for my board, then Catalyst 15.11.1. I tried running the following:

*Furmark* - No issues. Stable for 20 minutes before I quit.
*Battlefront* - Max settings with VSync on. No issues, but performance was way better with Crimson drivers v15.12, with no issues even with crossfire enabled. Not much micro stuttering.
*Crysis 3* - Max settings with VSync on and FXAA. No issues except a few occasions of FPS drops from 60 to 25-30 FPS. Crossfire enabled, no other issues.
*Battlefield 4* - Max settings with VSync on. No issues, crossfire enabled.
*GTA V* - Max settings with VSync on with Crimson drivers: ran flawlessly with crossfire enabled, but after 20 minutes of gameplay the system restarted. When I tried to run the game again, shortly after loading the system crashed and restarted.
*GTA V* - Max settings with VSync on under Catalyst 15.11.1: performance is bad. It micro stutters, especially with crossfire enabled.
*Battlefield Hardline* - Max settings with VSync on. If crossfire is enabled, the system resets after loading the campaign mission.
*Battlefield Hardline* - Max settings with VSync on. If crossfire is disabled, I was able to play for 10-15 minutes before the system automatically reset.

*Crossfire is globally enabled. I didn't set up the profiles individually.

I've exhausted my options. I have no other system to test this monster card in. While it ran great with both Crimson and Catalyst drivers in most of my games, it crashes occasionally in GTA V and consistently in Battlefield Hardline. I tried to stress the system like hell with BF4 under Catalyst 15.11.1 while running Furmark and Prime95; the PSU was peaking at 500-650W (per Corsair Link) and it was still stable.

My specs are 6700K/Maximus VIII Extreme/2x8GB Trident Z/Corsair AX860i (using the 2x 8-pin cable). I have no issues with temperature. All the cables are perfectly connected. The PSU has handled all my previous gpu configs: 680 SLi, GTX 690, 780 SLi, 970 SLi and a 980 Ti, all at stock settings.

What might be causing this issue? Is the AMD driver at fault here? Since the card ran fine in Furmark, I don't think it's a hardware issue. What do you think?
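One thing worth sanity-checking in a setup like this is PSU headroom. A rough sketch using published TDP figures (AMD rates the 295X2's typical board power around 500 W, Intel rates the 6700K at 91 W; the rest-of-system figure below is a ballpark assumption, not a measurement):

```python
# Rough worst-case load estimate for an R9 295X2 + i7-6700K on an AX860i.
gpu_w = 500        # R9 295X2 typical board power (AMD spec)
cpu_w = 91         # i7-6700K TDP (Intel spec)
rest_w = 60        # board, RAM, drives, fans -- ballpark assumption

total_w = gpu_w + cpu_w + rest_w
psu_w = 860

headroom = psu_w - total_w
print(f"Estimated sustained load: ~{total_w} W")
print(f"Headroom on an AX860i: ~{headroom} W")
# ~650 W sustained leaves roughly a quarter of the PSU's rating spare,
# which matches the 500-650 W Corsair Link readings above. But the 295X2
# is known for short transient spikes well above its rated power, so a
# marginal 12V rail or an aggressive OCP setting could still explain
# restarts under combined load even when averages look fine.
```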


----------



## xer0h0ur

So you have not tried disabling crossfire by creating an application profile in the CCC for GTAV? If it still crashes without crossfire enabled then at least you would know its not crossfire causing the issue and if it stops crashing then you know it was crossfire.


----------



## Dagamus NM

Anybody have a profile for Mad Max that doesn't play like garbage?

There is no predefined profile for it. With settings on the default max at 4K@60Hz I get a solid 60+ fps, but when driving it drops down to 5-7fps then bumps back up. The stuttering and fps drops are maddening given that the majority of the game is driving.

I followed some guy's generic tips for Mad Max on YouTube, reducing a couple of settings, which I am very annoyed at having to do with two 295X2s in the build. Also a 3930K, 64gb mem, and the game is on a RAID 0 of Intel 750 drives.

With settings reduced I get 140fps just walking around, with fewer drops than before, but they are still there and it still crashes occasionally.

WTH?


----------



## Alex132

With the new (Crimson) drivers it seems like I can't under/overvolt anymore.

Anyone have a guide for flashing a bios onto this card with stock settings, but -20mV?


----------



## fat4l

Quote:


> Originally Posted by *Alex132*
> 
> With the new (Crimson) drivers it seems like I can't under/overvolt anymore.
> 
> Anyone have a guide for flashing a bios onto this card with stock settings, but -20mV?


http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_30


----------



## fat4l

Guys, I wasn't satisfied with the core temps on my Ares 3...
I even used *Thermal Grizzly Kryonaut* (which is supposed to be the best non-conductive paste) and I was still getting a max of 65C on gpu1 and 57C on gpu2 with a water temp of 26C (after 2 runs of Heaven).

Now I've used *Coolaboratory Liquid Pro.*
The results are amazing.
GPU 1 went from 65C to 50C.
GPU 2 went from 57C to 43C.

That's a hell of a difference...
...*a 14-15C difference!!!*


----------



## steezebe

Quote:


> Originally Posted by *fat4l*
> 
> Guys, I wasnt satisfied with the core temps on my ares 3.....
> I even use *Thermal Grizzly Kryonaut*(which is supposed to be the best non-conductive paste) and I was still getting max 65C on gpu1 and 57C on gpu2 with the water temp of 26C(after 2 runs of heaven).
> 
> Now I used *Coolaboratory Liquid Pro.*
> Results are amazing.
> GPU 1 went from 65C to 50C.
> GPU 2 went from 57C to 43C.
> 
> That's hell of a difference......
> ......*14-15C difference!!!*


That's quite a drop!

Why were you not satisfied with 65/57C? I get mid-50s with PK-1, and don't see much reason to change.


----------



## cmoney408

It could be garbage information, but I read in some forums that Liquid Pro can degrade even copper in the long term.

I probably wouldn't mind it on a cheap copper plate or a cheap AIO cooler, since it would be easy to replace a $50 AIO with a new one, but it may be a bit harder to repair/replace the copper (or the entire cooler) on a manufacturer-specific part for your card.


----------



## fat4l

Quote:


> Originally Posted by *steezebe*
> 
> that's quite a drop!
> 
> Why were you not satisfied with 65/57 C? I get mid-50's with PK-1, and don't see much reason to change.


Simply because my custom water cooling can do much better, and I proved it with a 15C drop.

You know, it's sad to see people with low-cost cooling getting similar temps when my cooling cost me a lot... I spent my money for a reason.


----------



## cmoney408

The real test would be to do the same setup (using Liquid Pro) but with "low cost cooling" to see what the difference is. I'm curious whether the numbers come from the liquid metal or the water cooling system. I guess I'm just curious to see how much better a custom loop is than my simple H100i. I too delidded my cpu (used ultra pro), but I just used Kryonaut between the cpu lid and waterblock (since I am scared to use liquid metal, even on copper).

Also, I just received the CAC-1170 today!!! mini DP to HDMI 2.0!!!

update - the adapter works!


----------



## Mega Man

Congrats


----------



## eqc6

Guys, I need help and hoping some of you can assist! So here's the thing: I run a 295x2 + 290x in trifire mode. When I plug in the 295x2 as the top card, I get TERRIBLE stuttering and games are unplayable. BUT when I switch the gpus' positions and have the 290x as the top card instead, everything is perfect. Any idea why I can't use the 295x2 as the top card at all? I tried disabling ULPS, cleaning and reinstalling drivers, and I just keep getting unplayable stutter all the time, even during Firestrike runs. I really have no idea what's going on. I tried googling but can't seem to find a solution; it's just so weird that I can't use the 295x2 as the top card at all!


----------



## wermad

Need the rest of your specs: mb, cpu, etc.???

Fill out the rig builder under your profile.

Could be the 295x2 is triggering 4x on that slot. Check gpuz. Also, make sure the 295x2 isn't hitting its 75c thermal limit easily when its on top.


----------



## eqc6

Quote:


> Originally Posted by *wermad*
> 
> Need the rest of your specs: mb, cpu, etc.???
> 
> Fill out the rig builder under your profile.
> 
> Could be the 295x2 is triggering 4x on that slot. Check gpuz. Also, make sure the 295x2 isn't hitting its 75c thermal limit easily when its on top.


thanks for the quick response!

My specs:

i7-5930K
ASRock X99 Fatality Killer
16gb Ballistic Sport DDR4 2400
XFX 295x2 + XFX 290x Double Dissipation
Corsair AX1500i psu

Thermals are perfectly fine, and my mobo supports three PCIe 3.0 x16 slots.
It's really weird because it only stutters terribly when the 295x2 is the top card. I tried disabling the 3rd card (290x) and it still stuttered badly, so the culprit is the 295x2 itself. So I tried playing a game with just the 295x2 in borderless window mode (to deactivate the 2nd gpu) and that fixes it. But why is it doing this only when the 295x2 is in the top pci slot??

And like I said, swapping the gpu placement where the 290x is the top card, everything runs perfect!


----------



## wermad

Make sure you plug in the auxiliary power (molex) right above the topmost pcie slot; that sometimes helps with power-hungry cards like the 295x2. Could be the bios, so check that you have the latest. Tbh, the 295x2 would do better on the bottom, as the 290x has a higher thermal limit. It could be a weird thing your mb does as well: some boards recommend plugging your card into a specific slot, which in some cases is not the topmost slot. In terms of lanes, the 295x2 works fine @ 8x 3.0 (though your mb does 16x/16x w/ your cpu). Open up a monitoring tool (like MSI Afterburner), expand the monitoring window, reduce the poll rate, and run some games for a good 15-30 minutes. See what temps your 295x2 cores are hitting.

edit: forgot to mention your AX1500, make sure it's not set up for multi-rail protection.


----------



## eqc6

Quote:


> Originally Posted by *wermad*
> 
> Make sure you plug in the auxiliary power (molex) right above the topmost pcie slot; that sometimes helps with power-hungry cards like the 295x2. Could be the bios, so check that you have the latest. Tbh, the 295x2 would do better on the bottom, as the 290x has a higher thermal limit. It could be a weird thing your mb does as well: some boards recommend plugging your card into a specific slot, which in some cases is not the topmost slot. In terms of lanes, the 295x2 works fine @ 8x 3.0 (though your mb does 16x/16x w/ your cpu). Open up a monitoring tool (like MSI Afterburner), expand the monitoring window, reduce the poll rate, and run some games for a good 15-30 minutes. See what temps your 295x2 cores are hitting.
> 
> edit: forgot to mention your AX1500, make sure it's not set up for multi-rail protection.


I was googling my issue and saw this guy who has the same problem, but with dual 295x2s. The video clip is exactly what's happening with mine.

I also read somewhere about someone having a similar issue with their 295x2, and his solution was to install it in the bottom pci slot.

Also, I had the same exact issue when I first got my gpus. I actually tried them on two other brands of X99 motherboards, thinking it was a motherboard issue. It wasn't until the ASRock that I finally thought of placing the 295x2 in the bottom slot. I'll check my multi-rail protection again, and your poll rate suggestion, and if that still doesn't work then I'll have no choice but to go back to my original setup with the 290x in the top slot.

Really appreciate the quick response, thanks!


----------



## F4ze0ne

Wow... that's really bad stuttering in that video. I never experienced that with the 295x2 as the top card.

The card did however throttle, because it was sucking in the hot air from the 290x below it, causing it to overheat. But once I moved it below the 290x, it no longer throttles.


----------



## eqc6

Hmmm... interesting. So I discovered something while messing around with my connections: the problem only exists when I make the 295x2 drive my main monitor! I tried plugging my monitor into the bottom gpu (290x) and it runs fine, but when I plug my monitor into the 295x2 I get the terrible stutter. I tried all 4 mini displayports on the 295x2, all with the same result: bad stutter. So it seems the issue has nothing to do with which pci-e slot it's installed in, so at least we can scratch that! Now I have to figure out why I can't use my 295x2 for my main monitor.


----------



## eqc6

Quote:


> Originally Posted by *F4ze0ne*
> 
> Wow... that's really bad stuttering in that video. I never experienced that with the 295x2 as the top card.
> 
> The card did however throttle because it was sucking in the hot air from the 290x below it causing it to overheat. But, once I moved it below the 290x it no longer throttles.


Just realized you have the same gpu setup as I do. Have you tried plugging your main display into the 295x2, and did it work fine?


----------



## wermad

Maybe an x99 issue? I got a RVE coming and I hope there's no problems







(edit: picked up a z170 setup instead, couldn't decide on a cpu for the x99).


----------



## Roaches

Welp. It seems I've run into a second round of bad luck. My RMA Devil 13 replacement card from roughly 6 months ago decided to give up the ghost. I started playing Homeworld Remastered after getting out of work, and suddenly randomly colored lines and streaks appeared across the screen and the game/OS hung. I hard-reset the system, only to be able to boot in safe mode; in normal mode there was no display on any monitor.









I'm back to a single Devil13 again.

They've treated me well by fairly quickly sending a replacement. Time to try again... but this time I feel I want a different card, as I hardly utilize the full potential of a 4-way CrossFire setup since I've shifted toward more casual gaming and am not currently playing anything graphically heavy. Don't suppose they'd give me a PCS+ in exchange for a dual-GPU card if I ask?


----------



## F4ze0ne

Quote:


> Originally Posted by *eqc6*
> 
> Just realize you have same gpu setup as I do. Have you tried plugging your main display on the 295x2 and did it work fine?


I've tried both and didn't have any issues. I'm currently using the 290X as the main card for the display.


----------



## eqc6

So, after spending hours troubleshooting, I finally fixed the problem: I needed to update my ASRock board's BIOS. Interesting, since the changelog specifically mentions an update for my 5930K CPU and improved PCI-E card compatibility. The main reason it took me hours is that after flashing the BIOS, my PC stopped POSTing and booting. I spent HOURS last night and this morning troubleshooting, and for a moment I gave up and accepted that I had bricked my mobo - I was ready to go buy a new one lol. But by some miracle it started POSTing again, and I also found out one of my RAM sticks had died and was preventing my PC from booting.

Anywho! With the new BIOS my 295x2 now works fine. Someone mentioned that Witcher 3 has negative scaling with 3 GPUs, and now that I can use my 295x2 on its own, this is absolutely true! It's crazy how unoptimized Witcher 3 is on 3 GPUs. Really a shame, since it looks amazing in 4K.

Thanks Wermad and F4ze0ne for your suggestions!


----------



## wermad




----------



## lCornholio

Hello fellow 295x2 owners! I need some help!

My 295x2 seems to be throttling; what's your advice to fix this? It can't maintain 100% utilization for more than a few minutes in, for example, a 3DMark test. By the way, here is my 3DMark score - bad? good? http://www.3dmark.com/fs/7177201
I have a mITX build, so it's probably a no-brainer that airflow is the main issue here.

Some other questions are, what are the best CrossfireX settings? Is it different for every game or any setting that's good for all games?

Ty for your help!


----------



## wermad

The cores are thermally limited to 75c. If your temps hit that, the card will throttle back your clocks. Open a monitoring program and run more tests/benchmarks.
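A rough way to picture what that limit does (a purely illustrative Python sketch: the 75c limit is the card's real spec, but the clock values and the per-degree step here are assumptions, not the actual PowerTune firmware behavior):

```python
# Illustrative model of thermal throttling on the 295X2 (NOT real firmware).
THERMAL_LIMIT_C = 75   # R9 295X2 core thermal limit
BASE_CLOCK_MHZ = 1018  # stock boost clock
STEP_MHZ = 13          # hypothetical throttle step per degree over limit

def throttled_clock(temp_c: float) -> int:
    """Return an effective core clock for a given core temperature."""
    if temp_c < THERMAL_LIMIT_C:
        return BASE_CLOCK_MHZ  # below the limit: full clocks
    # Back off one step per degree at/over the limit (illustrative only),
    # never dropping below a 300 MHz floor.
    over = int(temp_c - THERMAL_LIMIT_C) + 1
    return max(BASE_CLOCK_MHZ - over * STEP_MHZ, 300)

print(throttled_clock(70))  # 1018 -> no throttling below the limit
print(throttled_clock(78))  # 966  -> clocks pulled back once 75c is hit
```

This is why a monitoring log that shows clocks sagging in step with temps crossing 75c points at cooling, not a broken card.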


----------



## fat4l

Quote:


> Originally Posted by *lCornholio*
> 
> Yea I already know this and already said my GPU is throttling so ofc it has met the 75c limit, tell me something I don't know or read my reply a little more carefully.


You can repaste the card + change thermal pads and replace the fan on the radiator.
Or change the throttling limits in the bios. I would go with the first option first...


----------



## lCornholio

Quote:


> Originally Posted by *fat4l*
> 
> You can repaste the card + change thermal pads and replace the fan on the radiator.
> Or change the throttling limits in the bios. I would go with the first option first...


Repaste the card and change the thermal pads - how???
Replace the fan, yea I got that!









Is it OK for the card to go higher than 75c, and can I actually change that in the BIOS? I mean, the 295x2 is essentially two 290Xs on one card, and a single 290X's temperature limit is around 90c, right? I don't see why they set the limit at 75c...


----------



## NBrock

Quote:


> Originally Posted by *lCornholio*
> 
> Repaste card and change thermal pads, how???
> Replace fan yea I got that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it OK for the card to go higher than 75c and can I actually change that in BIOS? I mean the 295x2 is supposedly two 290x in one card and one 290x temperature limit is somewhere around 90c right? I don't see why they set the limit at 75c...


They limited the temp to 75c to protect the liquid cooling pumps. I replaced the thermal paste, put two good fans on the radiator, and did the VRM fan mod so you can run it at a higher RPM; temps were not a problem even overclocked.


----------



## fat4l

Quote:


> Originally Posted by *lCornholio*
> 
> Repaste card and change thermal pads, how???
> Replace fan yea I got that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it OK for the card to go higher than 75c and can I actually change that in BIOS? I mean the 295x2 is supposedly two 290x in one card and one 290x temperature limit is somewhere around 90c right? I don't see why they set the limit at 75c...


Look at this post here:
http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/8280_30#post_24713287
We helped @cmoney408 with his 295X2.
By changing the paste and pads he achieved 13-14C lower temps.
I'm not sure what fans he is using, but you can't go wrong with a Corsair SP120 twin pack (so you can do a push/pull config) or EK Vardar/Noctua static-pressure fans.









EDIT://
You can also mod your bios so it uses less voltage if you are not overclocking.
Amending the throttle limit is not a good idea, as it may harm the pump (as said above).


----------



## cmoney408

i was using stock at first, but then i switched to a sp120 high performance edition.
http://www.amazon.com/gp/product/B007RESFVI

i moved it so it's pulling. not sure if it's just a loud fan, or the fact that it's in a pull configuration, but it is pretty loud compared to the stock fan.

you can probably skip the fujipoly pads if you want to save a few bucks. on top of that you can get the grizzly pads for the ram and the kryonaut paste for free (minus shipping and tax).

performancepcs has a promotion where you get a full refund on 1 thermal pad and 1 paste from grizzly. i already received my refund! for like $10 in tax and shipping, you might as well do it.


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> i was using stock at first, but then i switched to a sp120 high performance edition.
> http://www.amazon.com/gp/product/B007RESFVI
> 
> i moved it so its pulling. not sure if its just a load fan, or the fact that its in a pull configuration, but it is pretty loud compared to the stock fan.
> 
> you can probably skip the fujipoly pads if you want to save a few bucks. on top of that you can get the grizzly pads for the ram and the kryonaut paste for free (minus shipping and tax).
> 
> performancepcs has a promotion where you get a full refund on 1 thermal pad from grizzly and 1 paste from grizzly. i did receive my refund already! for like $10 in tax and shipping you might as well do it.


I would say that buying new pads for the ram is pretty much useless. For the vrms it's necessary (at least in my point of view)








I believe he just needs 1 strip for the vrms, which would cost him ~$20.

Also, you can use a fan splitter to power both Corsair SP120 fans from the card directly.
@lCornholio, also don't forget about the correct mounting position of the rad.


On a side note...
Guys, I will be making a modified bios for you to get increased performance with no extra voltage or anything. Let me know what you think.
It will be based on the Sapphire OC bios.


----------



## cmoney408

i don't know which is better; i just suggested what was easier/cheaper. i wanted to cover all bases so i just did it all. in the end, with the rebates, it's only $45 to redo everything. i think it's money well spent and you'll have peace of mind replacing everything.


----------



## fat4l

lolololol.. I finally found fujipoly 17 W/mK in the EU (UK).
Let's get the party started









http://www.aquatuning.co.uk/thermal-pads-und-paste/thermal-pads/?p=1&o=5&n=12&f=580


----------



## fat4l




----------



## wermad

Quote:


> Originally Posted by *lCornholio*
> 
> Yea I already know this and already said my GPU is throttling so ofc it has met the 75c limit, tell me something I don't know or read my reply a little more carefully.


Some ppl have reported throttling while they were under the thermal limit.

If you know the answer to your question (poor air management), then why ask? If you chose to go with an itx build and this card, you signed up for a lot of heat, as hawaii is known to be very hot and power hungry. Alternatives besides the obvious have been suggested by other members, but this is like putting a band-aid on a gunshot wound imho. Your priority is to get good airflow to the card. Using different paste or pads may help, but overall you're still in the same boat, and you're probably suffocating the rest of your system as well. If you want more suggestions: why not get a Ti? It's very close to the 295x2, uses less power, and generates a lot less heat. There's no pump to worry about, and no crossfire issues. Other than that, the Fury X does have an aio, but I would side with the Ti for the extra vram.

edit:

on a side note, I got a new water cooling part







:


----------



## fat4l

@wermad,








Was it hard to mount those koolance shrouds on top of the X9? Really like your build, and I'm thinking of buying an X9.


----------



## wermad

Pretty straightforward, you just gotta get the alignment right, as there's little room for the mounting points. I still have mine (the shrouds); I haven't found a new owner for them. As for the X9, it's hard to choose between the standard black or the "snow" edition


----------



## fat4l

Quote:


> Originally Posted by *wermad*
> 
> Pretty straight forward, just just gotta get the alignment right as there's little room for the mounting points. I still have mine(the shrouds), I haven't found a new owner for them. As for the X9, its hard to choose between the standard black or the "snow" edition


I would go with black ofc.
Black with red lights!








I also have an external radiator, a MO-RA3 with 8x 200mm red LED fans in push/pull.
Not sure how I would loop it all together with the internal rads and components, hmmm










Spoiler: PICS


----------



## Mega Man

Quote:


> Originally Posted by *lCornholio*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Cores are thermal limited to 75c. If your temps are this high, it will throttle back your clocks. Open any monitoring program and run more tests/benchmarks.
> 
> 
> 
> Yea I already know this and already said my GPU is throttling so ofc it has met the 75c limit, tell me something I don't know or read my reply a little more carefully.
Click to expand...

You obviously know everything, so my question is: why are you asking us for help? Frankly, you treated wermad poorly when he did suggest a VALID fix.

Let's break down your first post ("it is broken and throttling"):

We have no proof, nor evidence. Should we just tell everyone to go buy a phase change unit?


----------



## lCornholio

Quote:


> Originally Posted by *cmoney408*
> 
> i was using stock at first, but then i switched to a sp120 high performance edition.
> http://www.amazon.com/gp/product/B007RESFVI
> 
> i moved it so its pulling. not sure if its just a load fan, or the fact that its in a pull configuration, but it is pretty loud compared to the stock fan.
> 
> you can probably skip the fujipoly pads if you want to save a few bucks. on top of that you can get the grizzly pads for the ram and the kryonaut paste for free (minus shipping and tax).
> 
> performancepcs has a promotion where you get a full refund on 1 thermal pad from grizzly and 1 paste from grizzly. i did receive my refund already! for like $10 in tax and shipping you might as well do it.


All the pads stuff seems a bit complicated to me, but replacing the fan might do the trick, at least to avoid throttling.
I wonder which is best for the 295x2, push or pull? Also, I have my 295x2 rad set as exhaust because I'm running a compact mITX case and setting it as intake made the case very hot, so I think it's better this way.


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> You obviously know everything, so my question is: why are you asking us for help? Frankly, you treated wermad poorly when he did suggest a VALID fix.
> 
> Let's break down your first post ( it is broken and throttling )
> 
> We have no proof, nor evidence. Should we just tell every one to go buy a phase change unit?


Meh, i don't mind.

New gpu setup:


----------



## cmoney408

Quote:


> Originally Posted by *lCornholio*
> 
> All the pads stuff seems a bit complicated to me, but replacing the fan might do the trick to at least avoid throttling.
> I wonder which one is best for the 295x2, Push or Pull? Also I have my 295x2 rad set as exhaust because I'm running an mITX compact case and setting it as intake made the case very hot, so I think it's better this way.


the pad stuff is really easy if you look at my first comment about it. i can always help if you have questions too. plus, with the performance-pcs deal, it's a great time to do it.

i usually use noctua, at least for my h100i and other case fans, but i went with the corsair sp120 high performance for the 295x2 because of this video





push or pull doesn't really affect performance. pull makes it easier to dust the rad (spray the rad from inside the case to blow dust out the back), but pull puts the fan closer to the outside of the case, which i think is louder in my room (vs the fan noise being behind the rad in a push setup).





i would not recommend it as an intake (in push or pull), only exhaust, but that's my personal preference.


----------



## fat4l

push-pull tha best!


----------



## remedy1978

Debating selling my Sapphire R9 295X2. Picked up a 980 Ti for a good price. Just not happy with the driver support and the issues with FreeSync. Can anyone convince me otherwise? Why should I keep the card?


----------



## cmoney408

i would only trade for a 980ti if it was this version:





just, kidding, but not kidding.

i thought about it too a few months back when i was clamoring for 4k 60hz and no mini dp adapter was available; i thought i should just switch to nvidia for the hdmi 2.0.

card for card, they are about the same in AAA titles: the 295x2 wins in some games, the 980ti wins in others.

but really, it comes down to cost. when i looked into it a few months back, the 295x2 was going for less than $500 on ebay. i would NEVER pay $200 more for the same performance (after ebay fees/tax).

but on ebay right now there is a 295x2 going for $540 with a day left, so that's a more difficult choice.

personally, i put the benefit of single-card compatibility at a price of $100, meaning if it cost me $100 or less out the door, i'd probably switch back to nvidia and get a 980ti.


----------



## Dagamus NM

Quote:


> Originally Posted by *remedy1978*
> 
> Debating selling my Sapphire R9 295X2. Picked up a 980 TI for a good price. Just not happy with the driver support and issues with Freesync. Can anyone convince me otherwise? Why should I keep the card?


To build a second computer. If that is not of interest then just sell.


----------



## fat4l

Quote:


> Originally Posted by *remedy1978*
> 
> Debating selling my Sapphire R9 295X2. Picked up a 980 TI for a good price. Just not happy with the driver support and issues with Freesync. Can anyone convince me otherwise? Why should I keep the card?


or get into bios modding and squeeze the most out of the card.
I'm getting about 30% more performance than a 980Ti. I honestly believe that with a 295X2 you can easily get 20%+, assuming CF is supported


----------



## NBrock

Quote:


> Originally Posted by *remedy1978*
> 
> Debating selling my Sapphire R9 295X2. Picked up a 980 TI for a good price. Just not happy with the driver support and issues with Freesync. Can anyone convince me otherwise? Why should I keep the card?


You could always get a second rig and dedicate that baby to running Folding@Home and join the OC Folding club.


----------



## cmoney408

Quote:


> Originally Posted by *fat4l*
> 
> or get to the bios modding and squeeze the most of the card.
> I'm getting about 30% more performance than 980Ti. I honestly believe that with 295X2 you can easily get 20%+ assuming that CF is supported


i am kinda nervous messing with my bios, but:

1) what's the difference between changing the bios and just using crimson to OC (right now i have +8% on the GPU clock, +50% on the power limit, and the memory clock set to 1600mhz)?
1b) if your bios gives ~30% more performance over stock, what do you think it would give over my OC, if any improvement?

2) would your bios be in addition to a crimson OC? if i did the bios change, would i leave the crimson settings at stock or still OC on top of the bios?

sorry if these are newb questions; i have pretty much 0 experience with changing a GPU bios for gains.


----------



## fat4l

Quote:


> Originally Posted by *cmoney408*
> 
> i am kinda nervous messing with my bios, but:
> 
> 1) whats the difference between changing the bios and just using crimson to OC (right now i have +8% on GPU clock, +50% on power limit, and memory clock set to 1600mhz)
> 1b) if your bios gives ~30% more performance from stock, what do you think it would be from my OC if any improvement?
> 
> 2) would your bios be in addition to a crimson OC? if i did the bios change would i leave the crimson settings at stock or still OC on top of the bios?
> 
> sorry if these are newb questions, i just have pretty much 0 experience with changing a GPU bios for gains.


*1.* Well, with the bios you can do pretty much anything u want, mainly change the memory timings and get up to 5% free performance gain.
*1b.* My bios gives me (including the whole OC) about a 20% boost, see here: stock vs modded-bios OC http://www.3dmark.com/compare/fs/6593207/fs/6590481
My graphics score is ~13000, and a 980Ti overclocked to ~1530/8100MHz gets about ~10000 points, which is a 30% difference in performance.

*2.* You can do whatever you want. However, you can set everything in the bios, including power limits, clocks, volts, timings, and aux volts (which you can't change in software), so you don't even need to use the software/crimson to OC.










On the other hand, I understand that my card is a little bit different, but like I said, if an oced 980Ti is scoring ~10000, your card with mods would easily do 12000+, which is still 20% over a 980Ti. But hey... it's "amd and their crossfire support", right
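The percentages above are just ratio arithmetic on the quoted 3DMark graphics scores (13000 for the modded 295X2, ~10000 for an overclocked 980 Ti, 12000+ for a modded stock-ish card); a quick sketch:

```python
def pct_faster(score_a: float, score_b: float) -> float:
    """Percent by which score_a exceeds score_b."""
    return (score_a / score_b - 1.0) * 100.0

r9_295x2_modded = 13000  # modded-BIOS graphics score quoted above
gtx_980ti_oc = 10000     # approx. overclocked 980 Ti graphics score

print(round(pct_faster(r9_295x2_modded, gtx_980ti_oc)))  # 30 (% faster)
print(round(pct_faster(12000, gtx_980ti_oc)))            # 20 (% faster)
```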


----------



## EpitaphUnmei

Can anyone point me in the right direction for a good xfx R9 295x2 bios


----------



## cmoney408

Quote:


> Originally Posted by *EpitaphUnmei*
> 
> Can anyone point me in the right direction for a good xfx R9 295x2 bios


its short, but its the best one i could find

https://en.wikipedia.org/wiki/AMD_Radeon_Rx_200_series#Radeon_R9_295X2

or do you mean bios that is mentioned in the comment directly before yours (read the last 10 or so comments)?


----------



## EpitaphUnmei

Quote:


> Originally Posted by *cmoney408*
> 
> its short, but its the best one i could find
> 
> https://en.wikipedia.org/wiki/AMD_Radeon_Rx_200_series#Radeon_R9_295X2
> 
> or do you mean bios that is mentioned in the comment directly before yours (read the last 10 or so comments)?


I meant the bios that comment was talking about - the sapphire one. Will it work for an xfx?


----------



## fat4l

Quote:


> Originally Posted by *EpitaphUnmei*
> 
> I meant bios that comment was talking about an sapphire brand will it work for an xfx?


yes, it will.
the sapphire OC bios is the best. You can also try the modded one I posted a few posts above.


----------



## xer0h0ur

For the 295X2 brethren dealing with Crimson's quirk of not disabling CrossFire when you do so in game profiles: I suggest you install the latest Crimson driver, then use the Catalyst 15.11.1 Beta driver to install only the CCC. That way you can use the CCC to create an application profile that will disable CrossFire for you.


----------



## Wrecker66

If I don't overclock two 295x2 cards with an X99 board and a 5930K, is an hx1200i PSU good enough?


----------



## wermad

Quote:


> Originally Posted by *Wrecker66*
> 
> If i don't overclock two 295x cards with x99 board and 5930k is hx1200i PSU good?


Read the first post. I made a list that gives a general outline for this common concern with the 295x2. It's not a question of wattage, it's a question of amperage. Unfortunately, this psu does not meet the amd specs:

1x 295x2: 50 amps, or 28 amps on each of the card's two connectors
2x 295x2: double the above
+ your system: fans, pumps, accessories, hdd/ssd, etc. (budget ~20-30 amps)

I pulled 1200w max *at the wall* on a z97 setup with stock clocks all around.

Quote:


> Corsair HX1200i High-Performance ATX Power Supply
> AC Input: 100V - 240V, 15A - 7.5A, 47Hz - 63Hz
> DC Output: +3.3V @ 30A, +5V @ 30A, *+12V @ 100A*, -12V @ 0.8A, +5Vsb @ 3.5A
> Max combined wattage: 150W (+3.3V/+5V), 1200W (+12V), 9.6W (-12V), 17.5W (+5Vsb)
> Total Power: 1200W


http://www.corsair.com/en-us/hxi-series-hx1200i-high-performance-atx-power-supply-1200-watt-80-plus-platinum-certified-psu

edit: if you plan to get the ax1500i, make sure it doesn't run in multi-rail protection mode.

Alternatively: if your setup/case can accommodate dual psu's, you can run two in tandem to power your setup.

double edit: if you read the HardOCP review on a Z77 board, they couldn't get quads to run on an Enermax 1350w; they resorted to using a second psu for the second card.
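A back-of-the-envelope version of the amperage math (Python sketch; the per-card figure follows the AMD guidance above, and the 25 A system budget is an assumed midpoint of the ~20-30 A estimate):

```python
# Rough +12V amperage budget for R9 295X2 setups, per the AMD guidance
# above. SYSTEM_BUDGET_AMPS is an assumed midpoint of the ~20-30 A estimate.
AMPS_PER_295X2 = 50      # total +12V amps per card (28 A on each 8-pin)
SYSTEM_BUDGET_AMPS = 25  # CPU, fans, pumps, drives, accessories

def required_12v_amps(num_cards: int) -> int:
    """Total +12V amperage the PSU should supply for num_cards 295X2s."""
    return num_cards * AMPS_PER_295X2 + SYSTEM_BUDGET_AMPS

for cards in (1, 2):
    amps = required_12v_amps(cards)
    print(f"{cards}x 295x2: ~{amps} A on +12V (~{amps * 12} W)")
# The HX1200i's single +12V rail is rated 100 A, which is why two cards
# (~125 A) fall short of the spec even though 1200 W sounds like enough.
```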


----------



## fat4l

It would be much cheaper to go SuperFlower/EVGA 1600W than the Corsair ax1500i, and the quality is still there


----------



## Wrecker66

Right now I run an ax1200i with a 295x2 and a 290x... peak power, when I look at Corsair Link, is around 800-850w.

The thing is, I want to put the ax psu into another system, and I wanted to buy an hx for this rig with two 295x2 cards.


----------



## wermad

Quote:


> Originally Posted by *Wrecker66*
> 
> Now i run ax1200i with 295x2 and 290x...peak power when i look at corsair link is around 800 - 850w.
> 
> The thing is that i want ax psu to put into other system and wanted to buy hx for this with two 295x2 cards.


Again, wattage is not the problem, it's amperage. If you must have a single corsair power supply for quad 295x2's, the ax1500i is your only option. If you want to do it your own way, just know you're not meeting the amd recommended criteria. Every 295x2 comes with this warning about amperage.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Again, wattage is not the problem, it's amperage. If you must have a single corsair power supply for quad 295x2's, the ax1500i is your only option. If you want to do it on your own with your own method, just know you're not meeting the amd recommended criteria. Every 295x2 comes with this warning about amperage.


Only option? Super flower and EVGA g1600/p1600/t1600 or super flower 2KW should give enough amperage per cable off of the single 12V rail.

I run a g1600 with mine and have no power related issues.


----------



## wermad

Quote:


> Originally Posted by *wermad*
> 
> Again, wattage is not the problem, it's amperage. If you must have a single *corsair* power supply for quad 295x2's, the ax1500i is your only option. If you want to do it on your own with your own method, just know you're not meeting the amd recommended criteria. Every 295x2 comes with this warning about amperage.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> Only option? Super flower and EVGA g1600/p1600/t1600 or super flower 2KW should give enough amperage per cable off of the single 12V rail.
> 
> I run a g1600 with mine and have no power related issues.
Click to expand...

Seems like this chap is set on corsair only. Not sure why, but he insists. My list, which I've cited a billion times, has a few other units that can do crossfire 295x2. If you shop used, you can buy a couple of good 1000w or 1300w units much cheaper than a used G/P1600 or AX1500. Only catch: you need to accommodate two psu's. Other than that, the G1600 from Lepa can also run two of these, and used ones can be had for under $150. Not the most practical or best rail distribution, but a cheap alternative. Since my water loop grew, I'm not able to fit dual psu's as I have no more space for it (lol, this is funny indeed for a TX10







).

And on the superflowers, you gotta double-check to make sure they're for US households (standard 115v outlet); otherwise, you need to make some modifications to your outlet and breaker. I'm currently set on a standalone 20amp breaker (good for 2000w).


----------



## Wrecker66

Quote:


> Originally Posted by *wermad*
> 
> Again, wattage is not the problem, it's amperage. If you must have a single corsair power supply for quad 295x2's, the ax1500i is your only option. If you want to do it on your own with your own method, just know you're not meeting the amd recommended criteria. Every 295x2 comes with this warning about amperage.












I understand. That's why I asked here what could happen if I buy the hxi for 2 cards. I didn't know the details like I do now. There is no way I'm going to risk blowing up my system because of the PSU.









I don't want to be set on Corsair only, but my options to buy here are limited... I want to buy a new unit, and I saw the list of compatible PSUs. I also like Corsair and have used many of their PSUs; I have an ax1200i, ax760, rm1000, and tx750, and have had a few CS and CX units. They all worked fine.

my options are:
Corsair 1500axi
EVGA SuperNOVA P2 1600W
ENERMAX Platimax 1500W
SUPER-FLOWER Leadex Platinum 2000W (8 Pack)
ENERMAX Lepa P1700 1700W
SUPER-FLOWER Leadex Gold 1600W
ENERMAX Lepa G Series 1600W

what do you think I should get? I put them in order from most $ to least $

thx for the help!


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Seems like this chap is set on corsair only. Not sure why, but he insists. My list, which i've cited a billion times, has a few of other units that can do crossfire 295x2. If you shop used, you can buy a couple of good 1000w or 1300w units and its much cheaper then a used G/P1600 or AX1500. Only catch, you need to accommodate two psu's. Other then this, the G1600 from Lepa can also run two of these and used ones can be had for under $150. Not the most practical or best rail distribution, but a cheap alternative. Since my water loop grew, I'm not able to get dual psu's as I have no more space for it (lol, this is funny indeed for a TX10
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> And the superflowers, gotta double check to make sure their for US households (standard 115v outlet), otherwise, you need to make some modifications to your outlet and breaker. I'm currently set on a standalone 20amp breaker (good for 2000w).


Ha, sorry. Totally overlooked the corsair only part. My bad.

I had to mod my outlet for my ups. No big deal unless of course I burn my house down and that is the cause. I don't see that happening though.

Depending on variables in your system, you could pull greater than 1500W in some situations. Go 1600W or greater, single rail, if you have the extra dough.


----------



## wermad

Quote:


> Originally Posted by *Wrecker66*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i understand. that's why i asked here what can happen if i buy hxi for 2 cards. I didn't know the details like i do now. There is no way that i'm going to risk blowing my sistem becouse PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i don't want to be set on Corsair only but my options to buy here are limited...i want to buy a new unit and i saw the list of compatible PSU. i also like corsair and have used many of their PSU. i have ax1200i, ax760, rm1000, tx750, and had few CS and CX's. they all worked fine.
> 
> my options are:
> Corsair 1500axi
> EVGA SuperNOVA P2 1600W
> ENERMAX Platimax 1500W
> SUPER-FLOWER Leadex Platinum 2000W (8 Pack)
> ENERMAX Lepa P1700 1700W
> SUPER-FLOWER Leadex Gold 1600W
> ENERMAX Lepa G Series 1600W
> 
> what do you think i should get? i put them in order that is most $ to less $
> 
> thx for the help!


Corsair 1500axi
EVGA SuperNOVA P2 1600W
ENERMAX Platimax 1500W
SUPER-FLOWER Leadex Platinum 2000W (8 Pack)
ENERMAX Lepa P1700 1700W
SUPER-FLOWER Leadex Gold 1600W
ENERMAX Lepa G Series 1600W

The three enermax/lepa units are all six-rail units, and as long as you connect them properly, they're pretty much the same. You may find the G1600 on sale or as a used unit.

The Corsair: just make sure it's not set to multi-rail protection mode.

The EVGA should be fine too. I know a guy who ran one with two 295x2s.

The Leadex 1600w seems good, similar to the EVGA. I would skip the 2kw unless you need some epeen flavah added (







).


----------



## Mega Man

i want the 2kw, but i won't buy from ocuk or w.e.

also, the superflower IS the EVGA, fyi (superflower is the oem)


----------



## wermad

Hmmm, thought you said you had one?

Yeah, they look similar so I had a hunch it was the same platform.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> EVGA should be fine too. I know a guy who ran one w/ two 295x2.


I don't know about the 1600 P2, but I am using the 1600 G2 with my quad 295x2s. Just got done playing Mad Max for about three hours on my Asus PB287Qs, and boy is my office warm. Making me sleepy.


----------



## wermad

Same with my g1600; it's going hard at 4k, but most of my heat is coming from the radiator exhaust out of the case.

Edit: anyone want a g1600? they're ~$100-150 used on ebay right now.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Same with my g1600, it's going hard at 4k but mainly my heat is coming from the radiator exhaust from the case
> 
> Edit: anyone want a g1600, they're ~$100-150 used on ebay right now.


Same, though mine exhausts into the room. I also have a couple of fans on the Intel 750s, and those run a lot hotter than I thought they would. Waterblocks should be here soon though, so I will tear this one down and see if that thermal grizzly stuff gives me better temps on the cards. I used a tube of gelid gc extreme and it was hard as clay, so I don't think I got a good spread.

I want a new case now too, so that is a conundrum.


----------



## ColeriaX

Thinking of selling my 2x 295X2s. I want to be prepared to upgrade to Polaris while still getting a decent deal. What do you guys think my two 295s with aquacomputer blocks and active backplates would fetch







?


----------



## wermad

Head to the appraisal section as well.

~$500-600 each with blocks and og coolers. Though I've seen them sit at $400 stock.

If dx12 allows you to add more than four gpu's, I would get two more for octo-fire







.


----------



## xer0h0ur

My hardware is never leaving this rig. I would only be building a new one if anything but that is a ways away and its going to be a high end Zen FX CPU and a single Polaris GPU. Never had a fully AMD build before.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> Head to the appraisal as well.
> 
> ~500-600 with blocks and og coolers each. Though I've seen them sit at 400 stock.
> 
> if dx12 allows you add more gpu's then four and would get two more for octo-fire
> 
> 
> 
> 
> 
> 
> 
> .


That would be fun. I can only imagine how warm my office would get.


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> Head to the appraisal section as well.
> 
> ~$500-600 each with blocks and OG coolers, though I've seen them sit at $400 stock.
> 
> If DX12 lets you run more than four GPUs, I would get two more for octo-fire.


Nah, more like $5 shipped, each.

I'll make you a deal: I'll pay $5 each, plus I'll even pay shipping for you. What do you say?


----------



## ColeriaX

Heh. I know they aren't worth what I paid for them, but $500 each with blocks? I'd have to say no to that, my friend. The blocks alone were almost $500.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ColeriaX*
> 
> Heh. I know they aren't worth what I paid for them, but $500 each with blocks? I'd have to say no to that, my friend. *The blocks alone were almost $500*.


You got ripped off if that's the case.....


----------



## Mega Man

Depends on location. CA, maybe. Like you said though, $5 plus I'll even pay shipping, each.


----------



## ColeriaX

How so? 2x blocks with 2x active backplates and a bridge was roughly $500. Not sure how that means I got ripped off *shrug*


----------



## Mega Man

I think he misread your statement; read wrong, it sounds like you paid $500 per block.

But yes, $500 with blocks sounds about right (each, for a total of $1k for both).

You may get more, but I doubt it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ColeriaX*
> 
> How so? 2x blocks with 2x active backplates and a bridge was roughly $500. Not sure how that means I got ripped off *shrug*


Didn't realise you got the Aquacomputer ones; had to dig through your pics to find 'em.

If that's the case then yes, the price does sound about right, if a little high imo.

I'd say $600 each would be fair


----------



## ColeriaX

That's what I was thinking as well. Thanks!


----------



## wermad

I've been buying a bunch of blocks for a while now (since I have a bad habit of upgrading too frequently), and blocks are one of the worst depreciating parts around once things move on (next-gen cards). Since Koolance sells their block for ~$100 and I've seen them on ebay for ~$100-120, I wouldn't price the AC blocks too high. If you do, you'll end up w/ ppl asking to part out the cards and blocks. One reason why I don't spend a ton on blocks. $600 would be a good price to start off, but I wouldn't take any less than $550 w/ the blocks. Again, since I've seen these cards hit $400 on their own, it's hard to justify a $200 used waterblock. I was lucky enough to find my Koolance blocks for $100 each BNIB, as it was damn hard to find anything under $200 BNIB. Also, think about the price bracket now; ppl are less inclined to spend more on a waterblock at this range since the stock cooler is pretty decent. If you want to, split them up and ask the buyers to cover shipping. Good luck w/ this.


----------



## StillClock1

Hey guys, does anyone know of a website that is selling an EK waterblock for this card?

Alternatively, would anyone who has a properly functioning R9 295x2 fitted with any waterblock be interested in selling theirs?

Apologies if this is not the correct place for this post.

Thanks


----------



## Mega Man

Quote:


> Originally Posted by *Hoff248*
> 
> Hey guys, does anyone know of a website that is selling an EK waterblock for this card?
> 
> Alternatively, would anyone who has a properly functioning R9 295x2 fitted with any waterblock be interested in selling theirs?
> 
> Apologies if this is not the correct place for this post.
> 
> Thanks


I would like to introduce you!
Quote:


> Originally Posted by *ColeriaX*
> 
> Thinking of selling my 2x 295X2's. Want to be prepared to upgrade for polaris while still getting a decent deal. Whattdya guys think my 2 295's with aquacomputer blocks and active backplates would fetch
> 
> 
> 
> 
> 
> 
> 
> ?


Don't worry, I only take a 10% finder's fee! (I kid)


----------



## StillClock1

Quote:


> Originally Posted by *Mega Man*
> 
> i would like to introduce you !
> dont worry i only take 10% finders fee ! ( i kidd )


You've certainly earned your cut, thanks. I'll PM him.


----------



## Dagamus NM

Quote:


> Originally Posted by *Hoff248*
> 
> You've certainly earned your cut, thanks. I'll PM him.


Are you looking for just the block or the block and the card?

I don't have either available, just looking for clarification.


----------



## fat4l

Quote:


> Originally Posted by *Hoff248*
> 
> You've certainly earned your cut, thanks. I'll PM him.


I'm also selling my Asus Ares 3, but it's more costly than the 295X2... and UK only.


----------



## StillClock1

Quote:


> Originally Posted by *Dagamus NM*
> 
> Are you looking for just the block or the block and the card?
> 
> I don't have either available, just looking for clarification.


It will have to depend on the pricing. I already have an R9 295x2 and it bugs me that its fan runs full time. I still need to do more work on whether quadfire is a good idea for these cards - I've heard more bad than good.


----------



## StillClock1

Quote:


> Originally Posted by *fat4l*
> 
> I'm also selling my Asus Ares 3, but it's more costly than the 295X2... and UK only.


Thanks a lot for letting me know, unfortunately I'm in the US. What is your reason for not shipping internationally?


----------



## fat4l

Quote:


> Originally Posted by *Hoff248*
> 
> Thanks a lot for letting me know, unfortunately I'm in the US. What is your reason for not shipping internationally?


It's an expensive card and the shipping takes too long; I'm just worried that something could happen to it in transit.

Well, maybe I could send it abroad, but the postage cost with insurance cover (let's say up to $1000) would be too costly.


----------



## StillClock1

Quote:


> Originally Posted by *fat4l*
> 
> It's an expensive card and the shipping takes too long; I'm just worried that something could happen to it in transit.
> 
> Well, maybe I could send it abroad, but the postage cost with insurance cover (let's say up to $1000) would be too costly.


Ouch, yes indeed. What would you be willing to sell the card for to a UK buyer?


----------



## wermad

As a frequent seller who has shipped internationally, I can say it gets expensive. I only ship Priority International or FedEx/UPS as it's quick, but it's hella expensive. Ultimately, it's up to the buyer. I know some folks live in a country w/ crazy high prices where buying overseas is the best bang for their buck. I shipped a RIVE to a guy in Malaysia and he paid the very expensive shipping, though he did mention it's far more expensive domestically.

I highly recommend you do not go w/ the cheap international shipping options, as they take forever and packages can get lost for a few *months*.


----------



## fat4l

Yeah, one guy asked me if I'd ship it to Kazakhstan.
That was £200 as far as I remember... that's... ugh... $300.


----------



## wermad

If you can fit it inside a medium box, it's $70 through USPS Priority. Priority International takes less than a week, two at most.


----------



## SLK

Quote:


> Originally Posted by *Hoff248*
> 
> Hey guys, does anyone know of a website that is selling an EK waterblock for this card?
> 
> Alternatively, would anyone who has a properly functioning R9 295x2 fitted with any waterblockbe interested in selling theirs?
> 
> Apologies if this is not the correct place for this post.
> 
> Thanks


I have a used acetal/copper EK block with backplate and thermal pads, along with a Diamond 295x2. Never installed the block yet; I was thinking about going with a Nano and a block so I can fit my gigantic tube res in my PC.


----------



## cegonc

I have an R9 295x2 and the Samsung U28E590 Freesync display, running Windows 10 Pro x64 and the Crimson 16.1 drivers, but I cannot enable Freesync in the windmill demo (the option is greyed out).

I enabled Freesync in the monitor's OSD and from Crimson. I've also tried two different DP cables. No luck!

I tried a few games and still get screen tearing when Vsync is off.

If anyone knows a fix for this problem please feel free to share.


----------



## F4ze0ne

Quote:


> Originally Posted by *cegonc*
> 
> I have an R9 295x2 and the Samsung U28E590 Freesync display, running Windows 10 Pro x64 and the Crimson 16.1 drivers, but I cannot enable Freesync in the windmill demo (the option is greyed out).
> 
> I enabled Freesync in the monitor's OSD and from Crimson. I've also tried two different DP cables. No luck!
> 
> I tried a few games and still get screen tearing when Vsync is off.
> 
> If anyone knows a fix for this problem please feel free to share.


It doesn't work for me either. I think the windmill demo is bugged with newer drivers.


----------



## xer0h0ur

Those Freesync demos were pre-Crimson so god knows what or why it is acting like that. I have a Freesync monitor but I don't even use Freesync on it. Just would introduce input lag for me on CSGO. Haven't attempted to use Freesync yet on other games.


----------



## Paul17041993

Quote:


> Originally Posted by *cegonc*
> 
> I have an R9 295x2 and the Samsung U28E590 Freesync display, running Windows 10 Pro x64 and the Crimson 16.1 drivers, but I cannot enable Freesync in the windmill demo (the option is greyed out).
> 
> I enabled Freesync in the monitor's OSD and from Crimson. I've also tried two different DP cables. No luck!
> 
> I tried a few games and still get screen tearing when Vsync is off.
> 
> If anyone knows a fix for this problem please feel free to share.


I'm under the impression that Freesync works regardless of whether vsync is enabled, but vsync should still cap the framerate to the display's max Hz.
Or at least that's what it's like on my 290X. Have you tried disabling crossfire to see if it makes a difference?


----------



## cegonc

Quote:


> Originally Posted by *Paul17041993*
> 
> I'm under the impression that freesync works regardless of if vsync is enabled or not, but vsync should still cap the maximum to the max Hz on the display.
> Or at least that's what it's like on my 290X, have you tried disabling crossfire to see if it makes a difference?


Even with crossfire off there is no difference; the games I tested continue to show screen tearing. I can only solve the tearing problem by enabling Vsync, even in the windmill demo.

Anyway, thanks for the suggestion.


----------



## xer0h0ur

Quote:


> Originally Posted by *cegonc*
> 
> Even with crossfire off there is no difference, the games I tested continue to show screen tearing. I can only solve the tearing problem by enabling Vsync, even on wind mill demo.
> 
> Anyway, thanks for the suggestion.


Well bud, if you are using Crimson drivers then I guess you're not aware that you CAN'T disable crossfire at all. Game profiles do not disable crossfire; that setting is hopelessly broken and has been since the very first Crimson driver. So in reality you still don't know whether it's because of crossfire or not.

I can though. That is because I am running a hybrid Crimson/Catalyst installation. Basically I installed Crimson 16.1 then opened the installation package for Catalyst 15.11.1 and only installed the Catalyst Control Center. Manually creating application profiles in the CCC is the only true method of disabling crossfire on the 295X2.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cegonc*
> 
> Even with crossfire off there is no difference, the games I tested continue to show screen tearing. I can only solve the tearing problem by enabling Vsync, even on wind mill demo.
> 
> Anyway, thanks for the suggestion.
> 
> 
> 
> Well bud. If you are using Crimson drivers then I guess you're not aware that you CAN'T disable crossfire at all. Game profiles do not disable crossfire. That setting is hopelessly broken and has been since the very first Crimson driver. So in reality you still don't know if its because of crossfire or not.
> 
> I can though. That is because I am running a hybrid Crimson/Catalyst installation. Basically I installed Crimson 16.1 then opened the installation package for Catalyst 15.11.1 and only installed the Catalyst Control Center. Manually creating application profiles in the CCC is the only true method of disabling crossfire on the 295X2.

Sounds like that's what I'll have to do when I put my 295x2 back in.


----------



## cegonc

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well bud. If you are using Crimson drivers then I guess you're not aware that you CAN'T disable crossfire at all. Game profiles do not disable crossfire. That setting is hopelessly broken and has been since the very first Crimson driver. So in reality you still don't know if its because of crossfire or not.
> 
> I can though. That is because I am running a hybrid Crimson/Catalyst installation. Basically I installed Crimson 16.1 then opened the installation package for Catalyst 15.11.1 and only installed the Catalyst Control Center. Manually creating application profiles in the CCC is the only true method of disabling crossfire on the 295X2.


I also tested with Catalyst 15.7. Same results.

Thank you anyway.


----------



## fat4l

That was always this way afaik.
With CF in your system you can't enable/disable Freesync in the demo. However, Freesync is working.


----------



## xer0h0ur

Quote:


> Originally Posted by *cegonc*
> 
> I also tested with Catalyst 15.7. Same results.
> 
> Thank you anyway.


I wouldn't be testing with one of the first drivers to support Freesync if I were you. I don't even remember whether 15.7 had crossfire Freesync support yet.

Edit: 15.7 was the very first driver to "support" Freesync crossfired.


----------



## fat4l

Guys, have you noticed any flickering of the screen when CF + FS is enabled? It's pissing me off.


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> Guys, have you noticed any flickering of the screen when CF + FS is enabled? It's pissing me off.


Well, I have the same monitor that you do, except instead of using Freesync I opted to use AMA High, Blur Reduction ON, AREA 100, then I installed the BenQ ICC profile and calibrated the colors according to the review at http://www.tftcentral.co.uk/reviews/benq_xl2730z.htm. Lastly, I went into the AMD driver and cranked up the color saturation on the monitor to my liking. I saved all of those settings to profile 1 on the puck for easy switching. Those are my settings for CS:GO and they work to great effect. Despite not running Freesync or vsync, the game looks buttery freakin' smooth at 250+ FPS running on a single GPU.

What game do you experience this in? I might be able to give it a try with freesync enabled if I have it.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well I have the same monitor that you do except instead of using freesync I opted instead to use AMA High, Blur Reduction ON, AREA 100 then I installed the BenQ ICC profile and calibrated the colors according to the review at http://www.tftcentral.co.uk/reviews/benq_xl2730z.htm Lastly I went into the AMD driver and cranked up the color saturation on the monitor to my liking. I saved all of those settings to profile 1 on the puck for easy switching. Those are my settings for CS:GO and they function to great effect. Despite not running freesync or vsync the game looks buttery freakin smooth at 250+ FPS running on a single GPU.
> 
> What game do you experience this in? I might be able to give it a try with freesync enabled if I have it.


For example, World of Warcraft.
I rolled my drivers back and it looks better now. Will wait till the next Crimson update and see how it is.

See these:
https://community.amd.com/thread/191877
https://www.reddit.com/r/3w4iv1/freesync_flickeringstrobing/


----------



## wermad

Got a good deal on these puppies. One 750 will power the second system (family PC), and the 1000 and second 750 will power my rig. Almost done adapting my cables to these guys. I thought about some nice Bitfenix extensions, but I'll wait for my tax refund for those.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> got a good deal on these puppies. One 750 will power the second system (family pc) and the 1000 and second 750 will power my rig. Almost done adapting my cables to these guys. I thought about some nice Bitfenix extensions, but I'll wait for my tax refund for those.
> 
> 
> Spoiler: Warning: Spoiler!


Ooooo..... shiny.

Always liked the look of those PSUs tbh.

Speaking of deals though, I got a very nice deal ($120 USD) on this lot.

Going to need some extra fittings + time to test out the pump to make sure it works (got it for free, and the previous owner said it was a bit dodgy).

All in all though... I'm very happy.


----------



## lCornholio

How can I fix this throttle???


----------



## xer0h0ur

Well ol' chap. First you have the unenviable task of figuring out whether you have thermal throttling from the GPUs overheating, the VRMs overheating, or driver-based throttling. Good luck.

Pro tip: try using ClockBuster to see if it's just driver-based throttling. That app tricks the video card into running the clocks at full tilt. It won't help, though, if you're having thermal throttling.


----------



## xer0h0ur

Probably should have given you a link: http://forums.guru3d.com/showthread.php?t=404465


----------



## sub0seals

Anyone know when they will stock this again? I'm looking to get the Aquacomputer waterblock setup, front and back, for my XFX Radeon R9 295X2. I went to the link and it says this item is no longer available as of today (2-3-2016).


----------



## Sgt Bilko

Quote:


> Originally Posted by *sub0seals*
> 
> Anyone know when they will stock this again? I'm looking to get the Aquacomputer waterblock setup, front and back, for my XFX Radeon R9 295X2. I went to the link and it says this item is no longer available as of today (2-3-2016).


I don't think you'd find any tbh. The card isn't new anymore and the people who wanted the blocks have already bought them. You might be able to find a used one somewhere though.


----------



## SLK

Koolance blocks are the only ones that are still available new.


----------



## wermad

Alphacool (not "full cover"):

http://www.performance-pcs.com/alphacool-nexxxos-gpx-ati-r9-295x2-m01-w-backplate-black.html

http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/17662/alphacool-nexxxos-gpx-ati-r9-295x2-m01-mit-backplate-schwarz?c=6470

Koolance:

http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2

Aquatuning has a copper w/ smoked plexi:

http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/16581/aquacomputer-kryographics-vesuvius-fuer-radeon-r9-295x2-black-edition?c=6470

If you search a bit more, you can find some other shops that might have AC, or EK, etc.

Btw, ebay has a 295x2 w/ AC clear nickel block. You can always message the seller regarding splitting them or buy the whole thing and sell the card you won't need (or keep it for quads!).


----------



## sub0seals

Wermad, thanks for the help, very much appreciated!


----------



## wermad

here's one at hardforum:

http://hardforum.com/showthread.php?t=1871056&highlight=295x2


----------



## fat4l

So this is what I'm experiencing:
Flickering with crossfire + freesync enabled.
This is with 16.1.1 drivers.
For example, when I run World of warcraft, then alt-tab into windows, then run skype, then the desktop starts flickering, especially when I try to move the skype window.

I also see usage and frequency fluctuation in games, mostly older ones (DX9, DX10)...
Here are two vids showing the fluctuations. Both clock and usage.
#1 World of Warcraft 3.3.5 - 



#2 Crysis 1 - 




System spec-
Intel i7 4790K @5.1G
Asus M7 Hero
Corsair Dominator Platinum 4*4GB 2666CL10
Asus Ares III (2*290X)
Superflower Leadex 1200W Platinum
BenQ XL2730Z @1440p/144Hz

AMD Drivers: 16.1.1 Crimson, Windows 10 Pro 64 bit.


----------



## AvengerNoonZz

Would a EVGA 750W G2 PSU be cutting it close / work fine with a 295x2 and the following specs below?

ASROCK Z68-PRO3 Motherboard
[email protected] + Corsair H60 CPU Cooler
8GB DDR3 RAM
120GB SSD + 4TB HDD
Phantek Enthoo Pro Case (with 2 stock fans installed)
24" 1080p 144hz freesync monitor


----------



## jg900ss

I would refer to the valuable PSU spreadsheet listing available earlier in this TOPIC.

I tried the 295x2 with a 5930K, MSI X99S MPower board, 32GB G.Skill 2666 DDR4, 4 drives, 4 chassis fans, and a BeQuiet Dark Rock TF cooler. The PSU was the Corsair AX860i. It was not able to handle it: regular crashes as soon as things started to cook in games or testing. I had tried two MSI R9 290s in Crossfire before that and THEY ALSO DID NOT WORK with only the AX860i PSU. I replaced the 295x2 with a Sapphire Fury Tri-X OC, and moved the 295x2 into a new build with an EVGA 1600 G2, FX8350, ASRock 990FX Fatal1ty with M.2, and 32GB DDR3 RAM. I believe anything under 1000w is cutting it close with this card, and special attention needs to be focused on the 12v rail, to ensure it's a SINGLE rail and has the correct OCP protection. I had attempted to power the card with a Silverstone 1200w unit, but it turned out its 12v rail had been segmented by 4 OCPs into effectively 4 rails with a maximum of 28 amps per OCP.

Check the spreadsheet (I believe it starts on page 490 of this topic) and validate your PSU choice against that. It certainly helped me.

Good luck!!


----------



## wermad

Looking at the specs, 62 amps, it's a little too close for me. AMD recommends 50 amps for the 295x2, and you would want at least 20 amps for the rest of your system.

edit: An 850w gold/platinum would be the minimum imho; just don't dial up the clocks like crazy. At 1000w+, you should be good. Just make sure you read the rail specs. The list I made in the OP covers the majority of units out there.
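To put rough numbers on the rail math above (just an illustrative back-of-the-envelope sketch; the 62 A figure is the G2's single-rail rating mentioned in this post, and the 50 A + 20 A split is the recommendation quoted here):

```python
# Rough 12V-rail budget check, using the figures from the post above:
# ~50 A recommended for the 295x2, plus at least ~20 A for the rest of the rig.
def rail_headroom(rail_amps, card_amps=50, system_amps=20):
    """Spare amps left on the 12V rail; negative means it's undersized."""
    return rail_amps - (card_amps + system_amps)

# EVGA 750W G2: single 12V rail rated at 62 A
print(rail_headroom(62))  # -8 -> about 8 A short of a comfortable budget
```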


----------



## cmoney408

Quote:


> Originally Posted by *AvengerNoonZz*
> 
> Would a EVGA 750W G2 PSU be cutting it close / work fine with a 295x2 and the following specs below?
> 
> ASROCK Z68-PRO3 Motherboard
> [email protected] + Corsair H60 CPU Cooler
> 8GB DDR3 RAM
> 120GB SSD + 4TB HDD
> Phantek Enthoo Pro Case (with 2 stock fans installed)
> 24" 1080p 144hz freesync monitor





http://imgur.com/IFUaNiH


Mine pulls 790w in and 720 out. I think worst case, you remove any OC and you are OK.

Asus Z87 deluxe
4770k @ 4.3Ghz - h100i with 2 noctua fans
295x2 with noctua fan (OC +8 GPU, 50%+ Power, 1600Mhz)
16gb ram
500gb 850 evo ssd
240gb 830 ssd
2tb hdd
1tb external hdd
3 noctua case fans
hx850i (platinum rated)
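For what it's worth, those wall numbers check out against the HX850i's Platinum rating (a quick sanity calc, nothing official):

```python
# 790 W drawn from the wall vs ~720 W delivered to the components,
# per the measurements quoted above.
ac_in, dc_out = 790, 720
efficiency = dc_out / ac_in
print(f"{efficiency:.1%}")  # 91.1% -> in line with an 80 Plus Platinum unit
```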


----------



## AvengerNoonZz

Thanks all.

I found a used Thermaltake 1200W PSU (W0133RA) which is promised to be in excellent condition and comes with all cables for about $170 AUD / $122 USD so I might get that. Fingers crossed.


----------



## wermad

Quote:


> Originally Posted by *AvengerNoonZz*
> 
> Thanks all.
> 
> I found a used Thermaltake 1200W PSU (W0133RA) which is promised to be in excellent condition and comes with all cables for about $170 AUD / $122 USD so I might get that. Fingers crossed.


Make sure you find out how each connector/plug is assigned to the two rails:

[email protected], [email protected], *[email protected], [email protected]*, [email protected], [email protected]

As long as your 295x2 is going through the second rail, you're perfectly fine. Usually, the manufacturer provides a wiring diagram to tell you which pin/wire is assigned to what rail.

Here's the lepa g1600 as an example:


----------



## AvengerNoonZz

What do you guys think of the Cooler Master Silent Pro M2 1000 W Power Supply? It is able to supply 80A on a single 12V rail, so about 50A for the GPU and 30A for the rest of my components?


----------



## wermad

That's fine, though I have a friend who had issues with that one, and even the RMAs were bad. If you can, look for a CM V1000 or V1200. Those use better platforms and are fully modular. I had two V1000s and they're great.


----------



## BigCatRoach

I know personally, on the PSU side, I wasn't able to run with just a 750, but my system is a little more power hungry than yours @AvengerNoonZz, so 1000+ is definitely the way to go.
On another note, but still relevant given the talk of PSUs: I just got my second 295x2 and right now, for testing, I'm using three PSUs. Here is what I'm pulling from the wall.
First is idle, second is Fire Strike Extreme, and third is Fire Strike with an OC on the cards (no voltage increase). So for me it looks like I'll be getting a 1600.


----------



## wermad

I should daisy chain my three


----------



## BigCatRoach

Mine aren't even daisy-chained, just jumped green to a black; then one is for each GPU and the third is for the mobo.


----------



## Mega Man

I am sorry, what was the general consensus on why the 295x2 throttles at 75 and not 95 like normal 290Xs? And could you please respond in the quoted thread... thanks.
Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> No the devil 13 is NOT a 295x2.
> 
> It is just 2 290x cores on one pcb, and they are not the same; the devil 13 was also released earlier than the 295x2
> 
> If power color made them throttle so low that is due to them.
> 
> If you don't believe me about the 295 throttling to protect the pump, come ask in the 295 thread.
> I know, I have quad 290x and 2 295s
> 
> 
> 
> Okay first thing is first. A 295x2 is AMDs name for a dual 290x card, anycard that has dual 290x gpus follows AMDs designs plans of a 295x2. 295x2 is the reference cards name nothing more, they can call it whatever they want it is still a 295x2.
> 
> "and they are not the same the devil 13 was also released earlier then a 295x2 " No it wasn't, nor could it be GPU partners dont just start making cards, if AMD doesn't make a Dual GPU reference design then board partners do not make one period. Same thing with Nvidia, Amd and Nvidia make the cards those are your reference models, then they allow certain changes to be made from board partners. I do not know where you are getting this but its incorrect. Below is the dates for you.
> 
> 295x2 - April 21, 2014
> Devil 13 - around June 15,14
> 
> "If power color made them throttle so low that is due to them. " You are right I suppose they could have edited in there custom bios, however they did not nor did Asus AFAIK.
> 
> "If you don't believe me about the 295 throttling to protect the pump," again I dont need to ask anyone, what you are saying doesn't pan out for anyone that knows anything about liquid cooling. If your GPUs are at 75c your water is more like 30c a pump can operate upwards of 50c your water is not hitting anywhere near 50c.
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> No, my cards are watercooled. uplugging it is like having it out anyways and just having one. you don't have to pull it out. you prolly don't have to unplug. i just do it.
> 
> 1. disable crossfire
> 2 shutdown and unplug power to second card
> 3. boot uninstall old driver with 12 or 13 driver (older drivers have uninstall feature)
> 4. Reboot and install new driver
> 5. Shutdown and plug second card
> 6. Boot and crossfire should set itself. if not, then set it yourself.
> 
> These steps i do to both my amd and intel. but, like i said before, i normally just install the new over the old after steps 1 & 2. i followed the steps above for crimson and omega.
> 
> 
> Umm I am 90% sure a gpu will still get power and still be recognized with just the 6 pins removed. As a mater of fact it would have to be or your idea would not work. As we both pointed out to you already it doesnt work that way.
> 
> If you have 1 card installed and try to add a second without reinstalling drivers it will not work, SLI does but CFX does not. Both cards have to be installed at the time of the driver install to unlock CF ability's. I ran into this when I got my second 290x as I did not know this and for days couldn't get CFX to work, I asked here and TSM told me that.
> 
> I do agree about the removing Water Cooled cards thing sucks though I love that my RIVBE can shut off PCIE lanes with switches best feature ever!.
> 
> 
> omg
> 
> so hard, now i remember why i blocked you
> 
> http://www.powercolor.com/us/products_features.asp?id=549
> Quote:
> 
> 
> 
> devil 13 dual core 290x
> 
> 
> you can find it under products and searching for 290x
> http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+290X&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search
> 
> under 295x2s you cant
> http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+295X2&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search
> 
> crap in ones can not be compared to WATER COOLING
> 
> crap in ones water, if the cores on the 295x2s could reach 95 like normal 290/290x then the water could SATURATE and get to 75, ( seriously have you seen how crappy asecrap is ?) again
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...
> 
> I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.
> 
> Click to expand...
> 
> Here ya go: http://www.overclock.net/t/1540259/r9-295x2-rad-fan-shootout/0_50
> 
> The SP120's do quite well, i'm running a couple of Noctua iPPC NF-F12's on mine and the Cougar's should perform well too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, also is there a way to disable the red LED?
> 
> 
> Yes!
> 
> Under the shroud you can unplug the LED's power cable, taking off the shroud is quite easy and will not void your warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The LED has its own power cable? +1 AMD.
> It really annoys me launching and having to install the bloatware that is known as GF Experience JUST to turn off the LED on my GTX 690.
> 
> Coming from Nvidia's "best" dual GPU to AMD's best. I really am going to be interested to see the difference
> 
> 
> 
> 
> 
> 
> 
> 
> One thing I love that almost no reviews seem to mention is the customizability of the cooling - simply change out the fans for different ones.
> 
> I heard/read that the 295X2 has a thermal throttle of 75'c - whereas the 290Xs have a higher throttle (85'c?). Seeing as how the R9 295X2 should have binned 290X cores - surely the thermal throttle is just for the AIO cooler / heat / power draw. Is it safe going beyond it (with a BIOS flash on the one BIOS to disable it)?
> 
> Purely for benchmarking, I will be running this thing stock 24/7 otherwise
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is the cable:
> 
> 
> 
> And yeah, the 295x2 throttles at 75c due to the amount of heat the pump can withstand afaik, the 290x has a throttle point of 95c.
> 
> i have the Sapphire OC Bios flashed onto my card and it runs quite well, haven't tried any hard benching but with the stock Bios i managed 1150/1625 with +100mV in Afterburner.
> 
> With the Sapphire Bios i can go up to +300mV in Sapphire Trixx.....waiting for cooler weather though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> since you wont ask
> 
> @wermad @Sgt Bilko and anyone else
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
I am sorry, what was the general consensus on why the 295x2 throttled at 75 and not 95 like normal 290Xs? Could you please respond in the quoted thread, ... thanks
> 


----------



## xer0h0ur

It's not a consensus. It's a fact. The Asetek AIO coolers' pumps were not designed to handle the high water temps that going above 75°C would cause.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> It's not a consensus. It's a fact. The Asetek AIO coolers' pumps were not designed to handle the high water temps that going above 75°C would cause.


^ Bingo


----------



## Mega Man

Can you please respond in the other thread? Seems I am just a crazy wacko. You can just click the little text balloon with the green arrow by my name in that quote


----------



## wermad

I guess I found a solution:



Too bad for the other members who are being misled...They may want to get a 295x2 or a 290X x2 in the future....?


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> I guess I found a solution:
> 
> Too bad for the other members who are being misled...They may want to get a 295x2 or a 290X x2 in the future....?


My brain hurts now after all that


----------



## wermad

Quote:


> Originally Posted by *BigCatRoach*
> 
> Mine aren't even daisy-chained; I just jumped a green and a black. Then one is for each GPU, and the third is the mobo.


Oops, sorry, missed your post among the fun going on right now.

I think you do have to daisy-chain them in order for the mb to send the power-on signal (via the green and black lines); otherwise, it may power on the cards separately and the mb may not detect a signal. I have a "jumper" cable, which is probably the most common approach:

I used this cable to run quad 7970 Lightning BEs oc'd for the fanboy competition circa 2014. Used two V1000s in tandem. This time, I got a sweet deal on three (white!) NZXT Hale90s and I'll be chaining two of them to run my setup; the 1kw will power the first card and the mb/fans/pumps, and the first 750w is only powering on the second 295x2. The second 750w will power on the second system in my TX10-D, which for now is just doing general and school stuff for my little ones and the boss (aka the wife). If my little ones show more interest in pc gaming, I'll throw a new gpu in there for them to start gaming on. My rig....is mine....mine! My precioussssssssssssssssssssss......









Quote:


> Originally Posted by *Sgt Bilko*
> 
> My brain hurts now after all that


----------



## sub0seals

Is this a good overall card? I have one sitting in the closet; I'm still trying to get the parts needed for the build it's going in. So far I have the ASUS Rampage IV BE mobo and an EVGA NEX Classified 1500 PSU. Do any of you have this PSU? Thanks again for any help, guys, or anything about the card.


----------



## Mega Man

I love my cards.

As to the PSU, it is a meh unit: not bad, but far from excellent.


----------



## sub0seals

So what do I get if it's not a good one? I thought EVGA was supposed to make good ****???


----------



## Mega Man

First, EVGA does not make any PSUs; they buy them and slap a label on them.
You need to look at OEMs and reviews.

If you are interested, you can start here:
http://www.overclock.net/t/715889/phaedrus-psu-articles#post9110838

The articles and myths there are great places to learn.

Also, Jonny Guru is your friend:
http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story7&reid=311
The NEX is meh; the G2s are great, and the GS too, and I am sure I am missing at least one other newer model.
http://www.overclock.net/t/1395708/evga-power-supplies-information-thread/0_100
http://www.overclock.net/t/1541939/evga-supernova-lineup-explained/0_100#post_23556598

As for recommendations, it depends on what you have locally, as you are in my area (however, I am in China atm):
the new Seasonic is good - Seasonic Prime
Super Flower Leadex
EVGA G2, GS, T2
Seasonic X series and any based on them

http://www.overclock.net/t/1431929/psu-index-thread


----------



## wermad

I'll have to agree; I hear lots of not-so-good opinions on the "NEX" units, especially the 1500w. Get a gold or platinum 1kw or 1.3kw (these are single rail, btw).


----------



## sub0seals

I'm still trying to learn a lot, that's why I come here. EVGA seems so widely used, so I just assumed it was a good product. Who do you guys recommend, since EVGA is no good?


----------



## sub0seals

I have a be quiet! Dark Power Pro 10 850-watt 80 Plus Platinum PSU as well. They both are dual-rail PSUs though, which I thought was supposed to be better for something that consumes so much power, like the XFX R9 295X2 Hydro I bought. The EVGA 1500 is 80 Plus Gold. Thanks again guys.


----------



## wermad

I would skip the "NEX" EVGA units. All other EVGA gold and platinum (and titanium) units have been very good. If I had the space, I would be running an EVGA 1600w or an EVGA 1000+850. I'm going w/ Cooler Master when I get the chance, since my current NZXT units are a bit too long. The V750 is ~140mm long, and I can squeeze in my old favorite V1000.


----------



## Mega Man

My post above has good recommendations available in the CO area. Otherwise look it up on JonnyGuru.

Please never pick a PSU based on its 80 Plus Gold rating.

Also, I re-edited my above post; one of the links I included disappeared, so I re-added it.

http://www.overclock.net/t/715889/phaedrus-psu-articles#post9110838 - please read at minimum the "On Efficiency" article.

However, if you want to learn how to judge PSUs, read all the myths and all the articles in that link.
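To put the efficiency point in concrete terms: an 80 Plus tier describes AC-to-DC conversion efficiency at set load points, so the power pulled from the wall is the DC load divided by the efficiency at that load. A minimal sketch - the 700W load and 0.90 efficiency below are illustrative assumptions, not measurements of any particular unit:

```python
def wall_draw_watts(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the wall for a given DC load.

    efficiency is the PSU's conversion efficiency at that load
    point (e.g. roughly 0.87-0.90 near 50% load for an 80 Plus
    Gold unit).
    """
    if not 0 < efficiency <= 1:
        raise ValueError("efficiency must be in (0, 1]")
    return dc_load_w / efficiency

# Hypothetical example: a 295x2 system drawing ~700W DC from a
# Gold-rated unit pulls roughly 700 / 0.90 ≈ 778W at the wall;
# the ~78W difference is dissipated as heat inside the PSU.
print(round(wall_draw_watts(700, 0.90)))  # -> 778
```

This is also why a PSU that sits near the middle of its rated load tends to run cooler and quieter than one running close to its limit.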


----------



## BigCatRoach

Quote:


> Originally Posted by *wermad*
> 
> Oops, sorry, missed your post among the fun going on right now.
> 
> I think you do have to daisy-chain them in order for the mb to send the power-on signal (via the green and black lines); otherwise, it may power on the cards separately and the mb may not detect a signal. I have a "jumper" cable, which is probably the most common approach:
> 
> I used this cable to run quad 7970 Lightning BEs oc'd for the fanboy competition circa 2014. Used two V1000s in tandem. This time, I got a sweet deal on three (white!) NZXT Hale90s and I'll be chaining two of them to run my setup; the 1kw will power the first card and the mb/fans/pumps, and the first 750w is only powering on the second 295x2. The second 750w will power on the second system in my TX10-D, which for now is just doing general and school stuff for my little ones and the boss (aka the wife). If my little ones show more interest in pc gaming, I'll throw a new gpu in there for them to start gaming on. My rig....is mine....mine! My precioussssssssssssssssssssss......


Since mine is temporary, I just use a low-AWG wire to jump it and turn the PSUs on before switching on the machine.


----------



## AvengerNoonZz

Ok, the 295x2 is here, but I don't have a PSU yet. Has anyone come across the Cooler Master Real Power M1000W power supply? I can't really find any recent information on it; I am guessing it is quite an old PSU.


----------



## wermad

Single-rail bronze; it should still work imho. I saw some reviews of it from '09 when I did a quick Google search.


----------



## AvengerNoonZz

Quote:


> Originally Posted by *wermad*
> 
> Single-rail bronze; it should still work imho. I saw some reviews of it from '09 when I did a quick Google search.


Literally just got an EVGA 1000W G2 PSU for $160 AUD shipped (brand new is $280 AUD or so here).


----------



## wermad

Kewl


----------



## AvengerNoonZz

Is disabling ULPS still recommended? I plan on installing the R9 295x2 with the latest Crimson drivers tomorrow, if that counts for anything.


----------



## sub0seals

I really appreciate all the helpful info from you and wermad. Since I have these two PSUs already, which would be best for the card I have (Radeon R9 295X2 Hydro version)? The be quiet! Dark Power Pro 10 850-watt 80 Plus Platinum or the 1500-watt 80 Plus Gold NEX Classified? Thanks again guys.


----------



## Sgt Bilko

Quote:


> Originally Posted by *AvengerNoonZz*
> 
> Is disabling ULPS still recommended? I plan on installing the R9 295x2 with the latest crimson drivers tomorrow if that counts for anything.


Yeah, disabling ULPS is recommended, as things run a bit smoother with it off and the small amount of extra heat is a worthy trade-off
Quote:


> Originally Posted by *sub0seals*
> 
> I really appreciate all the helpful info from you and wermad. Since i have these 2 PSU already which would be best for the card i have(Radeon R9 295X2 Hydro Version) ? the Be Quiet Dark Power Pro 10 850 watt Plus Platinum or the 1500 watt 80 Plus Gold NEX Classified? thanks again guys.


I'd use the 1500w NEX, but either of them would be fine


----------



## Mega Man

Quote:


> Originally Posted by *sub0seals*
> 
> I really appreciate all the helpful info from you and wermad. Since i have these 2 PSU already which would be best for the card i have(Radeon R9 295X2 Hydro Version) ? the Be Quiet Dark Power Pro 10 850 watt Plus Platinum or the 1500 watt 80 Plus Gold NEX Classified? thanks again guys.


http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story5&reid=296

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story7&reid=311

Imo go with the be quiet!; its power quality is better.


----------



## sub0seals

I read the link, thanks a lot Mega Man. You are a very helpful person.


----------



## sub0seals

One last question: he says in his overview that the Dark Power Pro 10 just missed the 80 Plus Platinum rating, so why do they rate it 80 Plus Platinum on the box?


----------



## Mega Man

If you read his reviews a lot (just an FYI, I know you don't), he allows a ~1% error rate, as his equipment isn't the $10k equipment. Also, the PSU was not terribly off, so he may have just got a bad unit. He has done this for so long that he knows if he got a one-off or if the unit was bad; he is, imo, the best in the business.


----------



## AvengerNoonZz

Does anyone have a copy of the ULPS config tool? The front-page link doesn't seem to work.

Edit: Damn, the performance is great when games scale properly, but I'm bummed out that GTA V runs better when crossfire is disabled :'(


----------



## TheScrib

What version of my 295x2 is this? Why does it have a crossfire bridge, and why is the front plate completely different from the majority, if not all, of yours?


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheScrib*
> 
> What version of my 295x2 is this? Why does it have a crossfire bridge and also why is the front plate completely different than the majority if not all of yours?
> 


That's an OEM card (AMD Radeon) and, by the looks of it, an engineering sample. Where did you get it, by chance?


----------



## TheScrib

I cannot answer that because I don't know; it was given to me. If it is indeed an engineering sample, is it possible that it can be bridged with the 7xxx series, such as a 7990? I know they have completely different architectures and the R9s use the XDMA engine, but what could that bridge be used for? Just makes you wonder.


----------



## Mega Man

The original 290s still had them as well. I think it was left over from the 79xx and may have been used, but they then scrapped it as they got the "other" cfx working without bridges - that is 100% pure speculation on my end, though.

No, you can't connect them to any other series and have it work.


----------



## xer0h0ur

Quote:


> Originally Posted by *AvengerNoonZz*
> 
> Does anyone have a copy of the ULPS config tool? Front page links doesn't seem to work.
> 
> Edit: Damn, the performance is great when games scale properly but I'm bummed out that GTA V runs better when crossfire is disabled :'(


Just use RadeonMod. It's one of the best registry tweak tools for AMD cards.
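For reference, the registry value these tweak tools flip is `EnableUlps` under the AMD display-adapter class subkeys. A hand-rolled .reg fragment might look like the sketch below - note the `0000` subkey index is an assumption; the actual index varies per system, and there is one subkey per GPU:

```
Windows Registry Editor Version 5.00

; Disable ULPS for the adapter at subkey 0000 (repeat for each
; AMD display-adapter subkey: 0001, 0002, ...)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like RadeonMod just locate every matching subkey and write this value for you, which is why they are the safer option.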


----------



## AvengerNoonZz

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just use RadeonMod. Its one of the best registry tweak tools for AMD cards.


Thanks!

Is this thread okay for talking about performance results? A lot of games that have AMD crossfire profiles slow down a lot.

E.g. I've been reading that GTA V and Shadow of Mordor work well, but in the GTA benchmarks I can see slowdowns/fps drops, and when turning the camera in Shadow of Mordor it also slows down (slow motion, drops to 20fps, then goes back up to 100fps).

Anyway, what driver is everyone using with these cards?


----------



## axiumone

Quote:


> Originally Posted by *TheScrib*
> 
> What version of my 295x2 is this? Why does it have a crossfire bridge and also why is the front plate completely different than the majority if not all of yours?


Woah! Now that's interesting. Definitely a pre-production sample; the shroud design is different from production. That doesn't look like a crossfire connection - if I had to guess, it's some kind of debug/diagnostics connection.

I don't think you'll be able to crossfire it with 7000 series cards.


----------



## Siman

I got two Sapphire OC cards and I can't get them to clock up at all; they stay in the 300MHz area. Anyone else having problems?


----------



## xarot

I fired up my Ares III again using my Asus PG279Q, and I was pleasantly surprised when I noticed that 144 Hz works without any hiccups (G-Sync screen). I am not using the cable included with the monitor though, so maybe that's why. With my older PG278Q I could never get 144 Hz to work, only 120 max.

On the other hand, what's the deal with the new Crimson drivers, really? Almost nothing seems to work properly with them. I reverted to 15.11 and went from almost non-existent CF scaling to where I remember it being.


----------



## wermad

Isn't the 8 FreeSync and the 9 G-Sync?


----------



## Siman

Anyone know how to fix the idle clock issue? I can't get them to clock up...


----------



## ColeriaX

Quote:


> Originally Posted by *Siman*
> 
> Anyone know how to fix the idle clock issue? I cant get them to clock up...


Disable ULPS already? Change the ULPS registry entries? Are you using a 3rd-party program for overclocking? I'm pretty sure disabling PowerPlay in Afterburner will do the trick as well. On another note, I sold my 2x 295s...feeling pretty sad there's nothing available (at least until 2H) that I want to replace them with. People who sold their 295s, what did you end up with?


----------



## Siman

Quote:


> Originally Posted by *ColeriaX*
> 
> Disable ULPS already? Change ULPS registry entries? Are you using a 3rd party program for overclocking? I'm pretty sure disabling powerplay in afterburner will do the trick as well. On another note, I sold my 2x 295s...feeling pretty sad theres nothing available (at least until 2H) that I want to replace them with. People who sold their 295s, what did you end up with?


Yeah, I disabled ULPS, reinstalled Windows, and disabled PowerPlay.

I used ClockBlocker and got my master GPUs to clock up, but the slave GPUs stay idle... After I restarted and ClockBlocker was off, the master GPUs clock fine... but the slave GPUs don't clock at all...


----------



## xarot

Quote:


> Originally Posted by *wermad*
> 
> Isn't 8 freesync and 9 gsync?


No, the PG278Q is G-Sync and the MG279Q is FreeSync. The PG279Q is G-Sync.

Quote:


> Originally Posted by *ColeriaX*
> 
> On another note, I sold my 2x 295s...feeling pretty sad theres nothing available (at least until 2H) that I want to replace them with. People who sold their 295s, what did you end up with?


I went to Titan X SLI after the 295X2 and haven't much looked back. I still have my Ares III 290X2 too, though, and currently swapped it in to see how it runs. Well, it does very well.


----------



## wermad

Gotcha

I was close to getting the MG 9 (IPS) but went with a Dell 4k.


----------



## Mega Man

Friends don't let friends buy Dell

I love one-liners; not really directed at you, but you think they are, hehehehe


----------



## wermad

not sure if just trolling...


----------



## Mega Man

Just messing with you, wermad!


----------



## AvengerNoonZz

Is my [email protected] bottlenecking games? To keep it short -

The Witcher 3 - runs perfectly in crossfire
CS:GO - runs great
GTA V - runs well, but sometimes it drops down to 20fps and goes back up to 90fps
Shadow of Mordor - does the same as GTA V, but it happens frequently
Bioshock Infinite - does the same as GTA V, but it happens frequently

I have even tried it on a clean Windows install, but the problems are still there. The best I can describe the problem is that the affected games have slowdowns where the game suddenly drops to 40fps and goes back up to 100fps, etc. It happens quite a lot when I turn the camera in Shadow of Mordor, but it can happen anywhere in the other affected games.
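One way to put numbers on dips like these is to log frame times (Afterburner and similar tools can dump them) and look at percentile lows rather than the average, since the average hides short hitches. A small sketch, with made-up frame-time data standing in for a real log:

```python
def percentile_low_fps(frame_times_ms, pct=1.0):
    """Return the 'pct% low' fps: the fps corresponding to the
    frame time at the (100 - pct)th percentile, i.e. the frame
    that only the slowest pct% of frames are slower than."""
    times = sorted(frame_times_ms)
    idx = min(len(times) - 1, int(len(times) * (100 - pct) / 100))
    return 1000.0 / times[idx]

# Hypothetical log: mostly ~10ms frames (100fps) with a few 50ms
# hitches (20fps), like the drops described above.
log = [10.0] * 96 + [50.0] * 4
avg_fps = 1000.0 / (sum(log) / len(log))
print(round(avg_fps))                     # -> 86, average hides the hitches
print(round(percentile_low_fps(log, 1)))  # -> 20, the 1% low exposes them
```

An average of ~86fps looks fine, but the 1% low of 20fps matches exactly the kind of stutter being described.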


----------



## Mega Man

I can bottleneck any and all CPUs.

The best advice I can give you is to upgrade if you want.

Iirc Shadow of Mordor and GTA V are Nvidia GameWorks titles, and AMD has a hard time optimizing drivers for them. Try manually disabling cfx; iirc this helps with some.


----------



## wermad

Alex runs a 2500k with his 295x2. I think he has it at ~4.8. This cpu is still very capable, even with a mild to medium oc.


----------



## AvengerNoonZz

Thanks. Yeah I do not want to give up this GPU just yet, it has a lot of potential.


----------



## Mega Man

Like I said, I can bottleneck any cpu; just start running a higher res and you will be fine.

I mean, don't try to play at such a low res (not that you are) that the cpu is such an issue. People like to throw the term bottleneck around; it is a useless term, something bottlenecks everyone.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Gotcha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was close in getting the MG 9 (ips) but went with a dell 4k.


the 8 means TN and the 9 indicates IPS (for most of Asus' line-up but not all)


----------



## wermad

Quote:


> Originally Posted by *AvengerNoonZz*
> 
> Thanks. Yeah I do not want to give up this GPU just yet, it has a lot of potential.


It's still a very solid choice, and I would squeeze as much as you can out of it before jumping ship. I've seen a few comparisons w/ newer chips and it's still there, but you can see it's trailing sometimes at lower resolutions (ie 1080p or below). If you plan to get a 2k, 4k, or eyefinity setup, just give the cpu a good oc, though your 295x2 will do the bulk of the work. If you plan on staying w/ a 1080p screen, grab a Haswell or Skylake i5 for ~$150-200. I sold my 4690K for ~$145 and bought a new 6600k for $210.

Btw, AMD does better at resolutions higher than 1080p, and I would recommend Nvidia instead if you plan to stay at this resolution. Your 295x2 can drive a 2x3 2560x1440 setup, btw

Quote:


> Originally Posted by *Sgt Bilko*
> 
> the 8 means TN and the 9 indicates IPS (for most of Asus' line-up but not all)


Yeah, I totally had a memory lapse there. It must be this killer week of intense work on my rig. I'm pretty close to finishing it, just the final touches. It's been hell tbh and I found myself on the edge of tears many, many times. At least the crucial and delayed part (my loop!) came together beautifully and I'm very pleased with the results. Here's a shot of some leak-testing. I got some Bitfenix extensions added and both PSUs are powering on my rig.


----------



## xarot

Quote:


> Originally Posted by *AvengerNoonZz*
> 
> The Witcher 3 - runs perfectly in crossfire


You don't get any stuttering? I am getting massive stuttering in that game, but it's on the borderline of acceptable. GPU usage goes up and down all the time.

Also, has anyone tried Rise of the Tomb Raider? I am getting massive stuttering. Metro: Last Light seems to stutter quite a bit too, but that might be because the game is really heavy on the GPU when maxed out. Just would like to share experiences. I am on the latest Crimson drivers with ULPS disabled.

I currently have a Titan Z and an Ares III in my rig and am testing them one after the other. Seems the Ares III is a tad faster, but has a lot more stuttering too. And I'm using G-Sync on the Z...


----------



## remedy1978

Quote:


> Originally Posted by *xarot*
> 
> You don't get any stuttering? I am getting massive stuttering in that game but it's in the borderline of acceptable. GPU usage goes up and down all the time.
> 
> Also anyone tried Rise of the Tomb Raider, I am getting massive stuttering? Also Metro: Last Light seems to stutter quite a bit too, but that might be because the game is really heavy on the GPU when maxed out. Just would like to share experiences. I am on latest Crimson drivers and ULPS disabled.
> 
> I am currently having Titan Z and Ares III in my rig and testing them one after the other. Seems Ares III is a tad faster, but has a lot more stuttering too. And I'm using G-Sync on the Z...


Massive stuttering for me as well in Rise of the Tomb Raider. Can't comment on Witcher 3. Will try Metro: Last Light. Unfortunately, I am going to throw in the towel on this GPU. Love the look, but the stuttering, lack of proper crossfire drivers, inability to disable crossfire in certain titles, and clock fluctuations with FreeSync are driving me mad.


----------



## Sgt Bilko

Quote:


> Originally Posted by *remedy1978*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xarot*
> 
> You don't get any stuttering? I am getting massive stuttering in that game but it's in the borderline of acceptable. GPU usage goes up and down all the time.
> 
> Also anyone tried Rise of the Tomb Raider, I am getting massive stuttering? Also Metro: Last Light seems to stutter quite a bit too, but that might be because the game is really heavy on the GPU when maxed out. Just would like to share experiences. I am on latest Crimson drivers and ULPS disabled.
> 
> I am currently having Titan Z and Ares III in my rig and testing them one after the other. Seems Ares III is a tad faster, but has a lot more stuttering too. And I'm using G-Sync on the Z...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Massive stuttering for me as well in Rise of the Tomb Raider. Can't comment on Witcher 3. Will try Metro Last Light. Unfortunately, I am going to throw in the towel on this GPU. Love the look, but the stuttering, lack of proper crossfire drivers, inability to disable crossfire on certain titles, and clock fluctuations with Freesync are driving me mad.

Witcher 3 and Metro:LL are pretty solid; haven't tried ROTTR as yet.

I've only experienced stuttering in 1-2 games. The disabling-Crossfire thing isn't new - you've not been able to do that on any Crimson driver afaik - and I've never experienced any clock speed fluctuations with FreeSync unless I've had the fps capped.


----------



## remedy1978

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Witcher 3 and Metro:LL are pretty solid, haven't tried ROTR as yet.
> 
> I've only experienced stuttering on 1-2 games, the disabling Crossfire thing isn't new, you've not been able to do that on any Crimson driver afaik and I've never experienced any clock speed fluctuations with FreeSync unless I've have the fps capped.


Are you running Crossfire? There is an issue with Crossfire and FreeSync: if FreeSync is enabled and you are running Crossfire, the clock rates of the card will fluctuate unless you use ClockBlocker.


----------



## Sgt Bilko

Quote:


> Originally Posted by *remedy1978*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Witcher 3 and Metro:LL are pretty solid, haven't tried ROTR as yet.
> 
> I've only experienced stuttering on 1-2 games, the disabling Crossfire thing isn't new, you've not been able to do that on any Crimson driver afaik and I've never experienced any clock speed fluctuations with FreeSync unless I've have the fps capped.
> 
> 
> 
> Are you running Crossfire? There is an issue with Crossfire and Freesync. If Freesync is enabled and you are running Crossfire, the clock rates of the card will fluctuate unless you use Clock Blocker.

I was using this card and FreeSync in Nov-Dec and it wasn't an issue then, so I'm assuming this is a pretty recent problem?

And I've never had to use ClockBlocker; Afterburner has had the same ability for a very long time, you just disable PowerPlay if you need to lock in 3D clock speeds.


----------



## xarot

Quote:


> Originally Posted by *remedy1978*
> 
> Massive stuttering for me as well in Rise of the Tomb Raider. Can't comment on Witcher 3. Will try Metro Last Light. Unfortunately, I am going to throw in the towel on this GPU. Love the look, but the stuttering, lack of proper crossfire drivers, inability to disable crossfire on certain titles, and clock fluctuations with Freesync are driving me mad.


I think I 'fixed' it; my guess is that 4 GB of VRAM per GPU falls short at 2560x1440. ROTTR easily uses around 4.5-6 GB of VRAM. On the other hand, it should not cause many issues per the PCPer article, but maybe on these cards it does. So what I did was simply drop to High settings, and it's all good now. Some info here: http://steamcommunity.com/app/391220/discussions/0/451852225134000777/

_"Also note that textures at Very High requires over 4GB of VRAM, and using this on cards with 4GB or less can cause extreme stuttering during gameplay or cinematics."_

Edit: Didn't realise VRAM size had become an issue, as I've used Titan Xs for almost a year, so no VRAM bottlenecking there.

Also, after I could properly disable ULPS on the Crimson drivers, Metro: Last Light is now butter smooth.

Only Witcher 3 is still causing issues; it's really a shame they didn't or couldn't fix that game.


----------



## spyshagg

You guys think your 295x2s are the real deal?!

..
..
..

Then it's time!!!!!!




*AMD vs NVIDIA*

> Run three benchmarks, print-screen the score with GPU-Z, CPU-Z and the custom wallpaper

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50

Put your 295's to WORK!
Let's do this, come on!


----------



## cmoney408

If any of you live in San Jose (South Bay Area) and want another 295x2, I am selling mine. I know, I'm sorry. It has been "upgraded" with Fujipoly pads on the VRM and caps, Grizzly Minus 8 pads on the RAM, and Grizzly Kryonaut on the GPUs. Even OC'd, it stays cooler than it did with the old pads/paste at stock.

I only upgraded because of a Dell promo code error, so I ended up getting (2) GTX 980 Tis for a little under $900.


----------



## wermad

Quote:


> Originally Posted by *spyshagg*
> 
> You guys think your 295x2's are the real deal ?!
> 
> ..
> ..
> ..
> 
> Then Its time!!!!!!
> 
> 
> 
> 
> *AMD vs NVIDIA*
> 
> > Run three benchmarks, print screen the score with GPU-Z CPU-Z and the custom wallpaper
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50
> 
> Put your 295's to WORK!
> Lets do this! come on!


I wasn't happy they skipped it last year when I was ready, so I'm not too interested this year tbh. I doubt AMD will pull off the win this year, though good luck to all.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *spyshagg*
> 
> You guys think your 295x2's are the real deal ?!
> 
> ..
> ..
> ..
> 
> Then Its time!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *AMD vs NVIDIA*
> 
> > Run three benchmarks, print screen the score with GPU-Z CPU-Z and the custom wallpaper
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50
> 
> Put your 295's to WORK!
> Lets do this! come on!
> 
> 
> 
> I wasn't happy they skipped it last year when i was ready. So I'm not too interested this year tbh. I doubt Amd will pull off the win this year though good luck to all
> 
> 
> 
> 
> 
> 
> 
> .

^ Ditto that.

AMD won't win it simply because Nvidia does really well in Futuremark benches atm; it's just the way of things.


----------



## pgetueza

Unfortunately, my 295x2 decided to quit on me. I was able to play games normally until early this month. As previously shared in this thread, I had some issues with some games, particularly BFH, which would cause an OCP trip after two minutes of game time, and recently with benchmarking tools as well. Whenever I launch Heaven or Cinebench, I get a BSOD with ATIKMDAG.sys. I also tried reformatting, to no avail. Whenever I install the drivers, the install crashes halfway and yields 'display driver has crashed and recovered'. I can normally install the drivers through Device Manager, but I am unable to launch any D3D app. At this point, I have reason to believe that the card is defective, but what do you guys think? I spoke to some techs and they mentioned it might be the heat, because the card usually runs at 75°C. They also told me that re-balling might fix the issue, but it's a 50-50.


----------



## wermad

Try the backup BIOS? Under load, the card will throttle back your core clocks once it hits 75°C. You can always pull it out of the case and test it with some added cooling. Tbh, installing drivers should not put much load onto one core, so if it does it when you're doing this, it could be a faulty card. Try the backup BIOS and do some external testing if you can.


----------



## pgetueza

I did try the backup BIOS. I have extra hardware - an EVGA Z170 FTW/i3 6100/8GB RAM/RM1000 - and I was able to install the drivers and then ran FurMark. It was stable for 15 minutes, with no issues after I quit. But when I run any Unigine benchmark, be it Valley or Heaven, it crashes right after loading, then restarts. No BSOD for now.


----------



## SLK

I just put my 295x2 under water with my 5820k. I have a D5 pump with 420 and 240 XSPC EX rads, running at about setting 3 with my fans on high. My water temps are hitting 39°C+ at full load; idle water temp is around 24°C. Is this around what everyone else is seeing?


----------



## Mega Man

Mine is cooler, but I have 5x your rad space. Mostly 35-37°C.


----------



## SLK

So you are saying it is normal for the 295x2 to really heat up the water? I ordered a fan controller so I can adjust it when I am gaming.


----------



## xer0h0ur

I mean, at full tilt it's dumping 500W of heat (not including overclocks) into the loop, so yeah, not surprised it's raising the water temp that much.
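As a rough sanity check on why the loop warms up so quickly: water's specific heat is about 4186 J/(kg·K), so you can estimate how fast a given heat load raises the water temperature before the radiators catch up. A back-of-the-envelope sketch - the 1.5L loop volume and 500W load are assumed figures for illustration:

```python
WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K); 1L of water ~ 1kg

def seconds_to_heat(volume_l: float, delta_t_c: float, power_w: float) -> float:
    """Time for power_w of heat to raise volume_l of water by
    delta_t_c degrees, ignoring radiator dissipation (worst case)."""
    return volume_l * WATER_SPECIFIC_HEAT * delta_t_c / power_w

# A ~1.5L loop absorbing 500W with zero dissipation would climb
# the ~15°C reported above (24°C idle to 39°C load) in about:
print(round(seconds_to_heat(1.5, 15, 500)))  # -> 188 seconds, ~3 minutes
```

In reality the radiators dissipate more as the water gets hotter, which is why the loop settles at a steady-state temperature instead of climbing forever - but the small thermal mass is why the climb happens within minutes of starting a game.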


----------



## SLK

OK, thanks. I will add another 3x 140mm Phanteks fans on top for push/pull.


----------



## fat4l

Well, my max water temp is around 29°C with an idle of ~23°C.


----------



## SLK

What are your rads and pumps?


----------



## fat4l

Quote:


> Originally Posted by *SLK*
> 
> What are your rads and pumps?


EK DDC pumps in series with X-tops, 2x 240 EK PE rads in push/pull, and a Mora 3 420 with 8x 200mm fans in push/pull (~3x 420 rads worth).

The pump combo makes a big difference.... I would never run 1 pump again.


----------



## SLK

Thanks, looks like I am tapped out and shouldn't add any more to this loop. I ordered more fans to cool down the rads. Once I dump this GPU, it will be a single GPU again using less power, so my water temps should shave off a good 5°C at full load.


----------



## crislevin

Sorry if this has been asked. I've had this card for almost a year now, and I am wondering if the liquid cooling system needs maintenance from time to time? If so, how often, and what should I do?

Thanks


----------



## wermad

Quote:


> Originally Posted by *crislevin*
> 
> Sorry if this has been asked. I had this card for almost a year now, I am wondering if the liquid cooling system needs maintenence fromm time to time? If so, how often and what should I do?
> 
> Thanks


It's a closed-loop cooler (AIO: all-in-one), so the only maintenance needed is making sure dust doesn't build up on the rad/fan/cooler. A can of compressed air is good enough for that. You'll void your warranty if you actually tap into the liquid loop. You can remove the whole thing for a full-cover water block upgrade, but my recommendation is to keep it intact in case you sell it later on. And don't open up the parts w/ liquid.

If on the odd chance something fails on the cooler, as long as you didn't tamper with it, you should get support, as the CLC is covered by warranty. With proper care, these coolers can last a few years, and by then this card will be slow and power hungry (like many AMD cards of yore).


----------



## crislevin

Much appreciated!


----------



## wermad

Np


----------



## BigCatRoach

Really need some help guys. I got my second card and installed it. I am getting 0% usage in all games and benchmarks on GPUs #2-4; only #1 shows usage. I have disabled ULPS in Afterburner as well as in the registry to be sure. I have also double-checked the drivers with DDU. I've run out of ideas. It was showing usage on both cores when I only had one card installed, and I changed nothing after installing the second, which is when the issue started.


----------



## wermad

Run that card on its own. That's one thing I do; I never just drop it in and hope it works. Remove the first one and see what the new one does by itself. It will narrow things down for you greatly.


----------



## Deathscythes

Hi guys,
Just applied to join the club ^^

Benchmark and pic =) :




Currently tweaking them, both overclocking and bios modding.
Also, if anyone knows ANYWHERE to get the EK backplates, it would be awesome if you could tell me ^^


----------



## Mega Man

just sold my extra sorry :/


----------



## wermad

The stock or another make of backplate should work as well if you get the right screws. I've seen a few XSPC ones for sale. My XFX units came with the stock backplate; did yours come with any, btw?


----------



## kayan

Quote:


> Originally Posted by *Deathscythes*
> 
> Hi guys,
> Just applied to join the club ^^
> 
> Benchmark and pic =) :
> 
> 
> 
> 
> Currently tweaking them, both overclocking and bios modding.
> Also if anyone knows ANYWHERE to get the EK backplates that would be awesome if he could tell me ^^


I've got an EK backplate, never used.


----------



## alessandrorb

Hello guys! I have a Sapphire R9 295X2 4096 MB PCI-E 1018/1250, Bios: 015.044.000.009.004297. Does anyone out there have the same card? Could you please send me the 2 original bioses, Master and Slave, for this model?


----------



## AeroXbird

Quote:


> Originally Posted by *alessandrorb*
> 
> Hello guys! I have a Sapphire R9 295X2 4096 MB PCI-E 1018/1250 Bios: 015.044.000.009.004297. Anyone out there who have the same vga, could you please send me the 2 original bios, Master and Slave of this model?


I've uploaded them for you here: http://www.mediafire.com/download/3m8at1jgiqwrt5g/Sapphire+R9+295X2+BIOS.zip

Anybody else who wants this BIOS can download them too.


----------



## alessandrorb

Thanks, friend AeroXbird! You saved my day!


----------



## kayan

Do you guys think it's worth putting my 295 on water at this point? I want to replace it with next gen, whenever it comes. I already have a backplate and a block, but I don't know. I need thermal pads though.

What do you guys think?


----------



## Deathscythes

@kayan

Hi Kayan, actually if I find a second backplate I would gladly buy yours, if possible; I live in France =)
To be honest, I don't really know what you would get out of it, as I custom watercooled both of mine when I got them.

Probably better acoustics, but also consider how much headroom you have for your cpu temps, as two more Hawaii GPUs will definitely add a lot of heat to your loop.


----------



## F4ze0ne

Quote:


> Originally Posted by *alessandrorb*
> 
> Hello guys! I have a Sapphire R9 295X2 4096 MB PCI-E 1018/1250 Bios: 015.044.000.009.004297. Anyone out there who have the same vga, could you please send me the 2 original bios, Master and Slave of this model?


Did you try looking here?

http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=Sapphire&model=R9+295X2


----------



## wermad

Quote:


> Originally Posted by *kayan*
> 
> Do you guys think it's worth putting my 295 on water at this point? I want to replace it with next gen, whenever it comes. I already have a backplate and a block, but I don't know. I need thermal pads though.
> 
> What do you guys think?


If you're not gonna block it, sell the block asap. Full-cover blocks depreciate quickly, and it's hard to sell a brand new block for a very old generation gpu down the road. You would need to find an aficionado willing to pay top money for what is essentially an old-gen gpu block. Most sensible advice says invest in a newer gpu, since you may spend about the same on the wc gear. Unless you're selling for very cheap, it doesn't make sense to hold on to it and lose out on a chance to make some money now.

The 295x2 is still very relevant, and there are people still looking for a good price on a block. But don't let it become a situation like this: "6990 block, bnib, rare, purrtty, $150 USD in 2016". Seriously, I've seen these listings, and most don't sell unless the price is dropped dramatically.

If you plan to hold on to your card for a while longer and oc, or maybe throw in a second one, use the block imho.


----------



## xarot

I'm going to try quadfire soon. I couldn't pass up another Ares III for a rather reasonable price; it was broken, but it was fixed with a simple mod by a known overclocker. I'm still testing it and waiting for my triple parallel bridge from EK to connect both cards. Those things look amazing. Can't say how quadfire will work, as I've seen so many issues with recent drivers and plain crossfire... I even remember reading in some Crimson release notes that issues with quadfire setups are to be expected?


----------



## Mega Man

I don't have a lot of issues


----------



## fat4l

Quote:


> Originally Posted by *xarot*
> 
> I'm going to try quadfire soon. I couldn't pass another Ares III for a rather reasonable price, it was broken though but fixed with a simple mod by a known overclocker. I'm still testing it and waiting for my triple parallel bridge from EK to connect both cards. Those things look amazing. Can't say how quadfire will work as I've seen so many issues with recent drivers and only crossfire...even remember reading some crimson release notes that issues with quadfire setups are likely to be expected?


I'm curious what the problem was?

Btw, how do they clock for ya? I'm doing 1200/1700MHz (with mem mods, 1500 timings) before "blackouts" appear.


----------



## xarot

Quote:


> Originally Posted by *fat4l*
> 
> I'm curious what the problem was ?
> 
> 
> 
> 
> 
> 
> 
> 
> Btw how do they clock for ya ? I'm doing 1200/1700MHz(with mem mods 1500 timings), before "blackouts" appear.


Both GPUs got 0V; the issue was the VRM not making contact or something. Afaik it wasn't blown up or anything, it just didn't work from the beginning.

I haven't dared to OC my first Ares much, and probably won't dare to OC the second at all for fear of completely blowing up the VRM. I think I got the first one gaming-stable at 1150 core with added voltage. Yours sounds really good; what is the core voltage at 1200?


----------



## Mega Man

not OCing the Ares sounds like a waste


----------



## wermad

Post 'em benches, gpu only (can't compete with those 5960x owners)


----------



## Deathscythes

Quote:


> Originally Posted by *wermad*
> 
> Post em benches, gpu only (can't compete on those 5960x owners
> 
> 
> 
> 
> 
> 
> 
> )


True, the only reason I regret not having a 5960X is that with one I would probably be in the top 100.
Just bought 3DMark and installed Win 10 on my new Samsung 950 Pro (2600 MB/s read...).
GPU stock:
http://www.3dmark.com/3dm/11077961


----------



## wermad

I see a lot of these builds:

-i7 5960x
-Rampage V Extreme
-EVGA GTX Titan X SC (one, just one)
-Corsair AX1500i

I can't compete with the cpu scores; it'll slaughter me. Honestly, I never had an interest in a 5960x, and barely any at all in a hexa-core, but for now a quad works, and really, you can't go further with Z170. I actually ditched a perfectly good working RVE for a gorgeous G1 Z170.

edit: just to lay it all out, I only game, so extra cores (and threads) don't help me much. I prefer to dump that extra cash on my loop or case


----------



## xarot

Quote:


> Originally Posted by *Mega Man*
> 
> not ocing the areas sounds like a waste


Quote:


> Originally Posted by *wermad*
> 
> Post em benches, gpu only (can't compete on those 5960x owners
> 
> 
> 
> 
> 
> 
> 
> )


When I get both up and running, in the next week or so. Well, the Ares cards don't oc that much higher than 295X2s anyway. I got them mainly for the looks.


----------



## fat4l

Quote:


> Originally Posted by *xarot*
> 
> Both GPUs got 0V, issue was with VRM not making contact or something. Afaik it wasn't blown up or anything, just didn't work from the beginning..
> 
> I haven't dared OCing my first Ares much, and probably don't dare to OC the second at all in fear of completely blowing up the VRM
> 
> 
> 
> 
> 
> 
> 
> . I think I got the first one gaming stable at 1150 core with added voltage. Yours sounds really good, what is the core voltage at 1200?


I'm using a custom bios. The stock bios had a +25mV offset even with me not touching any voltage in Afterburner,
so I got that reverted to 0.
My stock voltages are GPU1: 1.19375v and GPU2: 1.225v.
ASIC is GPU1: 79.5% and GPU2: 78.3% (different ASIC = different stock voltage).
Now I'm using a custom bios with modded voltages and ram timings, 1376-1500MHz strap timings for 1700MHz:
GPU1: 1.368v
GPU2: 1.406v
The real 3d voltage is much lower due to vdroop, about 1.3v for GPU1 and 1.33v for GPU2.

I wonder what your ASIC is for all 4 cores, and your stock voltage.
Can you please run this program for me? It tells you the default voltage for your cards/cores, and the ASIC quality as well. Please post screens.

Link: The Stilt's VID APP

Here are the results for 1200/1650MHz with 1500 strap timings + 390 memory controller timings:
http://www.3dmark.com/fs/7402234 - 13062 graphics score
And here is 1200/1700MHz with 1500 strap timings + normal memory controller timings:
http://www.3dmark.com/fs/7047299 - 12961 graphics score
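The droop described above (set voltage vs. real 3D voltage) works out to roughly 5% under load. A minimal sketch, using only the voltages quoted above; the function itself is plain arithmetic, not from any tool mentioned in the thread:

```python
# Vdroop: the gap between the set (BIOS/idle) voltage and the actual
# voltage observed under 3D load, as described above.
def vdroop(v_set, v_load):
    """Return the droop in volts and as a percentage of v_set."""
    droop = v_set - v_load
    return droop, 100.0 * droop / v_set

# Figures quoted for the modded BIOS:
for name, v_set, v_load in [("GPU1", 1.368, 1.30), ("GPU2", 1.406, 1.33)]:
    d, pct = vdroop(v_set, v_load)
    print(f"{name}: {d * 1000:.0f} mV droop ({pct:.1f}%)")
```

That is about 68 mV on GPU1 and 76 mV on GPU2, which is why the "real" load voltage matters more than the number typed into the BIOS.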


----------



## kayan

Random question: is anyone interested in a comparison of a 295x2 vs 980 Ti in Fire Strike, on the same system?


----------



## xarot

Quote:


> Originally Posted by *fat4l*
> 
> I'm using custom bios. The stock bios had +25mV offset even with me not touching any voltage in afterburner.
> So i got this reverted to 0.
> My stock voltage is GPU1:1.19375v and gpu2:1.225v
> Asic is:gpu1:79.5% and gpu2:78.3% (different asic = different stock voltage)
> Now I'm using custom bios with modded voltages and ram timings, 1376-1500MHz strap timings for 1700MHz.
> GPu1:1.368v
> gpu2:1.406v
> The real 3d voltage is much lower due to vdoop, about 1.3v for gpu1 and 1.33v for gpu2.
> 
> I wonder whats ur asic for all 4 cores and your stock voltage.
> Can you pls run this program for me ? It tells you your default voltage for your cards/cores and asic quality as well. Pls post screens
> 
> 
> 
> 
> 
> 
> 
> 
> Link:- The Stilt's VID APP
> 
> Here is the results for 1200/1650MHz with 1500 strap timings + 390 memory controller timings
> http://www.3dmark.com/fs/7402234 - 13062 graphics score
> Here is 1200/1700MHz with 1500 strap timings + normal memory controller timings
> http://www.3dmark.com/fs/7047299 - 12961 graphics score


Thanks!

Is this what you were looking for?



I got them up and running, but so far I'm not really impressed. I tested 13 Steam games, and only 7 of the 13 worked without issues. Some games that worked well enough, if not flawlessly, on a single card are a completely unplayable stuttering mess. I didn't expect everything to work out of the box, but this is not encouraging, as I got neither functionality nor performance. Tried three different drivers.

The worst performing games look a bit like this...

3DMark performs well, but with the current PSU I am not able to crank the voltage up because the PSU shuts down. I am going to change the PSU in a week or so. One successful run: http://www.3dmark.com/fs/7851288


----------



## fat4l

Quote:


> Originally Posted by *xarot*
> 
> Thanks!
> 
> Is this what you were looking for?
> 
> 
> 
> I got them up and running but so far not really impressed. I tested 13 Steam games and out of those only 7/13 games worked without issues.
> 
> 
> 
> 
> 
> 
> 
> Some games that worked good enough if not flawlessly on single card, are completely unplayable stuttering mess. I didn't expect everything to work out of the box, but this is not encouraging as I didn't get functionality nor performance. Tried three different drivers.
> 
> The worst performing games look a bit like this...
> 
> 
> 
> 
> 3DMark performs well, but with current PSU I am not able to crank the voltage up because the PSU shuts down. I am going to change the PSU in a week or so. Some successfull run... http://www.3dmark.com/fs/7851288


Thank you man! There aren't many people with an Ares III out there, so I'm very happy that you supplied the info. +rep!!!!

I really wonder how Asus did the binning of the chips. They say it's 500 (of 1000) binned chips to get the best out of it, but I'm not sure what they consider the "best".
I see your ASIC is nothing great, or in other words, not super high. I would expect 80+%, but that's just my opinion. Then again, maybe the ASIC doesn't even matter.

Do your cards have a +25mV offset? What I mean is, if you don't touch the volts at all and install MSI Afterburner, is it showing +25mV? Mine was showing it (idk the reason), but I got it removed by @gupsterg.
Also, as you can see, the stock volts for all the cores are different, and this influences overclocking at stock voltages.
My stock volts are really low, particularly for core 1.

For example, every 295X2 card I've seen had a 1.25V default voltage.
I can supply the bios I'm using if you want to try it out. It gives me about a 20% boost in performance (tested in Crysis 3 and 3DMark FS X).
See (stock vs OC, 19.7% graphics score): http://www.3dmark.com/compare/fs/6593207/fs/6590481
I also think running 4 gpus is meh. AMD's Crossfire support is not perfect, and hoping for quadfire support is ......>.<

Also, have you tried the V2 bios for the PLX chip that increases compatibility with X99 systems? It's a Windows app, so it's easy to do.
https://www.asus.com/uk/Graphics-Cards/ROG_ARESIII8GD5/HelpDesk_Download/

What about temps? Can you please run the Unigine Valley bench, 1 loop, and post temps from all the cores/vrm1/vrm2? (If Valley can't use 4 cores, then try 3DMark.)
Previously I had a 295X2 with a waterblock, and at stock I was getting about 40C under load; OCed I was getting maybe 45C. When I OCed this Ares, the cores had temps of ~55-60C and the VRMs were close to 100C in my watercooling system, which is nonsense.
So I removed the block and replaced the paste + pads. I went with Fujipoly 14W/mK pads and, at first, Thermal Grizzly Kryonaut (dropped ~5C), but I changed it to Coollaboratory Liquid Pro and dropped another 15C from the cores.
Now when I run it under load, OCed to 1200/1700MHz, I'm getting 52C on GPU1 and 45C on GPU2, and max ~60-65C for the VRMs, with an ambient of 20-22C.

I'm really curious about your temps, please!

Also, 2 chips on the back side don't have thermal pads! Idk what Asus were thinking, but it definitely cripples the mem overclock....
Also, the paste application on the cores is ....meh. See for yourself.
(mem pad/strip missing where the "ares" cutout is)

----------



## xarot

Quote:


> Originally Posted by *fat4l*
> 
> Thank you man! There's not many ppl with Ares 3 out there so I'm very happy that u supplied the info. +rep!!!!
> 
> I really wonder how asus done the binning of the chips. They say its 500(1000) binned chips to get the best out of it however I'm not sure what they consider the "best".
> I see your asic is not any great, or in other words, not any super high. I would expect 80+% but that's just my opinion. Also maybe it doesn;t even matter what asic it is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do your cards have +25mV offset ? What I mean is, if you don't do anything to volts, and install MSI afterburner, is it showing +25mV ? For my card it was showing(idk the reason), but i got it removed by @gupsterg.
> Also as you can see stock volts for all the cores are different and this influences the overcloking on stock voltages.
> My stock volts are rly low, particularly for the core 1.
> 
> 
> For example, all of the 295X2 card I've seen had 1.25V default voltage.
> I can support the bios I'm using if you want to try it out. It gives my about 20% boost in performance(tested in Crysis3 and 3Dmark FS X).
> See(stock vs OC, 19.7% Graphics score): http://www.3dmark.com/compare/fs/6593207/fs/6590481
> I also think running 4 gpus is meh. Amd's Crossfire support is not perfect and hping for quadfire support is ......>.<
> 
> 
> 
> 
> 
> 
> 
> 
> Also, have you tried to download V2 bios for PLX chip that increases compatibility with X99 systems ? It's windows app so easy to do.
> https://www.asus.com/uk/Graphics-Cards/ROG_ARESIII8GD5/HelpDesk_Download/
> 
> What about temps ? Can you pls run Unigine Valley bench, 1 loop, and post temps from all the cores/vrm1/vrm2 pls ? (if valley cant use 4 cores then try 3DM pls)
> Previously I had 295X2 with waterblock and at stock i was getting about 40C in load. OCed I was getting about 45C maybe. When I oced this ares,the cores had temps of ~55-60C and vrms close to 100C in my WCooling system, which is nonsense.
> So I removed the block and replaced the paste + pads. I went with fujipoly pads 14W/mK and at first, Thermal Grizzly Kryonaut(Dropped ~5C) but I changed it to Coollaboratory Liquid Pro and dropped another 15C from the cores.
> Now when I'm running it @load, Oced to 1200/1700MHz, I'm getting 52C on gpu1 and 45C on gpu2 and max ~60-65 For vrms(1) max with ambient of 20-22C.........
> 
> I'm rly curious about your temps, pls!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also 2 chips on the back side don't have thermal pads! Idk what asus were thinking but it definitely cripple mem overclock....
> Also, the paste application on the cores is ....meh. See yourself.
> (mem pad/strip missing where the "ares" cutout is)


Yes Afterburner shows +25 mV for my cards too.

I updated the firmware, but it's not really any better. Also, I swapped the PSU; now I can crank up the voltage, but my liquid cooling loop is not up to the task... Anyway, this setup is temporary. I am not going to install another 480 rad because CFX is what it is. I'm going to test it for a few days/weeks and try things. My system is like a house radiator with two Ares cards... temps can get way too hot. :O It's cooled with 480+360 rads, but the 5960X is also a fireball. I think I'll have to run the tests with the other card disabled for you. I'll get to it, but it may take a few days.

Also, I think the two Ares waterblocks are a bit restrictive with the Koolance QDCs I am using. Anyway, why not post a pic of the current rig? I like it, but I wish CFX worked much better. This is a temporary setup; I can pull both of them out and replace them with my Titan Xs any time.


----------



## Deathscythes

Quote:


> Originally Posted by *xarot*
> 
> Yes Afterburner shows +25 mV for my cards too.
> 
> I updated the firmware, but it's not better really. Also I swapped the PSU, now I can crank up voltage but my liquid cooling loop is not up to the task...anyway, this setup is temporary, I am not going to install another 480 rad because CFX is what it is. I'm going to test it for a few days/weeks and try things. My system is like a house radiator with two Ares cards...temps can get way too hot. :O cooled with 480+360 rads but also the 5960X is a fireball. I think I'll have to run the tests with another card disabled for you. I'll get on to it, but it may take a few days.
> 
> 
> 
> 
> 
> 
> 
> Also I think two of Ares waterblocks seem to be a bit restrictive with Koolance QDCs I am using. Anyway, why not posting a pic of the current rig? I like it, but I wish CFX worked much better. This is temporary setup, I can pull both of them out and replace then with my Titan Xs any time.


Hi, I run 2x 295x2 and they're a pain in the ass to cool... I would be very interested in seeing your temps.


----------



## Mega Man

Assuming you mean with full water blocks, your core temps should not reach 50; if they do, you're doing something wrong.


----------



## Deathscythes

Quote:


> Originally Posted by *Mega Man*
> 
> Assuming you mean with full water blocks your core temps should not reach 50, if they do, your doing something wrong


Wow! Thanks! I have a total of 1440 mm of rad length, including 960 in push/pull, and in crossfire I reach the throttling temp at stock.

It's been figured out that the flow in my loop is way too low and I need another pump to fix it.
I was just curious about what I could expect once my loop is fixed =)


----------



## wermad

i has 4000mm

Enter Megaman for pissing contest....


----------



## Mega Man

Not yet, my tx10 is waiting, but don't you worry: I'm trying to buy the 3-4 peds I need this summer. I can now measure, and I need to lol... then it is on..... This time, though, the bottom ped will have pumps (at least 8, 4 for each side), both with bottom res, and I will try to find some cold plates for when I go geothermal. Worst case, I'll have to make my own.
@death
Once I hit 3 blocks (cpu/gpus) I like to have at least 2 pumps, but me, I am crazy; I do 2x2 so I can have a full set of redundancy.


----------



## wermad

Dude, I'm seriously terrified cl is gonna eol parts soon. Might get a vented door to add a few more rads for the second system and air cooling for a nas I wanna add. I may also do the top and add a 3rd rad up there.


----------



## Mega Man

I don't think they will eol accessories yet, at least 1 year. I will be buying a bunch of peds and triple 480 mounts, another psu mount for the rear, and a few other minor odds and ends... so glad I got the extended tops when I did for both my m8 and tx10.


----------



## xarot

Quote:


> Originally Posted by *Deathscythes*
> 
> Hi, I run 2 295x2 and it's a pain in the ass to cool... I would be very interested in seeing your temps.


Not surprising. I don't have enough radiator space for CFXing two cards; temps just keep rising until they go through the roof. The system is using over 1400 watts in Sleeping Dogs, and I think the CPU wasn't even overclocked. :O So 480+360 radiators are not nearly enough. I'd need at least another 360, I think, but I don't have space for it. Anyway, this is a temporary setup; it's fine for 90% of the testing I want to do.

I use Sleeping Dogs without Vsync to get max temps on water; it's like Furmark. That game can put a constant 100% load on all 4 GPU cores. It's very easy to hit high temps when leaving the game on for 30 minutes.


----------



## Mega Man

You should be fine on rad space, but you may need more pump..... or your fans suck for rads.

I can't think of anything that shouldn't be OK with 480+360 consumer side, even quad gpu.


----------



## wermad

My rig still runs even if I shut down one of the D5s. I got ~30 90° fittings. One D5 or DDC will work fine with a couple of rads and three blocks.

Discovered one of the cores on the 2nd card is not showing temps. I'm worried now, and I won't be able to test for a week.... Ulps off.


----------



## Mega Man

I never said you can't; I said I like, aka I prefer. But again, 480+360 and not getting good temps? Something is wrong.

I can darn near run my system passively. That is obviously a project for the future I want to do: a passive pc. I dunno if water or air, but I think I want to do passive water in, like, an ax2; I could easily weld the al to some copper pipe.

Idk, it is one of the things on my to-do list, before I go geothermal.


----------



## kayan

What would happen if someone were to hook up one of these cards to a PSU that didn't have nearly enough amps? What would happen to the card? What would happen to the system? Any chance of frying something?


----------



## Alex132

If it were a half decent PSU it'd just turn itself off when it draws too much current.


----------



## kayan

Quote:


> Originally Posted by *Alex132*
> 
> If it were a half decent PSU it'd just turn itself off when it draws too much current.


Ok, so, what if it never had enough amps to begin with?

Long story short: I sold my 295x2 last week, and the person who bought it said they installed it and the card spins up, but they are not getting any signal whatsoever. I asked what PSU they had, and said person has a Corsair GS600. I checked, and it has a combined 48A. I'm worried that something happened and that I will have to take it back due to damage. The person ordered a new PSU today, from the list here in the first post.

Any insight on this?
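For a rough sanity check of that GS600: the ~500 W card figure is the 295x2's usual quoted board power, and the 150 W rest-of-system number is an assumption, not something from the thread:

```python
# Rough PSU headroom check for a 295x2 on a Corsair GS600.
# Assumptions (not from the thread): ~500 W typical board power for the
# card, ~150 W for CPU + rest of system. 48 A is the combined 12 V
# rating quoted above.
RAIL_VOLTAGE = 12.0
COMBINED_AMPS = 48.0          # combined 12 V amperage of the GS600
CARD_WATTS = 500.0            # assumed typical 295x2 board power
REST_OF_SYSTEM_WATTS = 150.0  # assumed CPU/board/drives

available = RAIL_VOLTAGE * COMBINED_AMPS    # 576 W on the 12 V rail
demand = CARD_WATTS + REST_OF_SYSTEM_WATTS  # ~650 W under load
print(f"12V capacity: {available:.0f} W, estimated demand: {demand:.0f} W")
print("headroom:", available - demand, "W")  # negative => OCP territory
```

Under those assumptions the 12 V rail comes up roughly 75 W short at full load, which is consistent with a decent PSU simply tripping protection rather than damaging the card.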


----------



## Mega Man

I have tripped OCP (over-current protection), which is the aforementioned safety, without issue many times. Odds are it is fine.


----------



## xarot

Quote:


> Originally Posted by *Mega Man*
> 
> I never said you can't I said I like, aka I prefer, but again 480+360 and not getting good temps, something is wrong.


I can get the temps to stay good enough for 24x7 use, but not if I fire up Sleeping Dogs at 144 Hz... it also makes the whole system consume 1400-1500W of continuous power, and I think asking 840mm of rad space to cool all that heat (assuming it all came from the CPU+GPUs only) is a tad much: over 200 W per 120.1. Of course, if I used 3000 rpm fans and my rads were outside the case, maybe a different story. Currently using AP-15s, and yeah, the loop flow is probably a tad low due to the QDCs.

Btw, has anyone else dared to fire up Sleeping Dogs?

I have also tripped OCP with my old AX1200, and it just shuts down; no harm done.
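That per-section figure checks out; a quick sketch, assuming essentially all of the quoted wall power ends up as heat in the loop (an approximation, since some goes to fans, drives, and PSU losses):

```python
# Heat load per 120 mm radiator section for the setup described above:
# 480 + 360 rads = 4 + 3 = 7 sections, with the system drawing
# ~1400-1500 W. Assumption: most of that wall power becomes loop heat.
def watts_per_section(total_watts, sections):
    """Average heat load carried by each 120 mm radiator section."""
    return total_watts / sections

sections = 480 // 120 + 360 // 120        # 7 sections of 120 mm
print(watts_per_section(1450, sections))  # just over 200 W per section
```

Roughly 207 W per 120 mm section with mid-speed fans is indeed a heavy ask, which is why the delta over ambient climbs so high in Sleeping Dogs.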


----------



## Mega Man

First, power input does not equal heat output, nowhere near. Second, stock cooling is roughly a 120mm rad each for the cpu and each dual gpu.

If we go with cores (5) and the water-cooling "rule of thumb", 120×5 + 120 = 720; anything above that is gravy, not to mention much more than stock cooling. If your temps suck that much, it is because something is wrong.

AP-15s should be fine. And you do not need an exterior rad; something else is the problem. Possibly how you have the fans set up, or what is controlling them, or flow. A set of QDCs (normally used, name-brand) won't be what kills your flow; 2 dual-gpu blocks and a cpu block, however, will. I would start there first.
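The rule of thumb above can be sketched as a one-liner. It is the common community heuristic being quoted (one 120 mm section per cooled core/block, plus one spare section of margin), not a measured figure:

```python
# Water-cooling rule of thumb quoted above: one 120 mm rad section per
# cooled core/block, plus one extra section of margin.
def recommended_rad_mm(num_cores, section_mm=120):
    """Minimum total radiator length (mm) by the 'one per core + spare' rule."""
    return (num_cores + 1) * section_mm

# 5 cores (4 GPU cores across two 295x2s + 1 CPU) -> 720 mm minimum:
print(recommended_rad_mm(5))  # -> 720
```

By that heuristic, a 480 + 360 combo (840 mm) already clears the 720 mm minimum for a quadfire-plus-CPU loop, which is why the post above points at pump flow or fan setup instead.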


----------



## Deathscythes

Quote:


> Originally Posted by *Mega Man*
> 
> @death
> Once I hit 3 blocks ( cpu/gpus ) I like to have at least 2 pumps, but me, I am crazy, I do 2x2 so I can have a full set of redundancy


Yeah, currently I run a single D5 for an EK-RVE monoblock, 2x EK-FC R9 295X2, a 480 Monsta, a 480 XT45, a 240 UT60, and a 240 ST30, and it's far from enough; you can see here how slow the bleeding was :/





I plan to add this
https://www.ekwb.com/shop/ek-xtop-revo-dual-d5-pwm-serial-incl-pump

I don't have much room left in my Phanteks Enthoo Primo, so here is what I plan to do:


The flow goes from the bay res to the cpu block.
The purple rectangle at the bottom is supposed to be the EK dual pump.

fat4l told me that the pumps may potentially pull too much water and be damaged if there is no res right before them.
As long as the loop is filled, I don't understand how. Do you have an explanation?
If this works, I plan to bleed the loop while the dual pumps are unplugged, so that they don't run dry, and exclusively use the one in the bay res.

What do you think about it? 3 D5s is totally overkill, I guess, but I don't mind. Can it cause trouble, though?
Thank you for your help =)

The current build:


----------



## Mega Man

Too many pumps isn't a problem, but I would change your loop order. The only rule is res to pump.

I would go res > pump > rad > rad > gpu > cpu > back to res.

Ideally you want the res before the pump, especially when pumps start: they create a vacuum on the suction side (the "in"). With the res there, it pulls water down quickly; if not, it will attempt to pull water through the block and take longer to fill, which can cause failure of the bearings (generally). That is the general idea.


----------



## Deathscythes

Quote:


> Originally Posted by *Mega Man*
> 
> To much pumps isn't a problem, I would change your loop order. Only rule is res to pump.
> 
> I would go res pump rad rad gpu cpu back to res
> 
> Ideally you want the res before the pump esp when pumps start they create a vacuum in the suction side ( the "in" ) with the res there it pulls water down quickly if not it will attempt to pull water through the block and take longer to fill it can cause failure to the bearings (generally) that is the general idea


I understand, thank you very much!


----------



## Sgt Bilko

Got an issue guys: I've only got one GPU core showing up for me now. Tried it in two rigs (my i5 and FX ones) and still nothing; GPU-Z tells me that Crossfire is disabled and there is only one GPU core detected.

Afterburner can see the other core, but no data from it....... methinks the second core is dead


----------



## wermad

If gpuz can't see it, it might be a bad core. Try the backup bios?

I need to test mine too, as one of the cores is showing nothing. But all four are detected in AB and CCC.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> If gpuz can't see it, it might be a bad core. Try the backup bios?
> 
> I need to test mine too as one of the cores is showing nothing. But all four are detected in ab and ccc


Tried the BIOS switches back and forth; I just plugged it into another PCIe slot and it's reading both now, just a little wonky.

will post an update in a few

Update: After plugging it into another slot it is now reading both cores just fine; disabled ULPS and everything is back to normal.

phew........that was a scare

2nd Update: Plugging it back into the first slot and the same thing happens: only one GPU core is detected. I think I'm going to have to put it back into the 2nd slot, reinstall drivers, then plug it into the first......

3rd Update: After swapping over to my Win 10 install on my FX rig, it's all back to normal. It only happens on my i5 Win 10 and FX Win 7 installs.......probably because I never installed the driver with the 295x2 in there; I usually used a 290x or 390x.


----------



## wermad

Drivers for the 290x and 390x should be compatible with the 295x2. The only time I've experienced specific drivers for amd is with their ancient gpus (much like nvidia).

Check in gpuz if your bandwidth is at least 8x 3.0 (or 16x 2.0).


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> drivers for the 290x and 390x should be compatible with the 295x2. The only time I've experienced specific drivers for amd is with their ancient, and useful gpu's (much like nvidia).
> 
> Check in gpuz if your bandwidth is at least 8x 3.0 (or 16x 2.0).


Oh they are, but I've actually had this issue before: sometimes when I install a driver, I need to install it while the GPU I want to use is in a non-primary slot for it to work in the primary.

It's a weird bug, and it only affects a handful of people, but now that I've realized what it is, I can fix it.


----------



## wermad

Kewl


----------



## xarot

Bleh, I think my second Ares just died out of nowhere. It started to freeze the whole PC in games, and after that it's not detected unless I shut down multiple times. It was a rather pricey run...


----------



## fat4l

Quote:


> Originally Posted by *xarot*
> 
> Bleh, I think my second Ares just died out of nowhere. Started to freeze the whole PC in games and after that it's not detected in PC unless shut down multiple times. It was a rather pricey run...


Try to use the "bake" method


----------



## wermad

No way for you to send it to warranty?


----------



## xarot

Quote:


> Originally Posted by *wermad*
> 
> No way for you to send it to warranty?


No warranty. Having other issues as well now too so hope it's only one card broken.


----------



## fat4l

Quote:


> Originally Posted by *xarot*
> 
> No warranty. Having other issues as well now too so hope it's only one card broken.


bake it! bake it! bake it!


----------



## xarot

Not yet, I'll have to take the card apart and inspect. I doubt baking would help in this case. Completely lacking motivation now though. Put my Titan Xs back in - everything is smooth as butter again.


----------



## fat4l

I can send you some pics of my card when u disassemble it, if needed for comparison or something. Not many ppl with ares out there lol


----------



## Sgt Bilko

Just thought I'd throw this in for the fun of it: got my 295x2 going in my Skylake rig with my 750w TT PSU.

Small overclock on the card (1100/1500 with +50mV) and 4.7GHz on the i5.

It pulled 623w peak during Catzilla's Raymarch test (getting my readings via TT's PSU app). I was a little worried this PSU didn't have the amperage needed, but it turns out it's doing just fine. Might be a different story with a hungrier CPU though.

You can add this one to your list @wermad


----------



## spyshagg

My 750w TX Corsair started to "coil whine" after a few hours of benching 2x 290x's. It shut itself down as well, but only after a few hours. It didn't die and is now doing very light work in a NAS, but it will never be the same psu it once was lol


----------



## MIGhunter

Quote:


> Originally Posted by *spyshagg*
> 
> my 750w TX corsair started to "coil whine" after a few hours of benching 2x 290x's. It shut it self down as well, but only after a few hours. It didn't die and is now doing very light work on a NAS. But it will never be again the same psu it once was lol


My 750 TX died this week after 4 years of continual use


----------



## fishingfanatic

I think ur psus were too close to max draw. Even an 850 would give enough of a buffer. Most psus, if I remember correctly, run best in the 60-70% max load range. This is not gospel, just a general rule of thumb.

Efficiency also matters: if your system draws 500w and you have a 1500w psu, the efficiency would be poor.

If my max draw were 600w I would look at an 850; for 400w, at least 500-550 imho.

Obviously there are many variables: ocing and peripherals as well as the gpu(s).

Perhaps a psu guru can elaborate further, as I'm NO expert.

If the psu runs warm then imho it isn't big enough for the draw, at least for longevity.

FF


----------



## Alex132

Quote:


> Originally Posted by *fishingfanatic*
> 
> I think ur psus were too close to max draw. Even an 850 would be enough of a little buffer. Most psus if I remember correctly run best in the 60-70
> 
> % max load range. This is not gospel, just a general rule of thumb.
> 
> Efficiency is also considered if you have a system using a 500w draw and you have a 1500w the efficiency would be poor.
> 
> If my max draw is 600w I would look at an 850. 400w at least 500-550 imho.
> 
> Obviously there are many variables, ocing and peripherals as well as the gpu(s)
> 
> Perhaps a psu guru can elaborate further, as I'm NO expert.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If the psu runs warm then imho it isn't big enough for the draw, at least for longevity.
> 
> FF


Don't buy a PSU hoping the peak load will land within 60-80%. If your peak is ~750w, like most 295X2 users with lots of add-ons and a mild GPU OC (and CPU OC), then an 850w is really what you need.

Ideally I would have gone for the EVGA G2 850w (I prefer its design too) - but it was much more expensive than my PSU
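As a rough sketch of the sizing logic being debated here (the headroom fractions and the `recommend_psu` helper are illustrative assumptions, not anyone's measurements):

```python
def recommend_psu(peak_draw_w: float, target_load: float) -> int:
    """Pick the smallest common PSU size that keeps the measured peak
    draw at or below the chosen fraction of the PSU's rating."""
    common_sizes = [550, 650, 750, 850, 1000, 1200, 1500]
    needed = peak_draw_w / target_load
    for size in common_sizes:
        if size >= needed:
            return size
    raise ValueError("peak draw exceeds largest listed PSU size")

# ~750w peak capped at ~90% load lands on an 850w unit;
# a 600w peak capped at ~75% load (a bigger buffer) also lands on 850w.
print(recommend_psu(750, 0.9))
print(recommend_psu(600, 0.75))
```

Both examples print 850, which is why the thread keeps converging on 850w-1000w units for a loaded 295X2 system regardless of which headroom rule you prefer.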


----------



## ssiperko

Quote:


> Originally Posted by *kayan*
> 
> Random question: is anyone interested in a comparison of a 295x2 vs 980 Ti in Fire Strike, on the same system?


I would. I have 2 systems I could do it on: a 4.8 4770k with a 1575/8600 air-cooled 980Ti, or a 4.9 5960x box with 980Ti SLI under water.

Was thinking of getting a Powercolor Devil 13 to play with in the air cooled box.

SS


----------



## fishingfanatic

All I can suggest is to look up online what size psu you should use, and if possible go 1 size bigger for possible sli/xfire and a bit of futureproofing.

I can never remember the site, but google it and you should find it easily enough; there's a chart where you fill in what hardware you have and it will recommend the size of psu that's best.

There are a few things I will spend that extra bit on and the psu is 1 of them. A good psu can eliminate a lot of little headaches.

Personally I have always had luck with the Corsair higher end psus. Never had 1 go even once.

Actually had a TX850M, if I remember correctly, that I inadvertently ran 2 690s with for a benchmark b4 I realized the error. It got a little warm but it's still going strong even after 3-4yrs now.

I ran a wc loop, 4 HDDs, 3 SSDs and 780 Ti Kingpins in sli with about 15 fans (8 on the 2 480b rads, each fan moving 107cfm) on my AX1200 and it never got warm, ever.

I'm not in retail or anything else, just personal experience. Bought a 1600w psu from a reputable name and went thru 4 b4 I got my money back, as none ever worked BNIB. I won't name them bcuz they made it right. For me that's only fair.

FF


----------



## ssiperko

Quote:


> Originally Posted by *fishingfanatic*
> 
> All I can suggest to you is to look up online what size psu you should use, and if possible go 1 size bigger for possible sli/xfire and a bit of
> 
> futureproofing.
> 
> I can never remember the site but google it and you should find it easily enough, there's a chart to fill in what hardware you have and it will
> 
> recommend the size of psu that's best.
> 
> There are a few things I will spend that extra bit for and the psu is 1 of them. A good psu can eliminate a lot of little headaches.
> 
> Personally I have always had luck with the Corsair higher end psus. Never had 1 go even once.
> 
> Actually had a TX850M if I remember correctly that I inadvertently ran 2 690s with for a benchmark b4 I realized the error.
> 
> It got a little warm but it's still going strong even after 3-4yrs now.
> 
> I ran a wc loop 4 HDDs 3SSDs and 780 ti kingpins in sli with about 15 fans. 8 on the 2 480b rads with each fan moving 107cfm alone with my AX1200 and it never got warm
> 
> ever.
> 
> I'm not in retail or anything else, just personal experience. Bought a 1600w psu from a reputable name and went thru 4 b4 I got my money
> 
> back, as none ever worked BNIB.
> 
> I won't name them bcuz they made it right . For me that's only fair.
> 
> FF


GREAT advice!!!!!

4 years ago I would've said pffffffttttt, what the hay do you say?

Since then I've had a pair of 290's (along with 12 Maxwells with no power limits and several 5960x's) running CF at 1200/1550 on a 750, and could literally hear it scream NOOOOOOOOOOOOOO under full load with my 4770k at 4.8.

Personally, today, for EVERYONE who may even think they will tax today's upper tier cards, I would advise a quality 850 to 1000 watt PSU.

Two major components that will never be a bad purchase? Your PSU and case.

I'm old and have wasted more hard earned cash on garbage to save a buck; I should've just smoked it..... the difference would be none.









SS


----------



## fishingfanatic

Ditto!

Couldn't have said it better myself!!!

I never use less than a 1000w myself, but I run wcing and lots of fans, and I bench.

I had an AX1200 until I dropped a splash of water on it. Drank a brew in its honour and moved on. lol

FF


----------



## Mega Man

Never buy Corsair psus. NONE of them are worth what they ask, and specific to their new psus, all I have seen are mediocre at best (when weighing cost, value, and quality. Now when I say quality I mean *quality power and hardware (caps, soldering etc) - the whole bundle, not just one piece)*

And I generally don't build pcs with 1 psu; I currently have one with 3.2 kw of Powah! I r futureproofed


----------



## xarot

I went for Corsair AX1500i because I hate swapping PSUs when I need more capacity all of a sudden. Never had a problem with the AX series, I still have three AX1200s and AX860.


----------



## ssiperko

Quote:


> Originally Posted by *fishingfanatic*
> 
> Ditto !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Couldn't have said it better myself !!!
> 
> I never use less than a 1000w myself., but I run wcing and lots of fans, and I bench.
> 
> I had an AX1200 until I dropped a splash of water on it. Drank a brew in it's honour and moved on. lol
> 
> FF


Yeap ..... I have a pair of 1200's, one EVGA and one SeaSonic, both platinums. Even though my one box is basic with air cooling and a 4770k, I'd rather have more than need more. Besides, if the water cooled SLI rig's supply pukes I can swap it out and still be able to use my test 650 on the lesser box until the replacement comes.









SS


----------



## kayan

Quote:


> Originally Posted by *ssiperko*
> 
> I would. I have 2 systems I could do it on. A 4.8 4770k with a 1575/8600 air cooled 980Ti or a 4.9 5960x box with a 980ti SLI under water.
> 
> Was thinking of getting a Powercolor Devil 13 to play with in the air cooled box.
> 
> SS


Here ya go:

Firestrike Extreme

http://www.3dmark.com/compare/fs/6548220/fs/7977956#

295x2 vs 980ti. CPU has a good overclock, but both GPU's were not overclocked by me. The 980ti runs that high on its own.

And Firestrike Ultra

http://www.3dmark.com/compare/fs/6548220/fs/7977956#

Same system again, just a different GPU.

Edit: Derp grammar.


----------



## fishingfanatic

Well MM, you must've gotten a few lemons. I know the RM series has been a lost cause, but the TX, HX and AX series have never failed or tripped for me.

I had an old TX from 2008 and sold it to a friend, and he has replaced 2 other psus in his other systems during that time - one Seasonic and 1 SilverStone, both of which I couldn't believe due to their reps.

We're not all going to like the same things and will have different scenarios that make us scratch our heads at times, even with the best hardware.

FF


----------



## Mega Man

See, your statement is based on opinion; mine is fact.

I have trash psus that have lasted years too...

All recent psus from Corsair have either had mediocre quality of power supplied (ripple, voltage reg, etc) and/or been priced significantly higher than the competition. Neither makes for a good power supply.

If you doubt it, feel free to pick one and I can pull some reviews and pricing


----------



## fishingfanatic

That's not true, sorry. I'm not going to butt heads on something that can't be defined, since every manufacturer has lemons or crappier models, but I based my comments on fact - personal experience, nothing more. I believe I mentioned that already.

Sorry if you feel otherwise, as I'm not trying to say something I haven't seen or experienced; that would be silly, as those who know better would prove me wrong.

I don't know about Corsairs being junk. I think there are too many people who use them that would dispute that.

If I remember correctly, one of the better manufacturers did make Corsair's psus anyway - I thought it was Seasonic who made them at one point in time. Who makes them now I'm not certain.

I do believe there's 1 thing we can agree on and that is DON'T cheap out on ur psu!!!

You've been here for some time and have an excellent rep, so my apologies if anything I said upset you or rubbed you the wrong way.

I can always use some help or suggestions myself, so I would hate to think that was the case. I'm only trying to help out if I can, and have nowhere near ur experience, but I still stand by what I said.

FF


----------



## ssiperko

Quote:


> Originally Posted by *kayan*
> 
> Here ya go:
> 
> Firestrike Extreme
> 
> http://www.3dmark.com/compare/fs/6548220/fs/7977956#
> 
> 295x2 vs 980ti. CPU has a good overclock, but both GPU's were not overclocked by me. The 980ti runs that high on its own.
> 
> And Firestrike Ultra
> 
> http://www.3dmark.com/compare/fs/6548220/fs/7977956#
> 
> Same system again, just a different GPU.
> 
> Edit: Derp grammar.


Thanks.

Quite interesting to see the physics differences with just a card swap, isn't it?

SS


----------



## Mega Man

1. I never said junk; mediocre is far different.

2. It is true.

Your experience is an opinion; I am talking about facts, pricing, and quality of power. You can't see that without a scope. If I was not on a mobile phone I could link several reviews; I'll try to deal with it when I get home


----------



## ssiperko

Quote:


> Originally Posted by *Mega Man*
> 
> 1 I never said junk, mediocre is far different.
> 
> 2 it is true,
> 
> Your experience is an opinion, I am taking about facts, pricing, and quality of power, you can't see that without a scope if I was not on mobile phone i could link several reviews, ill try to deal with it when I get home


Sorry, but with electronics each piece tested is PURELY based on the unit and thus opinion ....... VERY few scientific, 100% repeatable fact tests exist for the majority of items in the world today ...... of course, like my butt and opinions, they all stink.









SS


----------



## dagget3450

Quote:


> Originally Posted by *ssiperko*
> 
> Thanks.
> 
> Quite interesting to see the physics differences with just a card swap isn't it?
> 
> SS


Yeah, I noticed this also. It gets worse for me the more gpu's I add. Watching the last combined test, it seems once you're over 2 gpus it either goes down or barely goes up. It reminds me of years ago when nvidia had a similar issue; I think it was in the 3dmark11 combined score.


----------



## fishingfanatic

Don't bother MM. Lost cause at this point but thanx anyway.

FF


----------



## MIGhunter

Crappy pic, sorry. Here's my 295x2 broken down with EKWB blocks.


----------



## fishingfanatic

That is 1 big block !

FF


----------



## Mega Man

Congrats mig! How are you liking it?

**I need to correct a previous post, apparently the new corsair rmx series are high quality (again power quality output) and reasonably priced.


----------



## MIGhunter

Quote:


> Originally Posted by *Mega Man*
> 
> Congrats mig! How are you liking it?
> 
> **I need to correct a previous post, apparently the new corsair rmx series are high quality (again power quality output) and reasonably priced.


Thanks Mega Man, I'm loving it. I did goof when I rearranged some of my stuff and have some of my fans backwards. Debating whether it's worth it to swap them around. Right now the top rad fans are pulling air into the case while the front rad fans are pulling air out of the case. So I'm thinking about changing the front fans so they push into the case, leaving the top rad the same and just having the 1 exhaust fan in the back. That way it's fresh, hopefully cooler, ambient room air crossing the rads into the case and then being vented out the back.


----------



## wermad

Picked up a "little" friend for my second system to increase the amd gpu family in my TX10-D











Spoiler: Warning: old skool amd gpu's


----------



## gupsterg

*1x special offer*

Whomever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course), I will offer ROM mod service over and above what HawaiiReader does (ie multi-state VDDCI / VDDC Limit / LLC / fSW) aka "The Works".

*Only 2 conditions:-*

i) do an entry.

ii) ROM mod done after entry, within my own time constraints (which usually is not a long wait).


----------



## wermad

After last year's was a no-go, I'm no longer interested tbh. But I'm sure someone will take up the call to arms


----------



## fat4l

I will do it Gupsterg
I still owe you so....


----------



## wermad

Quote:


> Originally Posted by *fishingfanatic*
> 
> That is 1 big block !
> 
> FF


They're all about the same, but I think the piggiest of them all is the Aquacomputer. I'll check when I get home (much needed break at the beach).


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> I will do it Gupsterg
> I still owe you so....


You owe me nothing my friend.

Whatever I can do, I'm yours.


----------



## SLADEizGOD

Need some help here. I decided to water cool my card & there is only one block that Amazon has in stock, but I don't know if it's compatible with the XFX R9 295x2 8GB card.

http://www.amazon.com/Koolance-VID-AR295X2-Water-Block-Radeon/dp/B00K2DGD5G/ref=sr_1_1?ie=UTF8&qid=1459055251&sr=8-1&keywords=r9+295x2+waterblock


----------



## Mega Man

All 295x2s are reference cards, so every 295x2 fits every block made for it


----------



## SLADEizGOD

Quote:


> Originally Posted by *Mega Man*
> 
> All 295x2 are reference all 295x2s fits all blocks made


Thanks. was worried for a second.

+Rep


----------



## wermad

I have two of your cards and two of the koolance blocks.


----------



## rdr09

nvm.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> You owe me nothing my friend.
> 
> Whatever I can do, I'm yours.


Scores submitted!









http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1800_30#post_25034016


----------



## wermad

Alright, I guess I'll chip in. Maybe tonight or tomorrow. I need to get the Ares Mk.1 5870x2 tested and I need to fit the 3rd fan to the Phanteks cooler. Other than the second system, my rig is all stock (and I've yet to test that 4th core) so that will have to do for now.


----------



## wermad

So in the Fanboy comp, jpmboy mentioned my xfire scores were lower vs his single 295x2??? I only had the cpu @ 4.5 and the cards were stock. You guys have FS (regular, not extreme or ultra) numbers to compare? I'm suspecting his 5960x might be the reason his overall score was higher....?

Side note: an interesting and scary thing happened. I opted via CCC to upgrade to the latest WHQL for the competition, and after running some tests something didn't seem right. Turned on AB and no 4th core. Nothing in Device Manager or GPU-Z. I was really freaking out, as I had troubles with that core showing up before, after the switch to hardline and some down time. I was already on the new Crimson drivers and didn't even think they were an issue. I seriously thought I had lost one core and that sinking feeling kicked in (almost teared up). Before I started crying and searching for the seller's invoice, I found 15.7 in my download folder and guess what?!?! Four cores baby!!!!!!!!!!! I finished the competition runs (x4 only) with all four gpu's at stock, along with the 4.5 i5 and finicky (now fixed and running xmp 3000mhz) stock ram. Seeing the Red boys were well ahead with a few hours left, I called it a day and went on to some family functions (and buying new fish!!!) for the rest of my Thursday.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> So in the Fanboy comp, jpmboy mentioned my xfire scores were lower vs his single 295x2, ??? I only had the cpu @ 4.5 and the cards were stock. You guys have FS (regular, not extreme or ultra) numbers to compare? I'm suspecting his 5960x might be the reason his overall score was higher....?
> 
> Side note: Interesting and scary thing happened. i opted via CCC to upgrade to the latest wqhl for the competition and after running some test, something didn't seem right. Turn on AB and no 4th core
> 
> 
> 
> 
> 
> 
> 
> . Nothing in the device manager and gpuz
> 
> 
> 
> 
> 
> 
> 
> . I'm really freaking out as I had troubles with that core showing up before after the switch to hardline and some down time. I was already on the new crimson drivers and i didn't even think these were an issue. I seriously thought I lost one core and that sinking feeling kicked in (almost teared up). Before I started crying and searching for the seller for the invoice, I found 15.7 in my download folder and guess what?!?!?!?!? Four cores baby!!!!!!!!!!!1 finished the competition runs (x4 only) with all four gpu's in stock and loading, along with 4.5 i5, and finicky (now fixed and running xmp 3000mhz) stock ram. Seeing the Red boys were well ahead with a few hours left, I called it a day and went on to some family functions (and buying new fish!!!) for the rest of my thursday.


This is my Single 295x2 entry in the comp: http://www.3dmark.com/fs/7957059

Tess off of course

Did your x4 scores get added to the board? I didn't see them on there (Ah..no link with the screencaps, that's why they never got added)


----------



## wermad

Hmmm...the rules never said anything about a link. That's why I didn't add one. Oh well, it was just a waste in the end.

Tnx for the info.

I'll run some singles as well, but I'm more intrigued to compare my crossfire runs to others'.


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Hmmm...the rules never said anything about a link. That's why i didn't add them. Oh well, it was just a waste in the end
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tnx for the info
> 
> 
> 
> 
> 
> 
> 
> Ill run some singles as well but I'm more intrigued on my crossfire runs compare to others.


Wish I could give you some comparisons but unfortunately I've only got the one 295x2


----------



## fat4l

Quote:


> Originally Posted by *Sgt Bilko*
> 
> This is my Single 295x2 entry in the comp: http://www.3dmark.com/fs/7957059
> 
> Tess off of course
> 
> Did your x4 scores get added to the board? I didn't see them on there (Ah..no link with the screencaps, that's why they never got added)


@wermad
Here is my Ares 3. You can call it a 295x2 with a custom pcb.

32.6k graphics score
http://www.3dmark.com/3dm/11446172


----------



## wermad

Tnx guys.

I'll run some tests later next week, since the second system in my TX is down and some cables from the first system (my rig) had to be removed temporarily. Once everything is back together, I'll run a few tests and compare (single and xfire). One thing I did want to share (though it's too late for the fanboy comp): I got my xmp working. I was ready to ditch this board and get an x99 god or ws tbh


----------



## EpitaphUnmei

Need help with my XFX 295x2: the card's bioses are not starting. I flashed both with the original bios; the pc won't boot on the right switch, and the left switch gives me "thread stuck in device driver"?

edit

This is how I fixed it: there are two bioses, master and slave. I had flashed two master bioses, so I re-flashed with the AMD bios set.


----------



## Bytales

I have exchanged my Radeon Fury X for two Radeon 295x2s, after paying an extra 550 EUR. I am having problems starting my PC with these cards; more details here, as I have decided to create a separate thread:
http://www.overclock.net/t/1596859/having-trouble-starting-pc-with-a-295x2

Can anyone help me?
Perhaps you guys have more experience with these cards and can help me pinpoint the culprit.


----------



## SLADEizGOD

Quote:


> Originally Posted by *wermad*
> 
> I have two of your cards and two of the koolance blocks.


Did you use the same back plates for your cards, or did you buy new ones?


----------



## wermad

Stock xfx (reference):


----------



## xarot

Quote:


> Originally Posted by *Bytales*
> 
> I have exchange my Radeon Fury X for two Radeon 295x2, after paying an extra 550 EUR. I am having problem starting my PC with these cards, more details here, as i have decided to create a separate thread!
> http://www.overclock.net/t/1596859/having-trouble-starting-pc-with-a-295x2
> 
> Can anyone help me ?
> Perhaps you guys have more experience with these cards and i can pinpoint the culprit.


It seems you figured it out already, but I'm still curious: why did you make the swap? I got some videos when my 295X2 CFX was "working", and my experience with quad-crossfire at its worst is this...






and this...


----------



## Bytales

Two Radeon 295x2s are worth exactly 4 times a Fury X (overclocked at 1100). Basically I quadrupled the value of the Fury X for only 550 EUR.
Still need to make the waterloop.

I have installed the waterblocks but not the loop yet; without water it works so long as I don't stress the gpus, because if the gpus get loaded there's no way the passive blocks can carry the heat away.


----------



## xarot

Quote:


> Originally Posted by *Bytales*
> 
> 2 Radeon 295x2 are worh exactly 4 times as a fury X. (overclocked at 1100). Basically i quadrupled the value fo the fury x with only 550 EUR.
> Still Need to make the waterloop.
> 
> I have installed the waterblock, the waterloop not yet, without water it works so Long as i dont stress the gpu. Because if the gpu gets loaded, theres no way the passive block can bring the heat away.


Depends what you use them for; in games there are quite a few issues with two 295X2s. Running them passive with a waterblock only sounds crazy - hope you don't kill them.


----------



## SAFX

Looking for an XSPC 295x2 waterblock.


----------



## Mega Man

Congrats?


----------



## wermad

Quote:


> Originally Posted by *SAFX*
> 
> Looking for XSPC 295x2 waterblock,


There's a 295x2 w/ an EK block on ebay; ask the seller if he's willing to part with the block?

Koolance still sells theirs new:

http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2

Just be careful with using silver or copper-sulfate bio control, as the nickel plating can fail.

Alphacool has their setup, but it's not a true full cover block in that sense, and for the price, get the Koolance.

I have two of the Koolance blocks and the quality is good, though I'm only using distilled with no additives.


----------



## SAFX

Alphacool or Aquacool?

Nice setup, and thanks for the advice!


----------



## wermad

Aquacomputer? I haven't seen any, but try some of the European stores or hit up aquatuning.us. I saw the AC (Aquacomputer) copper/plexi listed but it had "unable", which may mean out of stock or back-ordered. Try hardforum's for-sale section; that market is huge and you may see one there. Of the two readily available, get the Koolance one. I feel the Alphacool may be just a tad better than stock, as it relies on external cooling for everything but the two cores.





I wanted to mention to all owners: I've noticed these cards are appreciating. I'm very tempted to sell both of mine and actually break even on my initial investment from over a year ago. One thing to keep in mind: don't use other members' pictures. I found a guy on ebay using several of my pictures from my build to sell his cards with blocks. I should watermark them tbh, but still, I shot him a message to remove them and take a pic of the actual items for sale. It's not the first time I've caught ebay sellers using my pictures (and I'm sure I'm not alone).

edit: here are the other blocks btw if anyone is interested:

Aquacomputer:






EK:





XSPC:



it's a shame Heatkiller didn't make one for this beast


----------



## SLADEizGOD

Hi guys. I'm running into a problem with my 295x2. I bought a new monitor to try out - the Acer 27in freesync at 144hz - and a DisplayPort to mini DisplayPort cable to get things rolling. For some reason it's only allowing me to use 4096 on GTA5. I've tested my R9 290x & it's fine, but my 295x2 is only running on 1 GPU. Would this be the cable? I'm lost.


----------



## wermad

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Hi guys. I'm running into a problem with my 295x2. I bought a new monitor & wanted to try out so I bought the Acer 27in freesync at 144hz.. I bought an display port to mini display port to get things rolling. And for some reason it only allowing me to use 4096 on GTA5. I've tested my R9 290x & its fine. But my 295x2 is only running on 1 GPU. Would this be the wire? I'm lost.


That seems perfectly normal to me. Keep in mind, the 295x2 is two 4GB 290Xs, and it does *not* stack the two banks of 4gb of vram. So essentially you only have 4gb of "effective" memory. Your reading is correct, but I would be more concerned with gpu load. Turn on Afterburner, detach the monitoring screen, enlarge it, and make sure you have at least gpu usage and vram for *both* cores. Run benchmarks and some games and see what you get. If only one core shows usage, check your settings, such as crossfire profiles and/or ulps. Try a few different games and see what happens. Tbh, 4gb maxed out using 120+hz and 2k is not surprising. Take some screenshots of AB monitoring if you still have concerns and post em here


----------



## SLADEizGOD

Quote:


> Originally Posted by *wermad*
> 
> That's seem perfectly normal to me. Keep in mind, the 295x2 is two 4GB 290X, which in turn does *not* stack the two banks of 4gb of vram. So, essentially you only have 4gb of "effective" memory. Your reading is correct, but I would be more concerned with gpu load. Turn on Afterburner, detach the monitoring screen, enlarge it, make sure you have at least gpu usage and vram for *both* cores. Run benchmarks and some games see what you get. If you only have one core showing usage, check your settings such as crossfire profiles and/or ulps. Try a few different games and see what happens. Tbh, 4gb maxed out using 120+hz and 2k, its no surprising. Take some screenshots of AB monitoring if you still have some concerns and post em here


I'm going to have to download MSI AB when I get home from work. I can tell you that I jumped on GTA5 and went into the graphics settings; that's where it only allows 4096 MB of memory. Also it's not saying it's linked anymore in Crimson.


----------



## wermad

GTA V is known to peg gpu's with 4gb when running 4k or 120hz 2k; I've seen it (and Shadow of Mordor) used for vram testing. From the reviews, 4GB is still plenty to get good frame rates. Even the 390/390X reviews say stick with 290/290X Hawaii if you have one already, as the extra vram won't make much of a difference. Take note of your frames, as that's more important tbh. If you're getting under 60fps (or 90fps for freesync), try toning down the eyecandy. The only other options would be a 290X 8gb (rare), 390/390X 8gb, Titan 6gb, 980 Ti 6gb, or Titan X. I would recommend the Ti, but in terms of raw power, even in 4k, the 295x2 is much better (and they're appreciating in value).


----------



## SLADEizGOD

Quote:


> Originally Posted by *wermad*
> 
> GTA V is known to peg gpu's with 4gb when running 4k or 120hz 2k. I've seen it (and shadow of mordor) used for vram testing. From these reviews, 4GB is still plenty to get good frame rates. Even the 390/390X reviews say stick with 290/290X Hawaii if you have one already, as the vram won't make much of a difference. Take note of your frames as that's more important tbh. If you're getting under 60hz (or 90hz for freesync), try toning down the eyecandy. The only other options would be a 290X 8gb (rare), 390/390X 8gb, Titan 6gb, 980 Ti 6gb, or Titan X. I would recommend the Ti, but in terms of raw power, even in 4k, the 295x2 is much better (and they're appreciating in value).


The frame rate is still good, but it bothers me just seeing 4gb on my 295x2. I'd add my 290x to get another 4GB, but I can deal with it. I will make sure to run Afterburner just in case.


----------



## Sgt Bilko

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> GTA V is known to peg gpu's with 4gb when running 4k or 120hz 2k. I've seen it (and shadow of mordor) used for vram testing. From these reviews, 4GB is still plenty to get good frame rates. Even the 390/390X reviews say stick with 290/290X Hawaii if you have one already, as the vram won't make much of a difference. Take note of your frames as that's more important tbh. If you're getting under 60hz (or 90hz for freesync), try toning down the eyecandy. The only other options would be a 290X 8gb (rare), 390/390X 8gb, Titan 6gb, 980 Ti 6gb, or Titan X. I would recommend the Ti, but in terms of raw power, even in 4k, the 295x2 is much better (and they're appreciating in value).
> 
> 
> 
> The frame rate is still good. But bothers me just seeing 4gb on my 295x2. I've add my 290x to get another 4Gb. but I can deal with it. I will try to make sure & run Afterburner just in case.
Click to expand...

Vram doesn't pool together or stack; it's mirrored, so you'll only ever have 4GB of vram to use.


----------



## wermad

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Hi guys. I'm running into a problem with my 295x2. I bought a new monitor & wanted to try out so I bought the Acer 27in freesync at 144hz.. I bought an display port to mini display port to get things rolling. And for some reason it only allowing me to use 4096 on GTA5. I've tested my R9 290x & its fine. But my 295x2 is only running on 1 GPU. Would this be the wire? I'm lost.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> That's seem perfectly normal to me. Keep in mind, the 295x2 is two 4GB 290X, which in turn does *not* stack the two banks of 4gb of vram. So, essentially you only have 4gb of "effective" memory. Your reading is correct, but I would be more concerned with gpu load. Turn on Afterburner, detach the monitoring screen, enlarge it, make sure you have at least gpu usage and vram for *both* cores. Run benchmarks and some games see what you get. If you only have one core showing usage, check your settings such as crossfire profiles and/or ulps. Try a few different games and see what happens. Tbh, 4gb maxed out using 120+hz and 2k, its no surprising. Take some screenshots of AB monitoring if you still have some concerns and post em here
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *SLADEizGOD*
> 
> I've going to have to download MSI AB when I get home from work. I can tell you that I jumped on GTA5 and went into the graphics setting. thats where it only alloew 4096 Gb of memory. Also it's not saying it's linked anymore on crimson.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> GTA V is known to peg gpu's with 4gb when running 4k or 120hz 2k. I've seen it (and shadow of mordor) used for vram testing. From these reviews, 4GB is still plenty to get good frame rates. Even the 390/390X reviews say stick with 290/290X Hawaii if you have one already, as the vram won't make much of a difference. Take note of your frames as that's more important tbh. If you're getting under 60hz (or 90hz for freesync), try toning down the eyecandy. The only other options would be a 290X 8gb (rare), 390/390X 8gb, Titan 6gb, 980 Ti 6gb, or Titan X. I would recommend the Ti, but in terms of raw power, even in 4k, the 295x2 is much better (and they're appreciating in value).
> Quote:
> 
> 
> 
> Originally Posted by *SLADEizGOD*
> 
> The frame rate is still good. But bothers me just seeing 4gb on my 295x2. I've add my 290x to get another 4Gb. but I can deal with it. I will try to make sure & run Afterburner just in case.
> 
> 
> 
> 
> 
> Click to expand...
> 
> 
> 
> Click to expand...
> 
> 
> 
> Click to expand...
Click to expand...

Unfortunately, vram doesn't stack, as I mentioned earlier, no matter how many GPUs you throw in there. I have crossfire 295x2s and it doesn't mean I have 16GB of vram; I still have 4GB, just like a single 290/290X. Vram is one of those things that gets too much attention solely because a few games consume more than the status quo. At your level of resolution and refresh, 4GB is pretty healthy. Vram is a tricky thing to understand as well. Unless you have 2GB and you're trying to push your setup, I don't see it as a problem. Check out the 8GB 390X review (the 390/390X is a slightly faster 290/290X Hawaii core); most will agree the 8GB is not enough of an upgrade to jump on right now. Very few games need more than 4GB, and if they do, you generally don't see much of an impact if the gpu is pretty beefy like Hawaii.

Btw, all dual-gpu cards are generally marketed with twice the vram of each core. A Titan Z doesn't have 12GB but 6GB of effective vram; the 295x2 has 4GB, not the advertised 8GB; the 7990 has 3GB effective, not 6GB; the GTX 690 has 2GB per core, not the 4GB that's on the box; the 6990 again the same, 2GB per core, not the 4GB stamped on the specs/box; and even my mighty Ares Mk1 only has 2GB per core, not the 4GB Asus markets it as (but still more than the 5970's 1GB per core, marketed as 2GB).

One last thing, this concern is not new, and jumping on higher-vram cards may yield you little to no change. I was freaking out too back then and switched from 1.5GB GTX 480s to 3GB 580s. Crysis 2 was one of the most demanding games back then, and in quads I didn't break 2GB in Surround 1080. Don't feel left behind, as the 295x2's raw power will still beat most of the top-tier cards today, even the mighty Titan X with 12GB. There were a couple of members who tested 4k Eyefinity and Surround (3x 4k monitors), and the Nvidia system peaked vram ~5GB using quad Titan BEs.
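The mirroring rule above can be sketched in a few lines (card figures are from the post; the helper function is illustrative, not a driver query):

```python
# In AFR CrossFire/SLI each GPU mirrors the full frame data, so the
# usable pool equals the per-core VRAM, not the marketed total.
def effective_vram(per_core_gb, num_cores):
    marketed = per_core_gb * num_cores  # what the box advertises
    usable = per_core_gb                # what games can actually address
    return marketed, usable

for name, per_core, cores in [("R9 295X2", 4, 2), ("Titan Z", 6, 2), ("GTX 690", 2, 2)]:
    marketed, usable = effective_vram(per_core, cores)
    print(f"{name}: marketed {marketed}GB, effective {usable}GB")
```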


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SLADEizGOD*
> 
> Hi guys. I'm running into a problem with my 295x2. I bought a new monitor & wanted to try out so I bought the Acer 27in freesync at 144hz.. I bought an display port to mini display port to get things rolling. And for some reason it only allowing me to use 4096 on GTA5. I've tested my R9 290x & its fine. But my 295x2 is only running on 1 GPU. Would this be the wire? I'm lost.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> That's seem perfectly normal to me. Keep in mind, the 295x2 is two 4GB 290X, which in turn does *not* stack the two banks of 4gb of vram. So, essentially you only have 4gb of "effective" memory. Your reading is correct, but I would be more concerned with gpu load. Turn on Afterburner, detach the monitoring screen, enlarge it, make sure you have at least gpu usage and vram for *both* cores. Run benchmarks and some games see what you get. If you only have one core showing usage, check your settings such as crossfire profiles and/or ulps. Try a few different games and see what happens. Tbh, 4gb maxed out using 120+hz and 2k, its no surprising. Take some screenshots of AB monitoring if you still have some concerns and post em here
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *SLADEizGOD*
> 
> I've going to have to download MSI AB when I get home from work. I can tell you that I jumped on GTA5 and went into the graphics setting. thats where it only alloew 4096 Gb of memory. Also it's not saying it's linked anymore on crimson.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> GTA V is known to peg gpu's with 4gb when running 4k or 120hz 2k. I've seen it (and shadow of mordor) used for vram testing. From these reviews, 4GB is still plenty to get good frame rates. Even the 390/390X reviews say stick with 290/290X Hawaii if you have one already, as the vram won't make much of a difference. Take note of your frames as that's more important tbh. If you're getting under 60hz (or 90hz for freesync), try toning down the eyecandy. The only other options would be a 290X 8gb (rare), 390/390X 8gb, Titan 6gb, 980 Ti 6gb, or Titan X. I would recommend the Ti, but in terms of raw power, even in 4k, the 295x2 is much better (and they're appreciating in value).
> Quote:
> 
> 
> 
> Originally Posted by *SLADEizGOD*
> 
> The frame rate is still good. But bothers me just seeing 4gb on my 295x2. I've add my 290x to get another 4Gb. but I can deal with it. I will try to make sure & run Afterburner just in case.
> 
> 
> 
> 
> 
> Click to expand...
> 
> 
> 
> Click to expand...
> 
> 
> 
> Click to expand...
> 
> 
> 
> Click to expand...
> 
> Unfortunately, vram doesn't stack as I mentioned earlier, no matter how many gpu's you throw in there. I have crossfire 295x2 and it doesn't mean i have 16gb of vram. i still have 4gb just like a single 290/290X. Vram is one of those things that gets too much attention solely because a few games consume more then the status quo. At your level of resolution and refresh, 4GB is pretty healthy. Vram is a tricky thing to understand as well. Unless you have 2gb and you're trying to push your setup, i don't see it as a problem. Check out the 8GB 390X review (the 390/390X is a slightly faster 290/290X Hawaii core); most will agree the 8GB is not enough of an upgrade to jump on it right now. Very few games need more then 4GB and if they do, you generally don't have much of an impact if the gpu is pretty beefy like Hawaii.
> 
> Btw, all dual gpu core cards are generally marketed with twice the vram of each core. A Titan Z doens't have 12GB but 6GB of effective vram; 295x2 has 4gb not the advertised 8GB, 7990 has 3gb effective and not 6GB, GTX 690 has 2gb per core and not 4gb that's on the box/name, 6990 again the same with 2gb per core not 4gb stamped on the specs/box, and even my might Ares Mk1 only has 2GB per core and not the 4GB Asus markets it (but still more then the 5970 1GB per core, marketed as 2gb).
> 
> One last thing, this concern is not new and jumping on the higher vram cards may yield you little to no change. I was too freaking out back then and switched from 1.5gb GTX 480s to 3GB 580s. Crysis 2 was the one of the most demanding games back then and in quads, I didn't break 2GB in Surround 1080. Don't feel left behind as the 295x2's raw power will still over most of the top tier cards today, even the mighty Titan X with 12GB. There were a couple of members who tested 4k Eyefinity and Surround (3x 4k monitors), and the Nvidia system peaked vram ~5GB using quad Titan BEs.
Click to expand...

^ Everything he said....


----------



## Its L0G4N

I'm getting these red lines during boot; it gets to the Windows 7 boot animation and then BSODs. Anyone have an idea?

I've taken out the cards, cleaned them, checked the cords, and it's not the screens...


----------



## wermad

Quote:


> Originally Posted by *Its L0G4N*
> 
> 
> I'm having these red lines appear during boot and then gets to the windows 7 boot animations then BSODs. Anyone have an idea?
> 
> Taken out then cards, cleaned them checked the cords and it's not the screens...


Try your mobo's iGPU, check your cables, and/or, if you're using mini DP, try a different port on your 295x2. I get these lines using an HDMI to mini DP adapter, though no BSOD. Could be your card, but test the simple stuff first.


----------



## Its L0G4N

Quote:


> Originally Posted by *wermad*
> 
> Try your mobo's igpu, check your cables, and/or if you're using mini dp try a different port on your 295x2. I get these lines using an hdmi to mini dp adapter, though no bsod. Could be your card but test the simple stuff first.


I did, and it boots up fine. Cables are fine... the center monitor is mini DP to HDMI, but the one to the right is just straight DVI to DVI.


----------



## Alex132

Sounds like a dying GPU to me. Probably from overvoltage/overclocking/lots of heat.

One of my 5870s did something similar before it died.


----------



## Its L0G4N

Quote:


> Originally Posted by *Alex132*
> 
> Sounds like a dying GPU to me. Probably from overvoltage/overclocking/lots of heat.
> 
> My one 5870 did something similar before it died


I have never overclocked this card though... F**K ME! I got this card on sale for $660, and now they are going for between $1,600 and $2,000.


----------



## Alex132

I paid around $450 for mine.

I wouldn't say it's dead; just try all the variables first.

1) Different monitor
2) Different cable + different monitor
3) Reinstall drivers
4) Entirely different PC
5) etc.


----------



## Its L0G4N

Quote:


> Originally Posted by *Alex132*
> 
> I payed around $450 for mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wouldn't say it's dead, just try all the variables first.
> 
> 1) Different monitor
> 2) Different cable + different monitor
> 3) Reinstall drivers
> 4) Entirely different PC
> 5) etc.


Basically robbed them!

I have three other monitors, not shown in that photo, that I tried it all on with different cables/adapters, etc. I doubt drivers would be an issue at boot, but who knows. I may try the other-PC approach.


----------



## Alex132

Trying another PC+monitors is the fastest way to pin-point that it's the graphics card itself and nothing else.


----------



## wermad

As I mentioned before, these guys are appreciating. I'm seeing auctions ending at $700+ and greedy sellers with buy-it-now at ~$1000-2500. I tried talking the boss into selling these and getting some Furys (not X), but she said no.


----------



## Its L0G4N

Quote:


> Originally Posted by *wermad*
> 
> As I mentioned before, these guys are appreciating. I'm seeing auctions ending $700+ and greedy sellers w/ buy-it-now ~$1000-2500. I tried talking the boss into selling these and getting some Fury's (not X), but she said no
> 
> 
> 
> 
> 
> 
> 
> .


And here I was, hoping they would get cheaper or go on sale again so I could go quad-fire!


----------



## SLK

I guess I sold mine at the wrong time. Jeez...


----------



## fat4l

I'm thinking about selling my card now and going over to the green team.

Asus Ares III


----------



## Agent Smith1984

Crazy, because I almost pulled the trigger a few times on some 295s going for $450-500 about a year ago; now you can't get one for less than $700 online. Slow performance improvements in GPUs, along with rebranding of old GPUs, cause stuff like this to hold its value really well.

Good job to anyone who "hung in there" cause these cards still represent awesome "single card" performance.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Crazy cause I almost pulled the trigger a few times on some 295's going for $450-500 about a year ago, now you can't get one for less than $700 online.... slow performance improvements in GPU, along with rebranding of old GPU's causes stuff like this to hold it's value really well.
> 
> Good job to anyone who "hung in there" cause these cards still represent awesome "single card" performance.


I'll say it now..........I am NEVER selling mine


----------



## wermad

Don't forget, eBay charges ~12% and PayPal ~2%, so you're gonna have to cough up ~14-15% (not including shipping). So if your card sells for $700 USD, you're only getting ~$600 back. Which ain't bad, considering I paid $600 for a month-old, barely used card last year (February) and $650 for a BNIB unit a week later. I would imagine a new, wrapped unit would fetch towards $1000. Still, remember guys, MSRP @ launch was $1500 USD.

edit: too bad EK is not making Lightning Ti blocks, otherwise you would have seen my cards up on eBay for sale. BP makes some, but there are no Lightnings unless I fork out $1k. The LEs are available, but for that price the Amp! Extreme is a better choice, and BP also makes a block for it. Sucks that 3/4-way SLI scales poorly with Maxwell. I'll probably just stay put for now....
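The fee math above works out to a quick calculation (the ~12% eBay and ~2% PayPal rates are the post's estimates; the function name is just for illustration):

```python
def net_proceeds(sale_price, ebay_fee=0.12, paypal_fee=0.02, shipping=0.0):
    """Rough net payout after marketplace fees and optional shipping cost."""
    return sale_price * (1 - ebay_fee - paypal_fee) - shipping

# A $700 sale nets roughly $600 before shipping, matching the estimate above.
print(round(net_proceeds(700)))
```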


----------



## Its L0G4N

These are the other two things that appear on my screen.


----------



## wermad

Try the second BIOS switch; if that doesn't work, see if you have recourse for an RMA/return. Good luck, and it sucks you got a bad one.


----------



## fayzaan

Hey Guys,

I just recently purchased an R9 295x2 card. It works great, but... I noticed there is a 2-pin cable running from the radiator. What the heck is that for?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fayzaan*
> 
> Hey Guys,
> 
> I just recently purchase a R9 295x2 card. Works great, but...I noticed there is a 2 pin cable running from the radiator, what the heck is that for?


You sure you don't mean the fan header?

If you could post a picture might be able to get a better idea about what you mean


----------



## fayzaan

Sorry, not home right now, but... this one shows it.

Lol, I get it now, so it's the cable to connect the fan. I didn't get a fan with it, so I attached the radiator to one of the case fans. But I guess if I connect the fan to that connector on the card, the card will regulate the speed and such?

In fact, that is my next question: since I don't have the stock fan, what fans should I get for the radiator? I'll probably do a push/pull config. I don't mind noise much; in fact, if I can get something that drops the temps enough for some overclocking, that would be a treat.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fayzaan*
> 
> Sorry not home right now but...this one shows it
> 
> 
> 
> Lol I get it now, so its the cable to connect the fan. I didn't get a fan with it, so I attached the radiator to one of the case fans..but I guess if I connect the fan to that connector on the card, the card will regulate the speed and such?
> 
> In fact that is my next question...since I don't have the stock fan, what fan's should I get for the radiator? probably will do push/pull config. I don't mind noise much, in fact if I can get something that drops the temps for some overclocking, that would be a treat
> 
> 
> 
> 
> 
> 
> 
> .


Ahh ok then

Yeah, that's the 3-pin fan cable; it doesn't have PWM control, so most of us use motherboard headers or fan controllers.

As for the fans, I use Noctua NF-F12 industrials, but some use Corsair SP120s, Cougar Vortexes, etc.




^ here's a video on a few different types


----------



## wermad

Quote:


> Originally Posted by *fayzaan*
> 
> Hey Guys,
> 
> I just recently purchase a R9 295x2 card. Works great, but...I noticed there is a 2 pin cable running from the radiator, what the heck is that for?
> Quote:
> 
> 
> 
> Originally Posted by *fayzaan*
> 
> Sorry not home right now but...this one shows it
> 
> 
> 
> Lol I get it now, so its the cable to connect the fan. I didn't get a fan with it, so I attached the radiator to one of the case fans..but I guess if I connect the fan to that connector on the card, the card will regulate the speed and such?
> 
> In fact that is my next question...since I don't have the stock fan, what fan's should I get for the radiator? probably will do push/pull config. I don't mind noise much, in fact if I can get something that drops the temps for some overclocking, that would be a treat
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

Before you buy another fan, what case and how are you setting up the radiator?


----------



## fayzaan

https://www.nzxt.com/products/noctis-450-black-red-mid-tower-computer-gaming-case-ca-n450w-m1

The Noctis 450 is the case I am using. Currently I have the radiator mounted to one of the case fans in the front of the case, so the fan pulls in air and cools the radiator, and there's lots of heat inside the case because of it... I think I should just mount it to the back of the case so the hot air goes out instead of IN to the case. It's just that the fan on the back of the case is too big, so I'll need to buy a 120mm.


----------



## wermad

Quote:


> Originally Posted by *fayzaan*
> 
> https://www.nzxt.com/products/noctis-450-black-red-mid-tower-computer-gaming-case-ca-n450w-m1
> 
> Noctis 450 is the case I am using. And currently I have the radiator mounted to 1 of the case fans in the front of the case. So fan pulls in air and cools radiator, and theres lots of heat inside the case because of it...I think I should just mount it to the back of the case, so that way the hot air is going out instead of IN to the case. Its just the fan on the back of the case is too big, will need to buy a 120mm.


What do you have cooling your cpu? aio or hsf?


----------



## fayzaan

for now just HSF


----------



## wermad

Ok, gotcha. Try setting up the front fans as intake and top-mounting the radiator to exhaust (set your fan in pull). Also, turn your hsf to pull air from the bottom to the top so that its airflow direction is similar to the radiator's. Since your case can do a 360 on top, mount the radiator on the forward-most top 120mm mount so there's space for your hsf to push air out. You can add a fourth rear intake fan to feed your case even more if you wish. I wouldn't add fans for exhaust on the top, as you want to try to achieve positive air pressure (more air pushed into the case than exhausted). Give that a try, and if your temps don't improve, we have other suggestions.

Edit:

I made a little illustration for you. Hope this helps; report back what you find.
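The positive-pressure rule of thumb can be sanity-checked with a quick airflow balance (all CFM figures below are hypothetical examples, not measurements of any specific fan):

```python
def pressure_margin(intake_cfm, exhaust_cfm):
    """Positive margin roughly means positive case pressure (more air in than out)."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# e.g. three front intakes plus a rear intake vs. one top radiator exhaust
margin = pressure_margin(intake_cfm=[52, 52, 52, 52], exhaust_cfm=[57])
print("positive" if margin > 0 else "negative", f"pressure ({margin} CFM margin)")
```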


----------



## Mega Man

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fayzaan*
> 
> Sorry not home right now but...this one shows it
> 
> 
> 
> Lol I get it now, so its the cable to connect the fan. I didn't get a fan with it, so I attached the radiator to one of the case fans..but I guess if I connect the fan to that connector on the card, the card will regulate the speed and such?
> 
> In fact that is my next question...since I don't have the stock fan, what fan's should I get for the radiator? probably will do push/pull config. I don't mind noise much, in fact if I can get something that drops the temps for some overclocking, that would be a treat
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> Ahh ok then
> 
> Yeah that's the 3 pin fan cable, it doesn't have PWM control so most of us use motherboard headers or fan controllers.
> 
> as for the fans I use Noctua NF-F12 Industrials but some use Corsair SP120's, Cougar Vortex's etc etc
> 
> 
> 
> 
> ^ here's a video on a few different types
Click to expand...

Eh, I think you mean , most use full water blocks


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fayzaan*
> 
> Sorry not home right now but...this one shows it
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Lol I get it now, so its the cable to connect the fan. I didn't get a fan with it, so I attached the radiator to one of the case fans..but I guess if I connect the fan to that connector on the card, the card will regulate the speed and such?
> 
> In fact that is my next question...since I don't have the stock fan, what fan's should I get for the radiator? probably will do push/pull config. I don't mind noise much, in fact if I can get something that drops the temps for some overclocking, that would be a treat
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> Ahh ok then
> 
> Yeah that's the 3 pin fan cable, it doesn't have PWM control so most of us use motherboard headers or fan controllers.
> 
> as for the fans I use Noctua NF-F12 Industrials but some use Corsair SP120's, Cougar Vortex's etc etc
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ^ here's a video on a few different types
> 
> Click to expand...
> 
> Eh, I think you mean , most use full water blocks
Click to expand...

You know what I meant......


----------



## Mega Man

I know nothing!


----------



## wermad

$700+ on eBay... maybe it's time for three Zotac 980 Ti Amp! Extremes (1300MHz factory boost OC). BP makes a block, but it's clear/nickel; I'm hoping they do a "black ice" version. Still mulling it over.

edit: holy crap! One sold for $810... good lordy, this is hyper tempting....


----------



## fayzaan

Well, I was really lucky; I got one locally for $450 Canadian, and it's basically brand-spanking new. He had it in a demo workstation.


----------



## Sgt Bilko

Hell of a deal, you practically stole it


----------



## wermad

They were going for under $500 just a few months ago. My guess is that even after some driver maturity, this card still spanks the Titan X, 980 Ti, and Fury X; it was selling new for about the same, and used was less. Still great value considering how badly these new cards scale beyond two cores. I'm really tempted, since $800 should translate to above $700 after eBay/PayPal and shipping fees. God... it's got my head working overtime... Two Amp! Extremes sold for ~$600 used, so I should be able to break even on two, and if my monitor and other stuff sells, I should be able to get enough for the 3rd Zotac. It's so tempting....


----------



## fayzaan

So, for the time being I moved the radiator to the back of the case, and it seems to be doing better temps-wise... but the card still gradually gets to 74C and then starts throttling the clocks. It gets to the point where games start lagging. Any suggestions? Do you think getting new fans will have enough of an impact to stabilize the temps, or does my setup need some other improvements?

I think part of it is probably due to the cpu fan blowing out hot air, which then affects the radiator because it pushes hot air against it lol..


----------



## wermad

Quote:


> Originally Posted by *fayzaan*
> 
> so, for the time being I moved the radiator to the back of the case, seems to be doing better temps wise...but still the card gradually gets to 74c and then starts throttling the clocks. It gets to the point where games start lagging
> 
> 
> 
> 
> 
> 
> 
> . Any suggestions? do you think getting new fans will have enough of an impact to stabilize the temps, does my set up need some other improvements?
> 
> I think part of it is probably due to the cpu fan blowing out hot air, which then affects the radiator cuz it starts to push hot air against the it lol..


Think you missed my post:
Quote:


> Originally Posted by *wermad*
> 
> ok, gotcha, try setting up the front fans as intake and top mount the radiator to exhaust (set your fan in pull). Also,turn your hsf to pull air from the bottom to the top so that way, its airflow direction is similar to the radiator. Since your case can do a 360 on top, mount the radiator on the forward most top 120mm mount so there's space for your hsf to push air out. You can add a fourth rear intake fan to feed your case even more if you wish. I wouldn't add fans for exhaust on the top as you wanna try to achieve positive air pressure (more air pushed in the case than exhausted). Give that a try and if your temps don't improve, we have other suggestions.
> 
> Edit:
> 
> I made a little illustration for you. Hope this helps, report back what you find


You can always hit "go to my last post" in your subscriptions. It will take you to your last post (as in the name), and you can catch up on posts after yours. It helps so you don't skip any replies or info that might be pertinent.


----------



## fayzaan

Yea, sorry, I did miss it. But thank you for all that effort. Looks like a great idea; I will give it a shot and post the results.


----------



## fayzaan

So, I tried what you suggested, but there was one issue.

The CPU HSF doesn't allow me to change its position to push air upwards. So I left the rear fan pushing air out and the cpu hsf pushing towards the rear, as it was, but changed the position of the video card's radiator to push/pull upwards.

This configuration seemed to help a lot. The temperature seems to be steady at 73C, but when I look at the gpu clocks, they still fluctuate from around 800MHz to 1015MHz.


----------



## fayzaan

Here's an illustration of what I have done. Going to see what else I can do to help.

Also noticed: idle temps for the card are 41/42C (gpu2/gpu1), and my CPU is idling at 33-35C.

I looked online, and I see CPUs idling at 21-25C and video cards idling at 32-35C. So clearly something's up.


----------



## wermad

Your case does resemble another NZXT case a member was complaining about for high temps as well. I think it was the 340, and apparently he settled on switching cases. In the meantime, iirc, he left the door panel off.

Which CPU cooler do you have? Most will allow you to rotate it; even the venerable H212 works rotated.


----------



## fayzaan

Yea, that's disappointing considering how expensive this case was :/ oh well. I will do some more research and see what else I can do. Thanks for all your help.


----------



## fayzaan

So, I bought two Corsair SP120 High Performance edition fans. They helped keep the temps at 72C. Then I did as you suggested and removed the side, front, and top panels.


----------



## wermad

Those things are loud at full blast. I only run my 65 at full when benching. For the majority of the time, including gaming, they all sit @ ~4.8v (40% of the fan controller). On the occasional super-hot and humid SoCal nights, I will bump them up a bit to ~6-7v and use my headphones.

Look into the Corsair cases. A lot of cases weren't really meant to hold such a beast as the 295x2 and its stock cooler. Also, keep in mind your ambient. If it's a hot day, crack a window open (or turn your home AC on if you have any) and open the side panel. It will help your card. If you can, look for some high-CFM fans for intake if your current stock ones aren't cutting it. Corsair is a safe choice, so look into the AF120s.

Ultimately, going full custom water will help dramatically, but it's an expensive endeavor. It's perplexing why AMD decided on such a low thermal limit for the 295x2 (vs the 99°C on the 290X reference). It's something all owners on stock have to deal with, unfortunately.


----------



## fat4l

I'm now selling my card, the Asus Ares III, which is technically a 295X2 with a custom PCB and water cooling.
There are just two cards on ebay.co.uk now... I hope I sell mine for good money.

What's going on with this price increase?


----------



## wermad

Lots of factors, but it's hard to pinpoint one or two primary reasons. You have the limited availability of these cards. The 295x2 is still 5-10% faster than the Titan X, Ti, Fury X, & Fury. One card can run 4k with ease (I know first hand), and it's already got a water cooler. Good luck man, and thanks for all your help!

I'm leaning toward keeping my cards. After some research, I came to the conclusion I'd just be side-grading and spending a bit more. It's very disappointing that the Zotac Extreme has issues (and no acetal or "black ice" blocks from BP), I can't find a Lightning for ~$600 USD, and reference cards are holding ~$500-600. I may wait and see if I can find three (or four) Lightnings (non-LEs) under $500 in the next six to twelve months.


----------



## fat4l

Quote:


> Originally Posted by *wermad*
> 
> Lots of factors but its hard to pin point one or two primary reasons. You have the limited availability of these cards. The 295x2 is still 5-10% faster then Titan X, Ti, Fury X, & Fury. One card can run 4k with ease (I know first hand) and its already got a water cooler. good luck man, and tnx for all your help!


Last time I checked my Ares III OC vs a Fury X OC, I got 35-45% more performance over the Fury X. I'm still not happy with CF + FreeSync support though, thus selling the card...


----------



## Its L0G4N

Guys, talk me out of selling my 295x2 and going with two Fury Xs. I'll be getting it back from XFX RMA in a couple of weeks.


----------



## Feyris

Quote:


> Originally Posted by *Its L0G4N*
> 
> Guys, talk me out of selling my 295x2 and going with two Fury Xs. I'll be getting it back from XFX RMA in a coupe of weeks.


Keep it and wait for Polaris, it's that easy.


----------



## fat4l

Quote:


> Originally Posted by *Feyris*
> 
> Keep it and wait for Polaris, its that easy.


Polaris will be ****. AMD is aiming for performance/watt, not max performance.

Polaris 11 will be a low-end card with super-low power consumption.
Polaris 10 will be a mid-range card with performance ~Fury X at best.
Polaris 12 should be >Fury X, but... that's not coming now.


----------



## wermad

A leaked (and/or fake) roadmap shows no 8GB HBM2 for the next gen. I hope I'm wrong... like with Fury X.


----------



## Its L0G4N

I'm hoping I could get close to $800, if not more, on eBay for the 295x2, and that would set me up nicely for the Fury Xs. That's all I'm thinking.


----------



## fat4l

Quote:


> Originally Posted by *wermad*
> 
> leaked(and/or fake) roadmap shows no hbm2 8gb for the next gen. I hope I'm wrong...like Fury X.


HBM2 is 2017...
Polaris 11 will use GDDR5.
Polaris 10 will most probably use GDDR5 too, and if not, then HBM1, as GDDR5X isn't available yet.


----------



## wermad

Quote:


> Originally Posted by *Its L0G4N*
> 
> i'm hoping i could get close to $800 if not more on eBay for the 295x2 and that would set me up nicely for the Fury X's. That all i'm thinking.


Don't forget, eBay and PayPal want their cut, so you're gonna cough up ~15% of the sale price. So let's say your card sells for $800: you'll only get ~$680. Now you can sell it on the forums for ~$700 and keep ~$680 as your bottom line. Also, don't forget to factor in shipping. There's one right now for $850, but that one is only a few months old. My guess: my 1.25 y/o cards can be close to $700, so that's ~$600 net. I may go with two Zotac Amp! Extremes, as 3-way scaling is hit or miss with the 980 Ti. Ugh, makes it a tough call...
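That fee math can be sketched in a few lines; the ~15% combined eBay/PayPal cut is the poster's ballpark figure, not an exact fee schedule:

```python
# Net proceeds after marketplace fees, using the rough numbers from the
# post above: ~15% combined eBay/PayPal cut (a ballpark assumption),
# plus whatever shipping cost you absorb.
def net_proceeds(sale_price: float, fee_rate: float = 0.15, shipping: float = 0.0) -> float:
    """Return the seller's take-home after fees and shipping."""
    return sale_price * (1 - fee_rate) - shipping

print(round(net_proceeds(800)))              # -> 680
print(round(net_proceeds(700, fee_rate=0)))  # -> 700, i.e. a fee-free forum sale
```

Which is why a ~$700 forum sale can net the same as an $800 eBay sale.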








Quote:


> Originally Posted by *fat4l*
> 
> HBM2 is 2017...
> Polaris 11 will use GDDR5.
> Polaris 10 will most probably use GDDR5 too, and if not then HBM1, as GDDR5X is not available yet.


Man, if HBM2 is not ready this year, more reason to switch to Nvidia. Most ppl tell me to wait for Pascal and hope there's a price drop on Maxwell. But having had Nvidia in the past, they tend to hold their value much longer or depreciate slower vs AMD, even after a new-gen launch. For me, the 970 and 980 will drop, but the Ti's direct replacement won't come until later, so it may stay where it is now.


----------



## fat4l

Quote:


> Originally Posted by *wermad*
> 
> Man, if HBM2 is not ready this year, more reason to switch to Nvidia. Most ppl tell me to wait for Pascal and hope there's a price drop on Maxwell. But having had Nvidia in the past, they tend to hold their value much longer or depreciate slower vs AMD, even after a new-gen launch. For me, the 970 and 980 will drop, but the Ti's direct replacement won't come until later, so it may stay where it is now.


That's exactly what I'm gonna do. Selling the card, selling the monitor (it's FreeSync) and getting a G-Sync monitor + Pascal.

Also, you are correct that the "full" Pascal that should feature HBM2 won't come this early, as the 16nm FinFET process is very new, so Nvidia won't be able to produce a "full" card any time soon due to wafer defects.
What is coming now is a replacement for the 980 (first) and the 970 (a bit later). The replacement for the 980, aka the 1080, will be stronger than the 980 Ti, but should still feature GDDR5 only. However, I don't see a reason not to buy Nvidia. I'm talking about "June" time.


----------



## wermad

Yeah, that's the thing that worries me: whether the Pascal 1080 will post up better than the Maxwell Ti. Still, I don't expect prices will drop, as that's not as common as AMD's price drops and quicker depreciation (cept' 295x2 powah!). Probably why I'm leaning towards getting three vanilla Ti's and skipping the uber Ti's like the Lightning and Amp! Extreme.


----------



## Mega Man

even with amd not focusing on hard-hitting power they are so far ahead of nvidia i have no doubts they will eat them up ...

nvidia cannot compete with amd at higher res, they just can't.

1080p, sure


----------



## wermad

Gonna test the waters on ebay, setting up a listing for everything (cards, blocks, stock coolers, and bridge). If no bites, I'll split them up. Sad to see them go, but I'm just gonna get sli Ti's and call it a day.


----------



## Mega Man

See you soon when the new AMD cards come out


----------



## SAFX

I'm having a VERY tough time finding the XSPC Razor waterblock for my custom loop; I managed to grab the last backplate from performancepcs.com, but the waterblock is my main concern. I may need to search outside the US; open to suggestions for reputable e-tailers in Europe.


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> See you soon when the new and come out


You can buy em if you like. For your other side of the TX build??? As I mentioned, just testing to see what happens. If I do wanna sell something, I price it more aggressively, not conservatively. And if I get no takers, I'm still fine holding on to these guys. It's just enticing having them appreciate to more than what I paid, as it's rare to make a profit on older gpu's.


----------



## Mega Man

nah thanks, i have 2 295x2 and 5 290xs i am good in that regard


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> nah thanks, i have 2 295x2 and 5 290xs *i am good in that regard*










wow, never thought you would settle based on your demeanor.

Anyways, got a few offers that seem fair, but I feel that if it's not a good or great offer, I would just keep them.


----------



## Mega Man

oh i am not, i want 4 fury x but i have accepted the fact i have a baby... will have to live with getting 4 more soon of w.e. gpu is out and possibly getting 4 fury x later ...

btw if anyone says they know everything, they lie !

i have a crapton to learn.


----------



## ramos29

hi everyone
I have no idea about overclocking; I used to be against such things, but since new games have no/broken Crossfire profiles I started reconsidering my decision.
So, via MSI Afterburner, if I set 1100 MHz core + 1400 MHz memory, are there any risks in doing that?
Am I obliged to tweak the max voltage?
How much gain will I get?
thx


----------



## ramos29

Need for Speed: broken Crossfire, poor scaling, unplayable with CF enabled
The Division: ghosting/shadowing/flickering on Crossfire
Gothic Armada: no Crossfire
Quantum Break: no Crossfire
Just Cause: no Crossfire
I wonder if SLI/Crossfire setups are the best choice nowadays. If next-gen AMD cards are 4K capable (I mean 50-60 fps at mid-high settings) I will go for a single-card setup.


----------



## fayzaan

So...I have a r9 290x that was sitting in a box...hasn't sold yet, thinking I should use it in a TriFire config. But, need to get a more powerful PSU.


----------



## ramos29

yeah, 850W is not enough to handle 3 GPUs; i think at load a 295x2 + 290x eats 1050W
where are you from? if you are near i can buy it from you XDDD


----------



## fayzaan

Yea, I'm in Calgary, AB, Canada.


----------



## ramos29

you are almost the boy next door, only 8945 km separate us XDDD
even if you get another psu, are you sure an i5 6600k can handle 3 GPUs without a bottleneck?


----------



## ramos29

AMD released a new driver which fixes NFS, The Division, and many other issues


----------



## fayzaan

Not sure, worth a try. Good to hear about new drivers, gunna update after work.


----------



## ramos29

I kept refreshing AMD's website for a driver update. I can't test it before tomorrow though.
In GTA 5 at 4K my card is running hot, almost 70°C; it's getting hot in my country.
During winter it does not exceed 60°C. OCing my card will surely bring it to the maximum tolerated temp, which is 75°C :/


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> oh i am not, i want 4 fury x but i have accepted the fact i have a baby... will have to live with getting 4 more soon of w.e. gpu is out and possibly getting 4 fury x later ...
> 
> btw if anyone says they know everything, they lie !
> 
> i have a crapton to learn.


Read the 4way reviews. At most 3way with Fury and Ti imho.

Got some offers and if they go, going with two Ti's (and maybe a 3rd down the road).


----------



## Mega Man

heck no.

they always say 3/2way is better

quad gpus ( of flagship gpu) or BUST !


----------



## fayzaan

Ok... not sure if I should have done this, but I tried the tri-fire on this 850w PSU and it works... at least in BF4. Getting a big difference in performance: 4k ultra and around 100-160fps. Seemed smooth, no complaints.

But I noticed that my 290x won't run when playing Witcher 3. Not sure if it's not supported or if I have to change something in the settings to get it to work?


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> heck no.
> 
> they always say 3/2way is better
> 
> quad gpus ( of flagship gpu) or BUST !


Well, very few titles scale beyond three GPUs in 4k. I'm more inclined towards the Ti: get some OC on them and maybe throw in a 3rd down the road. 4-way scaling is also poor in most cases with the Ti, but I wouldn't mind adding the 4th when they're cheaper. I'm not holding my breath that Pascal will drop Ti prices until its successor arrives later on.


----------



## Mega Man

yea, that is pretty much what everyone says about the last 3 amd generations (like 5 or 6 nvidia generations)


----------



## wermad

Quote:


> Originally Posted by *Mega Man*
> 
> yea, that is pretty much what everyone says about what the last 3 amd generations ( like 5 or 6 nvidia generations )


For me, where my needs go is where my wallet follows. I don't have a blood-oath to any brand compromising my choices for my gaming rig.

Anyways, seems like I may just split them up, but I can also trade them here on the forums for Ti's if anyone is up for that.

Quote:


> Originally Posted by *fayzaan*
> 
> 
> Ok...not sure if I should have done this, but I tried the tri-fire on this 850w psu and it works...at least on BF4. Getting big difference in performance, 4k ultra and getting around 100-160fps. Seemed smooth, no complaints.
> 
> But, I noticed that my 290x wont run when playing Witcher 3. Not sure if its not supported or if I have to change something in the settings to get it to work?


Get yourself an outlet power gauge like a Kill-A-Watt. This will tell you total power consumption at the wall (AC), and it's just some simple math for the DC conversion using the PSU efficiency factor. Just watch out with the 295x2, as it draws a lot of amps and it could overwhelm your setup and that PSU.
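The wall-reading math is just one multiply; a quick sketch (the 90% efficiency here is an assumed example number, real efficiency varies with load and the PSU's 80 Plus rating):

```python
# Convert a Kill-A-Watt style AC wall reading into the approximate DC
# load the PSU is delivering to the components. The 1100 W reading and
# 90% efficiency are example assumptions, not measured values.
def dc_load_watts(ac_watts: float, psu_efficiency: float) -> float:
    """DC power delivered to components = AC draw * PSU efficiency."""
    return ac_watts * psu_efficiency

print(round(dc_load_watts(1100, 0.90)))  # -> 990
```

So an 1100 W wall reading on a 90%-efficient PSU means the components are pulling roughly 990 W DC, which is why a tri-fire setup can quietly overwhelm an 850 W unit.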


----------



## WestwardFIlly5

Hey, I've done a quick search of this thread and couldn't find anything about the issue I'm having, but I figured you guys would probably be the best people to ask.

After about 2 months of use, the vrm fan started to ramp up to 100% and stay there for 10 or so minutes before returning to normal when woken from sleep. But only every second time it's woken up.

This has been seriously grinding my gears for a while, and the only solution I've come across is to connect the vrm fan to my motherboard and control it through software, or give up sleep, both of which I want to avoid.

Anyone have any suggestions?


----------



## wermad

Long shot, but open GPU-Z and open the monitoring section. Put your rig to sleep and then wake it. Check your vrm temps just to make sure it's not some sort of spike in vrm temps that could be ramping up the fan. The reason I suggest that is that I had a bad block mount once (wrong pad thickness) and I would get video crashing within a few minutes because of the lack of contact. If your temps look normal, it could be a faulty fan. You may have to RMA the whole thing if you can't find a replacement fan on eBay.


----------



## WestwardFIlly5

Thanks for the reply, I'll have a look when I get home as I just left the house.


----------



## WestwardFIlly5

Quote:


> Originally Posted by *wermad*
> 
> Long shot, but open gpuz and open the monitoring section. Put your rig to sleep and then wake it. Check your vrm temps just to make sure its not some sort of spike in vrm temps to could be ramping up the fan. The reason I suggest that is that I had a bad block mount once (wrong pad thickness) and I would get video crashing with in a few minutes because of the lack of contact. If your temps look normal, could be a faulty fan. You may have to rma the whole thing if you can't find a replacement fan on ebay.


I got home and put the pc to sleep with GPU-Z running to check the vrm temps as you said. Everything was looking normal; it didn't ramp the fan up, nor did it the second time, or the third.
It's been 30 minutes of sleeping and restarting to try and have the issue repeat itself; still working. I'm not sure why it's working now or what even fixed it, but it's not happening anymore. Not that I'm complaining.

Well, thanks for your help, as I guess it's fixed.


----------



## SAFX

I contacted XSPC directly regarding the R9 Razor water block, their reply....

Quote:


> Hi Sir,
> 
> I'm afraid we sent the rest of our stock out to WatecoolingUK, who have now sold them all about 1 year ago.
> 
> Regards
> 
> Customer Services
> XSPC Shop


I'm having a tough time finding _*any*_ water block, much less XSPC, for this card. Kicking myself for not pulling the trigger last year.
So far, for my custom loop, I've got one bottle of antimicrobial silver coil, that's it, and a LONG way to go.


----------



## wermad

Quote:


> Originally Posted by *WestwardFIlly5*
> 
> I got home and put the pc to sleep with gpu-z running to check the vrm temps as you said. Everything was looking normal, It didn't ramp the fan up, nor did it the second time, or the third.
> Its been 30 minutes of sleeping and restarting to try and have the issue repeat its self, still working. I'm not sure why its working now or what even fixed it but its not happening anymore. Not that I'm complaining.
> 
> Well thanks for your help as I guess its fixed.


Do you have any application that manages your fan? ie Afterburner, Trixx, Asus???

If it happens again, try running only Overdrive's fan control options in CCC.

Quote:


> Originally Posted by *SAFX*
> 
> I contacted XSPC directly regarding the R9 Razor water block, their reply....
> I'm having a tough time finding _*any*_ water block, much less XSPC, for this card. Kicking myself for not pulling trigger last year,
> So far, for my custom loop, I've got one bottle of Antimicrobial silver coil, that's it
> 
> 
> 
> 
> 
> 
> 
> , and a LONG way to go.


I got a few offers for my cards and all of them don't want the blocks, so I may have two Koolance blocks available if they sell. I'll post in the club to give you guys first dibs before posting them in the market if I still have them.

Btw, Koolance still sells theirs new:

http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2


----------



## WestwardFIlly5

Quote:


> Originally Posted by *wermad*
> 
> Do you have any application that manages your fan? ie Afterburner, Trixx, Asus???
> 
> If it happens again, try running only Overdrive's fan control options in CCC.


So it's started to happen again. The only fan control software running apart from CCC is Corsair Link, but I'm fairly sure that isn't affecting it. (I've closed it for now to see if it happens again without it running.)

But I can't seem to reproduce the issue anymore. It only seems to happen when it wants to at the moment.


----------



## wermad

Hopefully it stays that way for a while


----------



## Alex132

I just plugged my radiator fans into a mobo header and control the temp ramp-up via SpeedFan. The 295X2's power delivery for the fan is weird. It wouldn't spin up my fans at all for some reason.


----------



## Antares666

Hey guys, I was hoping to get a bit of help for a friend of mine. He's got a 295x2 running on a Phenom X6 1100T. Sadly he's not very tech savvy, so that's why I'm the one asking for him.

Are these scores normal for his hardware? Everywhere I looked, the 295x2 always scores well into the 15000s. He's running a 1200W PSU as well, so power shouldn't be a problem either.

http://www.3dmark.com/fs/8322257

Thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *Antares666*
> 
> Hey guys, I was hoping to get a bit of help for a friend of mine. He's got a 295x2 running on a Phenomx6 1100T. Sadly he´s not very tech savy so thats why I'm the one asking for him.
> 
> Are these scores normal for his hardware? Cause everywhere I looked, the 295x2 always scores well in the 1500's. He's running a 1200W PSU as well, so power shouldnt be a problem either.
> 
> http://www.3dmark.com/fs/8322257
> 
> Thanks


It's the CPU; an 1100T isn't enough to feed that card.

Going to an overclocked FX-series CPU would help, but if he's gaming at 1440p or lower then you need an Intel chip.

Here's an i7 6700k: http://www.3dmark.com/fs/8200826

and here's an i5 6600k: http://www.3dmark.com/fs/7957059

and for comparisons sake, here's an FX-9590: http://www.3dmark.com/fs/5005251

Firestrike isn't a very accurate representation of gaming performance, to be fair, but the CPU does hold the card back a bit :/


----------



## Antares666

I figured that it could be the CPU, I just wasn't expecting that much of a performance hit. Could it also be a combination of the CPU struggling plus the card running on PCIe 2.0?

Seems quite wasteful, especially since my single stock 390x scores around the mid 10k. Since it technically uses the same chip (the Grenada is pretty much the same as the Hawaii) I was expecting much higher scores...

Anyway, thanks! Rep'd!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's the CPU, an 1100T isn't enough to feed that card
> 
> going to an overclocked FX series CPU would help but if he's gaming at 1440p or lower then you need an Intel chip
> 
> Here's an i7 6700k: http://www.3dmark.com/fs/8200826
> 
> and here's an i5 6600k: http://www.3dmark.com/fs/7957059
> 
> and for comparisons sake, here's an FX-9590: http://www.3dmark.com/fs/5005251
> 
> Firestrike isn't a very accurate representation of gaming performance though to be fair but the CPU does hold the card back a bit :/


It's amazing how that little 6600K lets those GPUs sing just as well as the i7 does.... what a great chip at $230~

I want to see some x3 and x4 CF between the two compared and see if the i5 still hangs, 'cause if it does, that may be my next chip (that or Zen, depending on price/performance).

I have no plan of x3 or x4 crossfire, but I figure if it does x4 well now, then it will do x2 two years from now and still not bottleneck me.


----------



## rdr09

Quote:


> Originally Posted by *Antares666*
> 
> Hey guys, I was hoping to get a bit of help for a friend of mine. He's got a 295x2 running on a Phenomx6 1100T. Sadly he´s not very tech savy so thats why I'm the one asking for him.
> 
> Are these scores normal for his hardware? Cause everywhere I looked, the 295x2 always scores well in the 1500's. He's running a 1200W PSU as well, so power shouldnt be a problem either.
> 
> http://www.3dmark.com/fs/8322257
> 
> Thanks


That's two 290Xs on a single PCB. I had to OC my Phenom to 4.2 GHz to feed a single 290 properly. Still struggled. PCIe 2.0 does not matter much.


----------



## ramos29

Hi, via AMD OverDrive I made these changes:
power limit +50
GPU OC 8%: 1100MHz
memory OC 1400MHz
Everything looks to run OK, max temp 60-64, but after 5-10 min in GTA 5 and Star Wars the game crashes; an error is displayed in the background but I can't read it.
What may be the problem? I did not have those crashes before the OC.
I know I'm not gaining a huge fps boost with an 8% OC, but at the same time I don't want to remove the OC :/
What are the possible reasons? I tuned down the OC to 7% but still have crashes in GTA 5.
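For reference, OverDrive's percent slider is just an offset from the stock clock; assuming the 295X2's advertised 1018 MHz boost clock, the +8% setting lands right around the 1100 MHz figure above:

```python
# AMD OverDrive expresses the GPU overclock as a percent offset from the
# stock clock. Assuming the R9 295X2's advertised 1018 MHz boost clock,
# +8% gives ~1100 MHz and the fallback +7% gives ~1089 MHz.
def oc_clock(stock_mhz: float, percent_offset: float) -> float:
    """Resulting clock for a given percent offset in OverDrive."""
    return stock_mhz * (1 + percent_offset / 100)

print(round(oc_clock(1018, 8)))  # -> 1099
print(round(oc_clock(1018, 7)))  # -> 1089
```

Note the drop from 8% to 7% is only about 10 MHz, which is why it rarely rescues an unstable overclock on its own.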


----------



## Sgt Bilko

Quote:


> Originally Posted by *Antares666*
> 
> I figured that it could be the cpu, just wasnt expecting that much of a performance hit. Could it be also a combination of the CPU struggling plus the card running on a PCIex 2.0?
> 
> Seem quite wasteful, expecially since my single Stock 390x scores around the mid 10k. Since it technically uses the same chip (the Grenada is pretty much the same as the Hawaii) I was expecting much higher scores...
> 
> Anyway, thanks! Rep'd!


Phenoms do pretty well in Firestrike compared to FX because of the way the combined test is handled: for FX 8-cores it only utilizes 4 of those 8 cores during the combined test, and that has a large impact on the overall score. As I said before, Firestrike isn't an accurate representation of gaming performance, but the Thuban is still holding that card back in gaming (especially at a lower resolution).

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's the CPU, an 1100T isn't enough to feed that card
> 
> going to an overclocked FX series CPU would help but if he's gaming at 1440p or lower then you need an Intel chip
> 
> Here's an i7 6700k: http://www.3dmark.com/fs/8200826
> 
> and here's an i5 6600k: http://www.3dmark.com/fs/7957059
> 
> and for comparisons sake, here's an FX-9590: http://www.3dmark.com/fs/5005251
> 
> Firestrike isn't a very accurate representation of gaming performance though to be fair but the CPU does hold the card back a bit :/
> 
> 
> 
> It's amazing how that little 6600K let's those GPU's sing just as well as the i7 does.... what a great chip at $230~
> 
> I want to see some x3 and x4 CF between the two compared and see if the i5 still hangs cause if it does, that may be my next chip (that or zen depending on price/performance).
> 
> I have no plan of x3 or x4 crossfire, but I figure if it does x4 well now, then it will do x2 2 years from now and still not bottle me.
Click to expand...

I don't have the cards to do quadfire anymore, but I do have a 390x here for trifire.

I've not run it on either the i7 or i5 as yet, but I might have a little bit of spare time over the weekend and I'll give it a quick run.

And yes, the 6600k is an awesome little CPU imo; once you overclock it, it matches anything my FX could do in multi-threaded apps, and the single-core perf is pretty awesome.

I'm working on a comparison between all 3 CPUs to see which really is the better one for those that just care about gaming. Hopefully I'll be able to cover the 3 major resolutions as well, so it should be pretty good.


----------



## ramos29

Why is no one responding to my questions? It would be good to have a tutorial or something on the first page so people like me don't have to ask.
I disabled ULPS but this did not help; it crashes after a random amount of play time.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ramos29*
> 
> and why no one responds to my questions? it would be good to make a tutorial or something in the first page to avoid that people like me ask questions
> i disabled ulps but this did not help, crash after random time of play


Take the overclock off and see how you go.

If it still crashes then re-install drivers


----------



## ramos29

Without the overclock everything is OK; I reinstalled the driver but still have the same problem with the OC.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ramos29*
> 
> without overclock every thing is ok, reinstalled the driver but still the same problem


Well if everything is ok without the overclock then obviously the overclock isn't stable.

either add voltage or don't run the overclock


----------



## ramos29

I made it a challenge to make the OC stable, but as you said I am starting to consider giving up and turning off the OC, unless someone figures out what's responsible for the system instability.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ramos29*
> 
> i made it a challenge to make the oc stable but as you said i am starting to consider giving up and turn off oc, unless someone figure out whose responsible for the system instability


YOU are responsible for the system instability, if everything works fine with the GPU at stock but doesn't when it's overclocked then obviously the overclock is UNSTABLE.

Again, either add voltage to make it stable or take it back to stock and be happy.


----------



## ramos29

ok going back to stock speed, thx


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> I got a few offers for my cards and all of them don't want the blocks, so I may have two Koolance blocks available if they sell. I'll post in the club to give you guys first dibs before posting them in the market if I still have them.
> 
> Btw, Koolance still sells theirs new:
> http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2


Thanks for that link, wermad.
How's the performance on those Koolance blocks?
Does it come with a backplate? Active/passive?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> I got a few offers for my cards and all of them don't want the blocks, so I may have two Koolance blocks available if they sell. I'll post in the club to give you guys first dibs before posting them in the market if I still have them.
> 
> Btw, Koolance still sells theirs new:
> http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2
> 
> 
> 
> Thanks for that link, wermad.
> Hows the performance on those Koolance blocks?
> Does it come with a backplate? active/passive?
Click to expand...

He is using the stock backplates iirc but I'm pretty sure Koolance has passive backplates for them, the only active backplate I know of is the one from Aquacomputer


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> I got a few offers for my cards and all of them don't want the blocks, so I may have two Koolance blocks available if they sell. I'll post in the club to give you guys first dibs before posting them in the market if I still have them.
> 
> Btw, Koolance still sells theirs new:
> http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2


Quote:


> Originally Posted by *Sgt Bilko*
> 
> He is using the stock backplates iirc but I'm pretty sure Koolance has passive backplates for them, the only active backplate I know of is the one from Aquacomputer


I couldn't find any backplates on Koolance's site, did you see them for sale anywhere?


----------



## Sgt Bilko

Quote:


> Originally Posted by *SAFX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> I got a few offers for my cards and all of them don't want the blocks, so I may have two Koolance blocks available if they sell. I'll post in the club to give you guys first dibs before posting them in the market if I still have them.
> 
> Btw, Koolance still sells theirs new:
> http://koolance.com/video-card-vga-amd-radeon-r9-295x2-water-block-vid-ar295x2
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> He is using the stock backplates iirc but I'm pretty sure Koolance has passive backplates for them, the only active backplate I know of is the one from Aquacomputer
> 
> Click to expand...
> 
> I couldn't find any backplates on Koolances' site, did you see them for sale anywhere?
Click to expand...

Performance PC's have the XSPC one but I can't find Koolance ones anywhere :/


----------



## SAFX

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Performance PC's have the XSPC one but I can't find Koolance ones anywhere :/


No longer, I have it now


----------



## SAFX

The 295X2 was *released on April 21, 2014*, just over 2 years ago today...

*Did I miss something?*... why haven't the latest Nvidia and AMD cards made the 295x2 obsolete? Two years is an eternity in this market.


----------



## Dagamus NM

Quote:


> Originally Posted by *SAFX*
> 
> 295X2 *released on April 21, 2014*, just over 2 years ago today...
> 
> *Did I miss something?*...why haven't the latest Nvidia and AMD cards made the 295x2 obsolete? two years is an eternity in this market


We appear to have hit a wall.

I am happy that my 295x2s are still relevant.


----------



## invincible20xx

would it be a wise decision to upgrade to this from my current dual 290 tri x ?


----------



## Dagamus NM

Quote:


> Originally Posted by *invincible20xx*
> 
> would it be a wise decision to upgrade to this from my current dual 290 tri x ?


If those are just regular 290s then yes. Or just add it to the repertoire.


----------



## invincible20xx

Quote:


> Originally Posted by *Dagamus NM*
> 
> If those are just regular 290s then yes. Or just add it to the repertoire.


they are tri-x R9 290s


----------



## ramos29

Despite the update of the Crossfire profile for NFS, the scaling is still horrible with many freezes... unplayable with CF enabled :/
In The Division they fixed the ghosting/flickering stuff, but fps comes in waves ranging from 60 to 20 even if you turn shadows/reflections to medium/low.


----------



## fayzaan

Got my 1300w PSU yesterday, hooked it up... for some weird reason it broke my raid ><, so I had to configure a new one and re-install Windows from scratch.

Working well so far with the R9 295x2 and R9 290x; still doing some tests. GTA 5 Ultra 4k I get around 90-100 fps.


----------



## Mega Man

Yea, raid sucks. You should get rid of it. Unless you have a proper raid card...

Even then imo money is better spent on a ssd.


----------



## fayzaan

lol I do have SSDs, 2 Radeon R7 SSDs in RAID. Good for loading games like GTA 5.


----------



## Mega Man

Then when you have to " fix " your raid. Don't blame the psu

raid sucks


----------



## Samuris

Hi guys, my girl has the Sapphire R9 295x2. Do any of you have a modded BIOS for her card, or just a UEFI BIOS? Thanks guys


----------



## rakesh27

Guys,

Quick question: I'm using a PG278Q G-Sync monitor and game at 2560x1440.... you're probably wondering why I got a G-Sync monitor with an AMD card.

Reason being, in my rig in the top slot I have a PowerColor 295x2 and in the third slot I have an EVGA 980 Ti SC with AIO water & rad...

At present my DP cable is connected to the 295x2 and my refresh rate is at 85Hz; when I set it to 100Hz or 120Hz the screen flickers in game and on the desktop...

When I connect the 980 Ti with the DP cable I can easily do 144Hz G-Sync without any issues.

Anyone ever done this? Why can't I use the 295x2 at 120Hz without flicker....

Thanks all...


----------



## Mega Man

i can do 144 hz without issues... no idea my friend..


----------



## xarot

Quote:


> Originally Posted by *rakesh27*
> 
> Guys,
> 
> Quick question, im using monitor PG278q G-Sync and game at 2560x1440.... your probably why i got a gsync monitor with a AMD card.
> 
> Reason being in my rig in the top slot i have PowerColor 295x2 and third slot i have Evga 980Ti SC Aio water&rad...
> 
> At present my dp cable is connected to the 295x2 and my refresh rate is on 85hz, when i set it to 100hz or 120hz the screen flickers in game and desktop...
> 
> When i connect the 980Ti with the dp cable i can easily do 144hz gsync without any issues.
> 
> Anyone ever done this, as why can i use the 295x2 at 120hz without flicker....
> 
> Thanks all...


It's a common issue, I've seen it on my PG278Q and PG279Q and with my Ares III and Sapphire 295X2. I tried like 5 different cables, and none of them worked properly. Only 85 Hz worked without losing signal sometimes. I guess AMD+G-Sync don't play well together. 120 Hz works too, but it will cause issues every now and then.


----------



## fat4l

So my Ares III is sold now. Finally I can move to the green team lol.









Xarot, what about your dead ares ? Any news ?


----------



## ramos29

fat4l, you are now considered a defector and an enemy of Rome


----------



## gupsterg

Nice to read you've sold your Ares III for what you wanted, but... on the green team join/purchase ...


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Nice to read you've sold your Ares III for what you wanted
> 
> 
> 
> 
> 
> 
> 
> , but
> 
> 
> 
> 
> 
> 
> 
> on green team join/purchase ...


Well..... it all comes down to Polaris 10 and its performance.
The 1080 has a much larger chip so I expect it to be a lot stronger...

If so, then the green team will need your help with bios mods


----------



## Alex132

Quote:


> Originally Posted by *Mega Man*
> 
> Then when you have to " fix " your raid. Don't blame the psu
> 
> raid sucks


Never had an issue with any of my RAID arrays over 6+ years. I have kept the same RAID0 array on my motherboard through swapping out PSUs, resetting CMOS, etc. If you know what you're doing, it's very hard to mess up. You just need to take extra care with some things.

RAID0/1 is fine on your motherboard; for hardware RAID5 you pretty much have to get a dedicated card.


----------



## fayzaan

Yea, I'm not sure what caused it; not saying it was the PSU, just that after swapping out the PSU it said failed for some reason, and only 1 drive was showing in the raid, the other was showing as non-raid. Happened to me before as well when I updated the motherboard's BIOS.

Oh well, not too big of a deal... just had to reinstall all my games.


----------



## wermad

Cards sold on eBay. Waiting for the payment to clear, and then I'll be shipping them out. Sad to see them go.


----------



## Mega Man

Congrats


----------



## wermad

Ty, they did go for the asking price, and that's in line with prices right now.

Sadly, I'm taking a break and selling my rig, minus the TX10 and the rads. I just bought a little beater car and I need to pay for it (and upgrades!), and I'm starting a new job that has expenditures right away. If things are good in a few months, I'll slowly start rebuilding. I'm more inclined to get an RVE, block it, a 5930K, and four 980 Ti's. I may also pc the TX all white.


----------



## Mega Man

......


----------



## xarot

Quote:


> Originally Posted by *fat4l*
> 
> So my Ares III is sold now. Finally I can move to the green team lol.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Xarot, what about your dead ares ? Any news ?


I sold it as a dead card, "perhaps fixable", for a low price (roughly the price of a single 295X2 waterblock). My wallet took a great hit there, but I got rid of it.

I am trying to sell the other Ares locally now, but so far no one's interested. Usually buyers around here are just kicking the tires and making ridiculously low offers.

My opinion only, but I think you won't be disappointed; for me, the grass is way greener in the other camp.


----------



## wermad

If you bought it through eBay, you could have used the buyer protection program, as long as you're within the window. I bought a 7970 BE a couple of years ago and asked the guy whether it was the 1100 or 1150 MHz version, as many don't know the BE was clocked differently across versions. He said it was the latter; I bought it, got it, and popped it in. Turns out it was the former, and I confronted the guy. He probably didn't bother to put it in his machine and check GPU-Z (which I told him initially he could do). Nasty messages went back and forth until eBay could make a decision. 48 hours later I got the nod and sent the card back. Once UPS confirmed it was delivered, PayPal started the refund process.


----------



## Dagamus NM

What the heck? Everybody is bailing on their 295X2's. Mine are still running strong. If I spent more time gaming, I might be annoyed that they seem to struggle to maintain 60 fps at 4K on dual ASUS PB287Qs. Solution: I moved those monitors to my quad 980 Ti rig, went with a pair of Acer H257HU 1440p monitors, and now everything is happy. I think I will grab a third Acer monitor.

The only downside is that these monitors run hot, and these cards run hot too.


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> Well.....it all comes down to polaris 10 and its performance.
> 1080 has a much larger chip so I expect t obe a lot stronger...,,,
> 
> If so then Green team will need your help with bios mods


I'm reckoning you may just end up thinking you should have kept the Ares III.

Just saw this on Videocardz; here's my Fury X result with the tess. tweak (OC bench stable tested). I'll try it with my best stable 24/7 OC of 1135/535 as well, plus 3DM FS Extreme, as soon as I finish a [email protected] run.

Dunno how legit these 1080 3DM results are, but I'm assuming they're not too far off from what we'll see.

A green team BIOS mod by me? Not happening.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> I'm reckoning you may just think you should have kept Ares III.
> 
> Just saw this on Videocardz, my Fury X with tess.tweak result (OC bench stable tested). Will try it with my best stable 24/7 OC 1135/535 as well and 3DM FS Extreme as soon as finish [email protected] run.
> 
> Dunno about how legit this 1080 3DM results are, but assuming not too far off from what we'll see.
> 
> Green team bios mod by me? not happening.


Well....regarding performance I should maybe be worried. However I lack "peace of mind" with the red team. The whole crossfire+freesync thingy is pissing my off from time to time and the powerdraw as well.
I'm willing to get some "peace of mind" in trade of performance.
We shall see how it will all end up







For now I'm using my 4790K integrated GPU ahaha


----------



## gupsterg

IF those 3DM GTX 1080 benches are legit, and given all the "talk" that Polaris 11/10 is aiming at a different segment/price range of GPU (which may be an indication of expected performance), then 290/X, 390/X and 295X2 owners are probably not gonna swap out their GPUs, IMO.

Yeah, lower power usage is great, but if you're not running [email protected] etc. for long periods of time, would a gamer really see a reduction in the electricity bill that warrants losing money on replacing a GPU?

I reckon any owner of a 295X2 who bought it at launch must be luving how long they've enjoyed the card. Yeah, AMD/CF has issues, but is nVidia/SLI truly better/without issues? From time to time, when I have had a peek at the green stuff, it doesn't seem as if it is greener on the other side.
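For what it's worth, the electricity-bill question is easy to put rough numbers on. A minimal sketch; all figures here are illustrative assumptions, not measurements:

```python
def annual_savings(power_delta_w, hours_per_day, rate_per_kwh):
    """kWh saved per year by a lower-power card, and what that energy costs."""
    kwh = power_delta_w * hours_per_day * 365 / 1000
    return kwh, kwh * rate_per_kwh

# e.g. a card drawing 150 W less, 2 h/day of gaming, $0.20/kWh (all assumed):
kwh, usd = annual_savings(150, 2, 0.20)
print(f"{kwh:.1f} kWh, ${usd:.2f} per year")  # 109.5 kWh, $21.90 per year
```

A couple of tens of dollars a year, which takes a long while to offset the cost of a new card.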


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> IF those 3DM GTX 1080 benches are legit and from how all the "talk" that Polaris 11/10 is aiming at differing segment/price range of GPU, which maybe an indication of expected performance, then 290/X, 390/X and 295X2 owners are probably not gonna swap out GPU IMO.
> 
> Yeah lower power usage is great, but if your not running [email protected], etc long periods of time would the gamer really see a reduction in electricity bill to warrant losing money on replacing GPU?
> 
> I reckon any owner of a 295X2 who even bought it at launch must be luving how long they've enjoyed the card. Yeah AMD/CF has issues but is really nVidia/SLi truly better/without an issue? from time to time when I have had a peak at green stuff it doesn't seem as it is greener on the other side.


Performance/cost: the 295X2 wins by far. Yes, that's true. Even the "old" 7990 wins; I had one before and sold it to a friend who is still rocking it.

But... it's those little things I don't like. I'm aiming for a single 1440p graphics card without any SLI/CF issues: no additional "lag" from split rendering, and no FreeSync + CF flickering.
Those "leaked" tests (if legit) were run with an i7 3770, lolz, which definitely influences the graphics scores as well.
If my OCed Ares was ~20% better than an OCed 980 Ti, and the 1080 is faster than the 980 Ti by even 10% and OCs as well as the 980 Ti does, then I will get close to Ares III performance with a single GPU, low power draw, and no CF/SLI issues.
We will see, tho. If not, I can still go for AMD


----------



## fat4l

Oh new results here :


----------



## wermad

PayPal put a hold on the money until the buyer receives the cards and confirms he's happy. What the frack!!! Grrrrr... eBay says to ship them, but I'm just a bit hesitant to ship while PayPal holds my money ransom.


----------



## Agent Smith1984

Quote:


> Originally Posted by *wermad*
> 
> paypal put a hold on the money until the buyer receives and confirms he's happy, what the frack!!!!!!!!!!!! Grrrrrrrrrrrrrrrrrrrr...ebay shows to ship it. I'm just a bit hesitant to ship it while paypal holds my money ransom.


It's been like this for a few years now... I think it does this until you have 50+ positive feedback or something like that. You are going to have to ship the item, and you will get your money within 7 days of the item delivering.


----------



## wermad

I have close to 500.

Edit: it's probably because I don't sell items of this value frequently.


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> But...Its those little things I dont like. I'm aiming for a single 1440p graphics card w/o any issues with SLI/CF, no additional "lag" from split renderring, no issues with freesync + CF = flickering.


I concur; that's why I've not gone dual GPU yet, as those little things would bug me. I stuck to 1080p with Hawaii, and I'm even using that with the Fury X. I like max eye candy without compromising on FPS, so I don't see 1440p as viable yet on a single GPU (one in my price range, anyway). I've been a Custom PC subscriber since issue 1 and highly regard their reviews; some of them are on Bit-Tech. Crysis has always been my benchmark for buying a GPU; view the 1440p results for Crysis 3 here (test rig is [email protected]).
Quote:


> Originally Posted by *fat4l*
> 
> Those "leaked" tests(if legid) are tested with i7 3770 lolz, which definitely influences graphics scores as well.


Hmm, not convinced an i7 3770 would hold back the graphics score.
Quote:


> Originally Posted by *fat4l*
> 
> If I say, my Oced ares was ~20% better than Oced 980Ti, and now if 1080 is faster that 980Ti by even 10% and will be ocing as good as 980Ti then I will get close to Ares 3 performance, with a single gpu, low power draw, no issues with CF/SLI etc.
> We will see tho. If not, I can still go for AMD :thumb:


I can see where you're coming from.
Quote:


> Originally Posted by *fat4l*
> 
> Oh new results here :
> 
> 
> Spoiler: Warning: Spoiler!


Any chance of a source for the image? If you compare the VideoCardz i7 3770 result (GS = 10102) with your image's i7 5820K result (GS = 10733)...

Ran 3DM FS Extreme on my daily Fury X OC; results. Now compare results 2 & 3: there's a ~1100-point GS difference, and you can see how the FPS scales.


----------



## Ironjer

Hey guys, I found this GOP BIOS in the latest version, but I can't find the slave one. Not tested yet.

http://www.techpowerup.com/vgabios/168806/168806


----------



## xarot

Quote:


> Originally Posted by *wermad*
> 
> if you bought it through ebay, you could have used the buyer protection program as long as you're within the window. I bought a 7970 BE edition a couple of years ago and I asked the guy if it was the 1100 or 1150 clocked as many don't know the BE was clocked in different versions. Says it was, bought it, got it, and popped it in. Turns out it was the former and I confronted the guy. He probably didn't bother to put it in his machine and check gpuz (which i told him initially he could do). Nasty messages back and forth until ebay then can make a decision. 48 hours later, i got the nod and sent back his cards. Once ups confirmed it was delivered, Paypal started the refund process.


If your reply was for me: no, the Ares III I bought was not off the bay. I got it from a local forum, so I was screwed when it died in like a week. Note to self... don't buy used stuff from users who run LN2 and tend to "fix" their broken hardware with hardware mods. Lol, the guy said that if it outputs a picture it's OK. Gah... I've seen graphics cards fail after running 3DMark in a loop for a few hours.


----------



## SAFX

XSPC water block: anyone currently using one?


----------



## Samuris

If I flash my Sapphire R9 295X2 with the Insanity 1.8 Hynix BIOS, what will happen?


----------



## Ironjer

Quote:


> Originally Posted by *Samuris*
> 
> If i flash my sapphire r9 295x2 with insanity 1.8 hynix bios, what will happen ?


What BIOS are you talking about?


----------



## fayzaan

Not sure if everyone is experiencing this issue, but in The Witcher 3 my 3rd GPU stays idle. I have an R9 295X2 and an R9 290X running in CrossFire. GTA 5, BF4, and 3DMark work fine, but in The Witcher 3 the 3rd card is always at 0% load. I am sure it's the 290X that is sitting idle.

Anyone know why this is happening? Is there a setting I need to change? Is this a bug?


----------



## xarot

Quote:


> Originally Posted by *fayzaan*
> 
> Not sure if everyone is experiencing this issue, but on Witcher 3, my 3rd GPU stays idle. I have a r9 295x2 and r9 290x running in crossfire. GTA 5, BF4, 3D Mark, work fine...but on Witcher 3 3rd card is always at 0% load. I am sure its the 290x that is sitting idle...
> 
> anyone know why this is happening? is there a setting I need to change? is this a bug?


The Witcher 3 runs very poorly in CF. I couldn't get it working properly in quad-CFX either; I think it worked on 2 GPUs, though.


----------



## Mega Man

Isn't it a GameFails title? Errhm, excuse me, GameWorks.


----------



## wermad

With a heavy heart, I said good-bye to my twin 295X2's. Over 20 lbs for the package! Two cards with stock coolers, two blocks, and their accessories, plus lots and lots of packing material in a large box with a bunch of foam spacers. As I was testing each one, the urge to keep them and run them stock grew and grew, but ultimately I'm selling them for good reasons, and to take a break from something that is more a hobby than a necessity in life.

I can't believe the Ti's are dropping down to the ~$350-400s. I just saw a few w/ EK blocks go for ~$500 on eBay. Damn... well, money is tight, so I'll just run the Ares and CHIVF for now. Still, I may get a couple if the 1080 makes them drop more ($699?!?!).

If anyone is interested in a G1 Z170 Gaming (stable) and/or a 6600k and/or G.skill 3000 ddr4, hit me up before I set it up for sale soon.


----------



## fat4l

Quote:


> Originally Posted by *wermad*
> 
> With a heavy heart, I said good-bye to my twin 295x2's. Over 20lbs for the package! Two cards with stock coolers, two blocks, and their accessories. Plus, lots, and lots of packing material in a large box with a bunch of foam spacers. As i was testing each one, the urge to keep them and run them stock grew and grew, but ultimately, I'm selling them for good reasons and to take a break from something that is a hobby then a necessity in life.
> 
> I can't believe the Ti's are dropping down to ~$350-400s. I just saw a few w/ ek blocks go for ~$500 on ebay. Damn....well, money is tight so I'll just run the ares and chivf for now. Still, i may get a couple if 1080 makes them drop more ($699?!?!?!?!?!?!?!).
> 
> If anyone is interested in a G1 Z170 Gaming (stable) and/or a 6600k and/or G.skill 3000 ddr4, hit me up before I set it up for sale soon.


How much did they go for? I sold my Ares III already too. Now running on 4790K integrated graphics.

Selling my BenQ XL2730Z, and I already bought an Asus Swift PG278Q used for £379. Nice!
Now waiting for the 1080.


----------



## wermad

I'm going to sell my G1 Z170 and get something cheap, so I can put the rest of my limited funds towards possibly getting three or four Ti's.


----------



## Ironjer

Hey guys, new case over here: the Corsair Air 540.

I came from a Cooler Master Cosmos II.

Beautiful case!


----------



## wermad

Gorgeous setup


----------



## TheLAWNOOB

How much is the R9 295X2 now?


----------



## xarot

Quote:


> Originally Posted by *wermad*
> 
> I'm going to sell my G1 Z170 and get something cheap, so I can put the rest of my limited funds towards possibly getting a three or four Ti's.


Not really worth getting more than two cards. Why get three or four cards on a cheapo platform anyway?


----------



## Alex132

Epeen?


----------



## gupsterg

Quote:


> Originally Posted by *Samuris*
> 
> If i flash my sapphire r9 295x2 with insanity 1.8 hynix bios, what will happen ?


It won't work, as there are no 295X2 Insanity ROMs.


----------



## wermad

Quote:


> Originally Posted by *xarot*
> 
> Not really worth it getting more than two cards. Why getting three or four cards on cheapo platform anyway?


A cheap X79 will still give me 40 lanes vs. a $500-CPU X99 setup. The money saved here can be put towards a 4th Ti. Scaling is typical: poor the majority of the time, but the 3rd and/or 4th card does bring a few gains. And with prices dipping below $400 USD, I don't mind getting a bit more. I wouldn't do it if the cards were still closer to their MSRP. Plus, I have a big a$$ case and loop that beg for more hardware.

Edit: started looking at prices, and it seems I'm not too far off if I sell my Z170 and get an X79 setup. My current board still gives me 8/8/8/8 with the PLX, so it's probably easier to stay put.


----------



## Dagamus NM

Quote:


> Originally Posted by *wermad*
> 
> A cheap X79 will still give me 40 lanes vs a $500 cpu only X99 setup. Money saved here can be put towards a 4th Ti. Scaling is typical, poor for the majority of the time, but the 3rd and/or 4th does have a few gains. And with prices dipping below $400 usd, i don't mind getting a bit more. I wouldn't do it if the cards were still closer to their msrp. Plus, i have a big a$$ case and loop that beg for more hardware.
> 
> Edit: started looking at prices and it seems I'm not too far off if I sell my z170 and get an x79 setup. My current board still gives me 8/8/8/8 with the plx, so its probably easier to stay put.


Do it. I have four Tis on my X99 and love it. Actually, two X99 rigs: 780 Tis on one and 980 Tis on the other.

X79 is still pretty sweet. Sure, there may not be any more BIOS updates, but on the upside you can pick up 64 GB of DDR3 for cheap. I actually have a couple of kits I will sell soon.


----------



## wermad

Yeah, I just missed out on 32 GB for cheap too, and there are a few hexacores for under $200. But by the time I add RAM (even cheap RAM), a CPU, a sound card, and blocks, I'm at or very close to what I'd get if I sold my Z170 setup. I kinda have a crush on the looks of my current board. It's just that the BIOS can be tricky, and it's hard to swallow your pride and accept that things don't work properly with some expensive gear (no XMP and no CPU OCing; the BIOSes are that borked).

Going through some tough times right now, and I think rebuilding with the Ti's is gonna help me a lot. I start a new job in a couple of weeks, so might as well start. Just picked up two EK Titan X blocks on Hard and one w/ a backplate on eBay.


----------



## wermad

The 295X2 still holds strong vs. the 1080; using TPU's summary, it's about 15% behind in general. Not bad considering the "FE" version is $699. Hmmm, hope this doesn't plateau 980 Ti prices.

Anyways, my 295X2's were delivered to the east coast, and soon they're gonna make their way to Russia. The buyer has been very cooperative, and I contacted PayPal; they assured me my monies will be available by the end of the week. Sadly, my upgrade is gonna take a back seat, as I need the cash for something much more important. Don't fret! I already received two blocks and have two more coming soon. I'm hoping that by August I'll be back to normal (financially) and able to order four Ti's.

I persuaded the new owner to join, as this club has a bunch of great folks.


----------



## leonman44

Hello guys, I am going to trade my 980 Ti for a 295X2 because of the DX12 changes; an R9 290X is just 2 fps behind me, and that kills me, as I sold my 290X for this card!!!! I have found a guy who does editing and likes the 980 Ti more than his 295X2. I have a Corsair RM850i Gold PSU; will I be able to run this card on it without a voltage increase? Will I have problems with the VRMs using a single card? I have the Corsair 780T full tower, so airflow is good. Does this card have any other problems to worry about?


----------



## wermad

Quote:


> Originally Posted by *leonman44*
> 
> Hello guys , i am going to trade my 980ti for a 295X2 because of DX12 changes , an r9 290X is just 2 fps behind me and that kills me as i sold my 290X for that card!!!! I have found a guy who is editing and likes 980ti more than his 295X2 , i have An RM850i corsair gold psu , will i be able to run this card on this psu without voltage increase? Will i have problems with vrms using a single card? I have the Corsair 780t full tower so airflow is good. Does this card have any other problem to worry?


It really depends on your resolution. Hawaii does really well at 2K and 4K, but AMD is known for poor performance at 1080p or below, especially in CrossFire. This card on its own will really shine at 2K or 4K and will leave a single 290X in the dust. Scaling is pretty good in most situations, but if you run into a game that hates CrossFire, you can make a profile to disable it.

I've heard lots and lots of concerns regarding the RM series from Corsair. You have the RMi series, though; I would still ask for a second opinion, as this card tends to draw a lot of juice. If you get the nod, it should be OK, as that unit packs 70 amps, enough for your card and your system. X99 does draw a bit more power, though, so you might wanna hold off on OCing the card. Get a 1000 W unit w/ a single rail (~83 amps) and you should be fine.

I recently sold my two cards, and I'm planning to buy four Ti's and probably a G-Sync monitor for the holidays. If you need more PSU options, I made a list in the OP. It's a bit dated, but most of the mainstream, recent, and some older stuff is on there. Just remember the AMD recommendations: 28 amps for each 8-pin, or 50 amps combined (single rail). Generally, you wanna leave 20-30 amps for the rest of your system/hardware.

Good luck.
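To make the amperage bookkeeping concrete, here's a minimal sketch. The 50 A combined-rail figure is AMD's guidance mentioned above; the 25 A system allowance is my own assumption:

```python
def rail_headroom(rail_amps, card_amps=50, system_amps=25):
    """Amps left on the 12 V rail after the card and the rest of the system.

    card_amps=50 follows AMD's combined single-rail guidance for the 295X2;
    system_amps=25 is an assumed allowance for CPU, drives, fans, etc.
    """
    return rail_amps - card_amps - system_amps

print(rail_headroom(70))  # ~70 A unit (850 W class): -5, cutting it close
print(rail_headroom(83))  # ~83 A single-rail 1000 W unit: 8, comfortable
```

Negative headroom doesn't mean instant failure, just that a heavy CPU + GPU load could push the rail past its rating.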


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *leonman44*
> 
> Hello guys , i am going to trade my 980ti for a 295X2 because of DX12 changes , an r9 290X is just 2 fps behind me and that kills me as i sold my 290X for that card!!!! I have found a guy who is editing and likes 980ti more than his 295X2 , i have An RM850i corsair gold psu , will i be able to run this card on this psu without voltage increase? Will i have problems with vrms using a single card? I have the Corsair 780t full tower so airflow is good. Does this card have any other problem to worry?
> 
> 
> 
> It really depends on your resolution. Hawaii does really well in 2k and 4k, but amd is known to have poor 1080 or < performance, especially in xfire. This card on its own will really shine in 2k or 4k and will leave a single 290x in the dust. Scaling is pretty good in most situations but if you run into a game that hates crossfire, you can make a profile to disable crossfire.
> 
> I've heard lots and lots of concerns regarding the RM series from Corsair. Though, you have the RMi series, I would ask for a second opinion as this card tends to draw a lot of juice. If you get the nod, it should be ok as it packs 70 amps, enough for your card and your system. Though, X99 does draw a bit more power, you might wanna hold off on oc'ing the card. Get a 1000w unit w/ a single rail (~83 amps) and you should be fine.
> 
> I recently sold my two cards and I'm planning to buy four Ti's and probably get a Gsync monitor for the holidays. If you need more psu options, I made a list in the op. Its a bit dated but most of the mainstream, recent, and some older stuff is on there. Just remember the amd recommendations: 28 amps for each 8-pin, or 50 amps for combined (single rail). Generally, you wanna leave 20-30 amps for the rest of your system/hardware.
> 
> Good luck,
Click to expand...

The RMi series are good-quality PSUs; it's the RM ones I'd stay away from.

I think you'd get away with running a 295X2 on an 850 W PSU with that setup. Obviously 1000 W would be best, but I personally think it'd be fine.


----------



## leonman44

Yeap, the RMi is all Japanese-made components and carries a longer warranty than the RM does...
Look at these benchmarks:


----------



## leonman44

Some people say that the R9 295X2 will stutter because of CrossFire; is that true?


----------



## xarot

Quote:


> Originally Posted by *leonman44*
> 
> Some people says that r9 295X2 will stutter cause of crossfirw , is that true?


It depends on the game. Some games are a stuttering mess.


----------



## leonman44

Quote:


> Originally Posted by *xarot*
> 
> It depends on game. Some games are a stuttering mess.


So what do you believe, guys? Is it worth trading my 980 Ti for a 295X2 (considering DX12), or will it add too much trouble for me? You own these cards, so you know them better than anyone else!


----------



## SAFX

Quote:


> Originally Posted by *leonman44*
> 
> Some people says that r9 295X2 will stutter cause of crossfirw , is that true?


I had micro-stutters in GTA V; fixed it with V-Sync and FPS locked at 60. Sufficient for smooth gameplay, but every game is different.


----------



## SAFX

Are these benchmarks for real? The 295X2 places dead last?... Did someone forget to enable GPU #2?

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/8.html


----------



## wermad

Heard that game doesn't do well with AMD. If you check the performance summary, the 295X2 is about 15% behind the 1080 at 4K (second place).


----------



## SAFX

Quote:


> Originally Posted by *wermad*
> 
> Heard that game dont do well with amd. If you check the performance summary, 295x2 is about 15% behind the 1080 in 4k (second place).


Interesting, will check on that, thx.

Didn't even realize that article had 10 more benchmark pages (scroll down next time!).


----------



## Mega Man

Quote:


> Originally Posted by *SAFX*
> 
> Are these benchmarks for real? 295x2 places dead last?....someone forget to enable GPU #2?
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/8.html


I don't buy it anyway.

970 SLI with worse performance than a single 970?


----------



## Gdourado

Since the 295X2 has been on the market a while, I am wondering if there have been any issues with the card's AIO unit, namely loss of coolant due to evaporation, resulting in reduced cooling performance and noise from air bubbles.
Is that an issue?


----------



## Mega Man

It will eventually.

But not that I am aware of.


----------



## SAFX

My 295X2 journey is likely coming to an end.

I picked up an NZXT H440 earlier this year for a custom loop, but failed to consider the 295X2's length when accommodating the rad, pump, and all the trimmings. Unable to return the case, I'll probably sell my card in exchange for a 1080; I really need the space to avoid a frustrating build.

If you're interested in purchasing my 295X2 and you're in the New York area, PM me. The 1080 isn't released until June 10th, so mid-June is the earliest I'd be ready to sell.
The card is stock with the original cooler, hardware, box, everything; I even have the shopping bag from Micro Center... OK, that was a joke, but you get the point.


----------



## Impsak

Hi all,

I'm new to the forum, and hopefully my issue/question has not been asked 100 times before.
(I tried searching this thread for "fan" and "spinning" but did not find this specific issue.)

I bought my ASUS Radeon 295X2 in August two years ago (2014).

The card is still a beast, easily pushing 144 Hz (2560x1440) in BF4 with FreeSync enabled, etc.

I am running the latest AMD Crimson drivers (16.5.2) and GPU Tweak v1.2.0.3.
(The newest GPU Tweak, 1.2.2.0, resets my computer with a power-surge message in the BIOS if I choose any of the three profiles: "Gaming", "Silent", or "OC".)

If anyone has similar issues, I have made a PDF and a case about that here:
https://rog.asus.com/forum/showthread.php?85302-AMD-295x2-on-ASUS-Maximus-VIII-Hero-Alpha-Causing-power-surges

Either way, my weird issue is this:

After I started using Windows 10, I have had weird things going on with my GPU fans.

We all know that when you power on the computer, the fans spin like hell until the Windows 10 drivers kick in. It was the same in Windows 7.

Once Windows 10 has finished loading, whether I wait around 2-3 minutes without doing anything at all or start working on the computer straight away (it does not matter which), my ASUS Radeon 295X2 fans will start spinning at 100% again, like they do before the Windows 10 drivers kick in and quiet them down.

There is a very easy fix, but it is annoying, and sometimes I need to do it several times before the fans stay quiet (until I start gaming, of course; then they will naturally speed up to cool the GPU).

The easy workaround: right-click the Windows 10 start button and restart Windows 10 (not a full power-down, just a restart). In 90% of cases the fans will then not spin up again after 2-3 minutes of idling or browsing after entering Windows 10.

I don't think I had this issue last year when I was using Windows 7.
I first upgraded from Windows 7 to Windows 10.
But now, after swapping from a Rampage IV Extreme and 3960X to a Skylake 6700K on an ASUS Maximus VIII Hero Alpha, I also have Windows 10 Pro (64-bit) on a USB stick. (So after a clean install of Windows 10 on a new build, the issue is still there.)

It is not related to the mobo/CPU/system, as I have had this issue on two systems now, both times with Windows 10.

Any info or help is very welcome.


----------



## Alex132

Anyone else have an issue where the tubing underneath the VRM fan expands out ever so slightly and starts hitting the VRM fan, making an awfully annoying clicking/thudding noise? It seems like I need to push the tubing back almost daily now...


----------



## Alex132

Quote:


> Originally Posted by *Impsak*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hi All,
> 
> I'm new to the forum and hopefully my issue/question has not been asked 100 times before.
> (I tried to search for "fan and spinning" under this thread but did not find any specific same issue I have)
> 
> I bought my ASUS Radeon 295x2 in August 2 years ago. (2014)
> 
> The card is still a beast, pushing 144hz (2560x1440p) easy in BF4 with freesync enabled etc.
> 
> I am running latest AMD Crimson drivers 16.5.2 and GPU Tweak V 1.2.0.3
> (The newest GPU Tweak 1.2.2.0 resets my computer with a power surge message in bios if I choose any of the 3 "Gaming" Silent"or"OC" profiles)
> 
> If anyone has similar issues, I have made .pdf and a case about that here:
> https://rog.asus.com/forum/showthread.php?85302-AMD-295x2-on-ASUS-Maximus-VIII-Hero-Alpha-Causing-power-surges
> 
> Either way, my weird issue is:
> 
> After I started using windows 10.. I have a weird things going on with my GPU fans.
> 
> We all know that when you power on the computer the fans spinns like hell.. until windows 10 drivers kicks inn. It was the same in windows 7.
> 
> When windows 10 ha finished loading, and if I wait lets say.. around 2-3 minutes after windows 10 has loaded without doing anything at all.. or if I start working on the computer straight away.. does not matter..
> My ASUS Radeon 295x2 fans will start spin 100% again.. like they do before windows 10 drivers kicks inn and quiets them down.
> 
> There is a very easy fix... but it is enoying.. and sometimes I need to do this several times before the fans keeps quiet
> (until I start gaming of-course then they will naturally speed up to cool the GPU)
> 
> The easy workaround fix is: "Right click windows 10 start button" and restart windows 10 (not power down computer) just a restart. In 90% of the cases the fans will not spin up again now after 2-3 minuters in idle or browsing or whatever after entering windows 10.
> 
> I dont think I had this issue with windows 7 last year when I used Windows 7.
> I first upgraded from windows 7 to windows 10.
> But now, after swapped from Rampage 4 extreme and 3960x to a Skylake 6700K on ASUS maximus VIII Hero Alpha, I also have windows 10 PRO (64bit) on a USB stick. (So after clean install of windows 10 and a new build, issue is still there)
> 
> It is not related to mobo/CPU or system. As I have had this issue on 2 systems now. But both times with Windows 10.
> 
> Any info or help is very welcome


Try disabling fast boot and see if that has any effect. I haven't had any issues with my 295X2 and Windows 10. You might also want to try a fresh driver install and see if that does anything (driver sweeper -> latest beta drivers).


----------



## Impsak

Quote:


> Originally Posted by *Alex132*
> 
> Try disable fast boot and see if that has any affect? I haven't had any issues with my 295X2 and Windows 10. You might also want to try a fresh driver install and see if that does anything. (Driver sweeper -> latest beta drivers).


Hi, thanks for the feedback.

I will try disabling the BIOS fast boot later today when I get home.
What within the fast boot operation do you think might cause this? (Just wondering, for knowledge.)









Regarding uninstalling the driver... there is a very specific procedure I need to follow; if I don't, rolling back or installing new AMD drivers will crash my computer at game launch.

What I need to do, and what I do each time I uninstall AMD Crimson:

1. Control Panel / Programs and Features: uninstall all AMD Crimson components.
2. RESTART the computer.
3. Boot into safe mode and run DDU to uninstall everything AMD.
4. RESTART the computer.
5. Run CCleaner and clean the registry twice (it finds registry errors the second time as well). (I had a lot of issues using new/older drivers until I started using CCleaner.)
(Optionally, after CCleaner I also run Registry Booster 2016, but it works without this step too.)
6. RESTART the computer.

Now I can install older or newer AMD Crimson drivers and get them working.

(When I refer to DDU, i.e. Display Driver Uninstaller, this is the successor to Driver Sweeper.)









So the only thing left is to try disabling fast boot, I guess.

For uninstalling ASUS GPU Tweak II I need to follow the same procedure, but after uninstalling I also need to manually delete two files within the ASUS GPU Tweak II folder that are left over from the uninstallation.

Let's face it: this uninstalling of programs is not working. AMD's procedures do not mention anything about needing DDU or CCleaner.

Either way, that's a different issue.


----------



## Alex132

Fast boot can pause/hibernate drivers to make the booting process faster; there might be something in the AMD driver not starting up or shutting down properly in that process.


----------



## Impsak

Quote:


> Originally Posted by *Alex132*
> 
> Fastboot can pause/hibernate drivers to make the booting process faster --- there might be something not properly starting up / shutting down with the AMD driver in that process.


Ok.

If it's the Windows 10 fast boot you mean, I have disabled everything related to sleep/hibernate etc. Hate that stuff.

Or is it the BIOS fast boot you are talking about?


----------



## deggiebabeh

Greets, 295 folks o/ I have 3 of these bad boys. Recently, one has started getting very hot at the power connector, to the point that a bad smell/melting is going on. Anyone got a clue why this should be? I get the same result with a Corsair 750 (connected just to the graphics card) and a 1350 (can't remember the name). Halp!


----------



## Mega Man

A loose connection, generally.


----------



## deggiebabeh

Hmm. So it could just be a bad socket? The card runs perfectly well apart from the alarming smell :/ I have tried it with 2 PSUs/sets of cables, which I guess points to the socket on the GPU side. Could really do with the bloody thing working. :/


----------



## Mega Man

Maybe; maybe the wiring, maybe debris, but generally heat comes from a poor connection.


----------



## T3naci0usT

Hey, you guys having an issue with The Witcher 3 and CrossFire? Every time I enable it I get a lot of flickering in the menus, and the game doesn't stutter but it feels like it lags then catches back up. The easiest place to see it is while riding Roach, just riding around.


----------



## kaosstar

Quote:


> Originally Posted by *T3naci0usT*
> 
> Hey you guys having an issue with the Witcher 3 and crossfire? Everytime I enable it I get a lot of flickering from the menus and the game doesn't stutter but it feels like it lags then catches back up. The easiest place to see it is while riding roach, and just riding around.


I get poor CrossFire scaling in that game and some microstuttering. Are you using the latest drivers, did you disable AA, and have you manually reduced the tessellation in Crimson?


----------



## T3naci0usT

Yes, 16.5.3. I have tried everything, and my buddy who is running a 295X2 with a 4790K is having the same issues.


----------



## xarot

Quote:


> Originally Posted by *T3naci0usT*
> 
> Hey you guys having an issue with the Witcher 3 and crossfire? Everytime I enable it I get a lot of flickering from the menus and the game doesn't stutter but it feels like it lags then catches back up. The easiest place to see it is while riding roach, and just riding around.


The stuttering has been there since the launch of the game, and the developer and/or AMD never really cared about fixing it. Disable CFX or try using 60 fps/Vsync etc., but that doesn't entirely eliminate the stuttering; only disabling CrossFire will fix it.

It's sad that it was never fixed, as that game was a huge success.


----------



## T3naci0usT

You're telling me. I am fine with playing on one GPU, but I was trying to get a constant 60. Oh well, I still love the game. I was just seeing if anyone had any solutions.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xarot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *T3naci0usT*
> 
> Hey you guys having an issue with the Witcher 3 and crossfire? Everytime I enable it I get a lot of flickering from the menus and the game doesn't stutter but it feels like it lags then catches back up. The easiest place to see it is while riding roach, and just riding around.
> 
> 
> 
> The stuttering has been there since the lauch of the game, and developer and/or AMD didn't really care ever fixing it. Disable CFX or try using 60 fps/Vsync etc...but it doesn't entirely eliminate the stuttering. Only disabling Crossfire will fix that.
> 
> It's sad that it was never fixed, as that game was a huge success.
Click to expand...

Agreed, I love the Witcher 3 but unfortunately with the 295x2 it's a headache.

Best I could do was overclock one of the cores to 1100/1500, and that got me roughly 60 fps average at 1440p on a mix of Ultra/High settings.

Somewhat related news: I managed to find an EK nickel block + backplate for $35 USD.









Now I just need a pump for it......


----------



## rdr09

Quote:


> Originally Posted by *xarot*
> 
> The stuttering has been there since the lauch of the game, and developer and/or AMD didn't really care ever fixing it. Disable CFX or try using 60 fps/Vsync etc...but it doesn't entirely eliminate the stuttering. Only disabling Crossfire will fix that.
> 
> It's sad that it was never fixed, as that game was a huge success.


What cpu?


----------



## T3naci0usT

9590 at 5 GHz


----------



## xarot

Quote:


> Originally Posted by *rdr09*
> 
> What cpu?


Mine was 3970X/4960X/5960X...


----------



## rdr09

Quote:


> Originally Posted by *T3naci0usT*
> 
> 9590 at 5


Quote:


> Originally Posted by *xarot*
> 
> Mine was 3970X/4960X/5960X...


Check out this video. I showed it in another thread, and it seems the game was working for a while and then one or more patches broke it...






It includes a link to AMD's optimization guide for the game; not sure if it will help, 'cause I don't play the game. Note that even Nvidia owners (not SLI) are having a hard time tweaking this game. GL.

EDIT: You guys need to test your card(s) after every driver update using synthetic benches. I recommend Heaven 4.0, with Afterburner running in the background to check usage. The core clock at load in this bench should be a *straight line*, with occasional dips for cut scenes. If not, it needs to be fixed before diving into a game.

Also, make sure ULPS is disabled.
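A minimal sketch of that "straight line" check, assuming you export the core-clock samples (e.g. from an Afterburner log) into a list; the rated clock and thresholds below are illustrative values, not anything AMD specifies:

```python
def is_throttling(clock_samples, rated_mhz=1018, tolerance=0.05, max_dip_fraction=0.10):
    """Flag throttling if more than max_dip_fraction of the samples fall
    below the rated clock (minus a small tolerance). Occasional dips for
    cut scenes or scene loads are expected and allowed for."""
    floor = rated_mhz * (1 - tolerance)
    dips = sum(1 for c in clock_samples if c < floor)
    return dips / len(clock_samples) > max_dip_fraction

# A mostly-flat Heaven run with a couple of cut-scene dips passes:
print(is_throttling([1018] * 95 + [850] * 5))    # False
# Sustained drops to ~850 MHz indicate throttling:
print(is_throttling([1018] * 50 + [850] * 50))   # True
```

If the check trips on a synthetic bench, fix the cause (airflow, drivers, power limit) before blaming the game.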

@xarot, do you mind posting a link of your Firestrike run at stock?


----------



## Mega Man

AFAIK The Witcher 3 has sucked for everyone, more or less since launch, thanks to GameWorks ("GameNotWorks"). AMD tries to fix it, then each new patch breaks it again...


----------



## xarot

Quote:


> Originally Posted by *rdr09*
> 
> Checkout this video. I showed this in another thread and it seems that it was working for awhile and a patch or more is/are breaking it . .
> 
> 
> 
> 
> 
> 
> It includes a link to amd optimization for the game. not sure if it will help 'cause i don't play the game. Note that even nVidia owners (not sli) are having a hard time tweaking this game. gl
> 
> EDIT: You guys need to test your card(s) after every driver update using synthetic benches. I recommend Heaven 4.0 and run Afterburner in the background to check usage. The core clock at load in this bench should be a *straight line* with occasional dips for cut scenes. if not, then it needs to be fixed before diving to a game.
> 
> 
> 
> also, make sure ulps is disabled.
> 
> @xarot, do you mind posting a link of your Firestrike run at stock?


Sorry I don't have the Ares III installed anymore. Currently trying to sell it on a local forum.


----------



## rdr09

Quote:


> Originally Posted by *xarot*
> 
> Sorry I don't have the Ares III installed anymore. Currently trying to sell it on a local forum.


no biggie.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Afaik Witcher 3 has sucked for everyone due to gamenotworks more or less since launch. Amd tries to fix then the new patch continually broke it...


I'm actually playing it right now and it's up there as one of my favourite games. It's only CrossFire where it falls short. I'm currently running it on a mix of Ultra/High settings with HairWorks on, on an R9 Fury, and it's performing pretty well.


----------



## Mega Man

Excuse me, I screwed up. I meant only as far as multi-GPU setups... and I didn't say that...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Excuse me i screwed up. I meant only as far as multi gpu setups.... and I didn't say that......


That makes a bit more sense now. As for CrossFire support, I honestly don't know who is at fault there anymore... I'm going with a bit of column A and a bit of column B, tbh.


----------



## bose25

Hi everyone,

I've just bought two of these cards with custom watercooling and I'm having some issues that I'd really appreciate some advice about.

The short of it is that I can only get one R9 295X2 to work at a time; the one that isn't connected to a monitor at boot is simply not recognised by anything in Windows (neither Device Manager, GPU-Z, nor Afterburner, etc.).

I've confirmed both cards work perfectly well - multiple boots with the monitor connected to each card separately allows me to use them individually without problems, so this rules out an issue with the cards, PCI-e ports or motherboard.

There is also no problem with the power supply; I actually thought this was the cause to begin with and have tested three of them already.

I have tried numerous reinstallations of the drivers and AMD software. I've even gone as far as booting into Linux and upgrading my much-loved Win7 to Win10 in the hope that the non-monitor-connected card would be recognised, to no avail.

Does anyone have any experience with the same issue?


----------



## bobbavet

Gday Guys

Reporting in. Some may remember I bought a second-hand 295X2 and slapped a factory-refurbished Koolance water block on it. A totally cheaped-out setup.

The card is still going great. It performs well and gets ~60 fps on most titles @ 4K.
Truly can't wait for DX12 implementation in games.
Memory pooling will give this card another great kick along.

My only gripe would be single-GPU performance in War Thunder. I still get 60 fps @ 4K but no AA.
It wouldn't be a problem on a 32", and it looks stunningly beautiful with full AA on a 40", though 30 fps kinda sucks.
But those are the vagaries of multi-GPU I suppose, and hopefully they will be a thing of the past.

I might add I haven't overclocked the card. Can anyone suggest a modest GPU/mem OC to start with in Afterburner?

For anyone playing a UE4 title: you can now use CrossFire via the "Paragon" CrossFire profile. I am currently using it with Squad.

Does anyone know of a site that revisits benchmarking?
I am interested in how 295X2 and 295X2/290X tri-fire results have improved compared to the initial release benches.

Cheers Bob


----------



## MIGhunter

Curious about you guys' frame rates. I have a brand new i7 with 32 GB of memory and just bought a new Asus monitor. I'm now running it at 2560x1440 at 144 Hz and my second monitor at 1080p (just for internet and TS). While playing Black Desert Online, in crowded areas I am dropping to 25-30 fps.

Running a watercooling setup, so it isn't a heat issue.


----------



## Mega Man

Online games tend to be affected by other things, especially MMOs, so they are a poor sign of issues on the PC end.


----------



## pgetueza

Hey guys, 'sup? In CCC I could always opt to enable or disable CrossFire, but since I installed Crimson driver 16.6.1 the CrossFire option isn't there anymore. I tried using DDU and AMD's uninstall utility to no avail. I haven't tried reformatting or resetting Win 10 64-bit since it's kind of a hassle. Running the Heaven benchmark I usually got ~2.5k; now it's around 1.5k, although in 3DMark I get the performance expected from a dual GPU. GPU-Z says CrossFire is enabled. Weird. Unfortunately, most of the games in my library are affected. My 295X2 performs like a potato 290X... and per this thread: https://community.amd.com/message/2735207#comment-2735207

According to Matt,
Quote:


> The R9 295x2 and the 7990 do not have the CrossFire toggle, it applies to the Radeon Pro Duo only. To disable CrossFire on these GPUs you must use a game profile and set AMD CrossFire mode to disabled.
> How to Configure AMD CrossFire™ Using AMD Radeon™ Settings


----------



## GeekMan

Hi guys, couldn't find what I was looking for in this long thread. I have 2x XFX 295x2 cards and they experience thermal throttling. Are there any BIOSes that allow raising the inbuilt throttle limit?


----------



## Sgt Bilko

Quote:


> Originally Posted by *GeekMan*
> 
> Hi guys, couldn't find what I was looking for in this long thread. I have 2x XFX 295x2 cards and they experience thermal throttling. Are there any BIOSes that allow raising the inbuilt throttle limit?


No, the throttle limit is there to protect the AIO unit, not the GPU itself.


----------



## gupsterg

Quote:


> Originally Posted by *GeekMan*
> 
> Hi guys, couldn't find what I was looking for in this long thread. I have 2x XFX 295x2 cards and they experience thermal throttling. Are there any BIOSes that allow raising the inbuilt throttle limit?


The ROM has a GPU temp limit; if it is reached, the GPU is throttled. This is shown as the target GPU temp on the Overdrive page.

Then there is the PowerLimit in the ROM, which limits GPU amps/watts; when that limit is reached, the GPU is throttled. In the OS, when we add PL we increase these values by the % we apply.

See the Hawaii BIOS mod thread and you can mod the ROM as you like, but be aware the 295X2 has 2 ROMs, so each needs modding/flashing.
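As a rough sketch of how the Overdrive PowerLimit percentage scales the ROM's power ceiling (the 500 W base below is a made-up example for illustration, not the actual 295X2 ROM value):

```python
def effective_power_limit(base_watts: float, power_limit_pct: float) -> float:
    """Effective power ceiling after applying an Overdrive PowerLimit offset.

    A +50% PowerLimit raises the ROM-defined base ceiling by half;
    a negative value lowers it. The GPU throttles when it hits this ceiling.
    """
    return base_watts * (1 + power_limit_pct / 100.0)

# Hypothetical 500 W ROM limit with the slider at +50%:
print(effective_power_limit(500, 50))  # 750.0
```

This is why a modded ROM with a raised base value and a positive PL slider compound: both scale the same ceiling.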


----------



## Culbrelai

I have an XFX Radeon R9 295X2 "Core Edition" that also throttles at stock speeds, in Crysis 3. When it is not throttling, for about the first 5 minutes of gameplay, Crysis 3 runs at a very smooth 60 FPS on High (it could do Very High/Ultra at a sacrifice of fps and still be playable, but I prefer FPS), all in 4K.

It idles at ~46/47°C, then very quickly reaches 74°C and throttles to ~850 MHz. It kind of disgusts me how AMD can get away with selling something that does not perform as advertised at STOCK. I have no other GPUs, just a 6700K (also stock, no throttling here!) with a Cooler Master 212 EVO in push/pull, and this 295X2. I am going to try putting its rad in push/pull, but I don't think there is enough room at the top of the case, and from what I've read it only reduces temps by ~2°C.

It's in a Cooler Master Cosmos II with literally a dozen case fans, many of them high-performance Yate Loon/Scythe Ultra Kaze, all running at at least 70% speed.

Intake: 2 fans at the front, 2 fans on the door (blowing onto the 295X2), plus two fans blowing onto the lower HDD bays.
Exhaust: 3 fans at the top, 1 fan at the rear.

The GPU itself has two Yate Loon 120mm High Speeds (intake) blowing onto it. The rad vents out the top because the hoses are not long enough for any other configuration, except rear exhaust, which I imagine would be worse than top exhaust.

Luckily I only paid $600 for this on a Newegg blowout sale (in 2015), although it will probably be my last AMD card, considering my old GTX 670 4GB SLI configuration would routinely run at 100°C at load at stock (reference-design Zotacs) and those cards are still in perfect working order.

Apparently there is no way to raise the temperature limit via BIOS modding or anything? Is there a BIOS mod to make the fans/pumps run at 100%? It's very disappointing that the temperature envelope is so low compared to the 290X's.

Sigh. At least it doesn't throttle in Counter-Strike 1.6!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Culbrelai*
> 
> I have an XFX Radeon R9 295X2 "Core Edition" that also throttles at stock speeds, in Crysis 3. When it is not throttling for about the first 5 minutes of gameplay Crysis 3 runs at a very smooth 60 FPS on high (could do very high/ultra at a sacrifice of fps but still be playable but I prefer FPS) all in 4k.
> 
> Idling at ~46/47C, it then very quickly it reaches 74c and throttles to ~850mhz. It kind of disgusts me how AMD can get away with selling something that does not perform as advertised at STOCK. I have no other GPUs, just a 6700k (also stock, no throttling here!) with a Cooler Master 212 EVO in push/pull, and this 295X2. I am going to try to put it's rad in push/pull but I dont think there is enough room at the top of the case, and from what I've read it only reduces temps by ~2C.
> 
> It's in a Cooler Master Cosmos II, with literally a dozen case fans, many of them high performance Yate Loon/Scythe Ultra Kaze, all of these running at at least 70% speed or more.
> 
> Intake: 2 fans at front, 2 fans on the door (blowing onto the 295x2), additional two fans blowing onto the lower HDD bays as well.
> Exhaust: 3 fans at top, 1 fan on rear.
> 
> The GPU itself has two Yate Loon 120mm high speeds (intake) blowing onto it. The rad vents out the top because the hoses are not long enough for any other configuration, except rear exhaust which I imagine would be worse than top exhaust.
> 
> Luckily I only paid $600 for this on a Newegg blowout sale (in 2015) although it will probably be my last AMD card considering my old GTX 670 4GB SLI configuration would routinely run at 100c at load at stock (reference design Zotacs) and those cards are still in perfect working order.
> 
> Apparently there is no way to raise the temperature limit via BIOS modding or anything? Is there a BIOS mod to make the fans/pumps run at 100%? It's very disappointing that the temperature envelope is so low, compared to 290Xs
> 
> Sigh, At least it doesn't throttle in Counter Strike 1.6!


You never said what fans you're using on the rad. If you are using Ultra Kazes then you shouldn't have throttling, considering a single SP120 will cool the 295X2 enough that it doesn't get anywhere near throttling temps, and as I posted above, the temp limit is to prevent damage to the AIO unit, not the GPU itself.

You can mod a BIOS for it (the pump always runs at 100%), but the easiest way is just to plug the fan into your motherboard or a fan controller and run it that way (like most of us).

Side note: managed to score an EK block for my 295X2 for $35 USD; it was advertised with a backplate but none showed up.


----------



## Mega Man

Sure he should... his airflow is probably poor. Why would you exhaust a single-rad AIO when you can intake it?

Also, just because a fan is a "Yate Loon" does not mean it is a good rad fan. They make decent rad fans; they also make poor rad fans.

Blaming AMD for user error.







You then compare an air-cooled GPU to an AIO-cooled GPU; there are plenty of AMD air-cooled GPUs that won't throttle till 90°C.

As to the backplate







sucks sorry.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> Sure he should.. his airflow is probably poor. Why would you exhaust when you can intake.
> 
> Also just because a fan is a "yate loon" does not mean it is a good rad fan. They make decent rad fans. They make poor rad fans
> 
> Blaming amd due to user error
> 
> 
> 
> 
> 
> 
> 
> You then compare air cooled gpu to AIO cooled gpu there are plenty of amd air cooled gpus that won't throttle till 90c
> 
> As to the backplate
> 
> 
> 
> 
> 
> 
> 
> sucks sorry.


Mine was set to exhaust the entire time I was using it; then again, I was using an NF-F12 at 2000-3000 rpm.

It could be airflow problems, could be bad paste, etc.

Looking at it again though, with a dozen fans I'd love to see how they are arranged in that case.


----------



## Culbrelai

Spoiler: Warning: Spoiler!



You never said what fans your using on the rad and if you are using Ultra Kazes then you shouldn't have throttling considering a single SP120 will cool the 295x2 enough so it doesn't get anywhere near throttling temps and as I posted above the temp limit is to prevent damage to the AIO unit, not the GPU itself.



Whatever the default fan is that came with the rad. I would change it, but I am unsure of the 295X2's fan header and its max voltage/amperage; I know a lot of fan controllers have issues with higher-power fans and I don't want to burn something out, considering the 295X2 is already over spec for two 8-pin PCI-E connectors. If anyone could reassure me on this I could replace the stock rad fan with a YL High Speed or Slipstream or something.
Quote:


> Sure he should.. his airflow is probably poor. Why would you exhaust a single rad aio when you can intake it ?.


Do you not read? The tubes are not long enough for any other configuration besides rear exhaust (almost certainly worse than top exhaust).
Quote:


> Blaming amd due to user error rolleyes.gif You then compare air cooled gpu to AIO cooled gpu there are plenty of amd air cooled gpus that won't throttle till 90c


Not user error. Plugged in correctly. Stock cooling solution.

It should work at stock. It doesn't. It's quite black and white.

Intel has no problem with this: plop the stock cooler on, guaranteed to run 24/7 at stock. Same with Nvidia. Hell, even AMD CPUs.

AMD GPUs? Not so much.

EDIT: Just put it in push/pull with the stock fan pushing and a Scythe Slipstream (https://www.amazon.com/gp/product/B000W7P4XE/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1) pulling.

It now idles at ~41°C and loads at 72°C, although I didn't test for very long and it is only midday (77°F).
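Since the thread mixes Fahrenheit ambients with Celsius GPU readings, the standard conversion helps when comparing them:

```python
def f_to_c(fahrenheit: float) -> float:
    """Convert a Fahrenheit temperature to Celsius."""
    return (fahrenheit - 32) * 5 / 9

# 77°F ambient is a 25°C room:
print(round(f_to_c(77)))  # 25
```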


----------



## Sgt Bilko

Don't bother plugging the fan into the card; plug it directly into the motherboard and control it from there.


----------



## Mega Man

I do read. Can you understand what I am saying? Or do I need to expand ?

I'll expand further.

You don't need to move the rad you can simply flip the fan.


----------



## Alex132

I have no case fans and my CPU exhaust fans sit at around 60-240 rpm. My GPU radiator fans sit at a constant 800 rpm and I don't exceed 65°C, rarely even 60°C, in anything. MX-4 on the dies, and I regularly clean my case/radiator.


----------



## Culbrelai

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Don't bother plugging the fan into the card, plug it directly into the motherboard and control it from there.


So the card will run if it doesn't detect a fan plugged into it? Alright, I'll replace the stock fan so it'll have two high-CFM fans, although given that adding one already made very little difference, I don't see how it will help. It's just a really poor cooling solution; I should not have believed the hype.
Quote:


> I have no case fans and my CPU exhaust fans sit at around 60-240rpm. My GPU radiator fans sit at a constant 800rpm and I don't exceed 65, rarely even 60, degrees celsius in anything. MX4 on the dies and I regularly clean my case/radiator.


Do you live in Antarctica? It's currently ~80°F in my room atm. I guess AMD intends you to only use the card in winter, with the window open...
Quote:


> I do read. Can you understand what I am saying? Or do I need to expand ?
> 
> I'll expand further.
> 
> You don't need to move the rad you can simply flip the fan.


Intake from the top is a terrible idea: heat rises and you'd be fighting it the entire way, not to mention the dust it would be sucking in. I have always been told top/back are exhaust and front/sides are intake; the bottom, for cases that have fans there (mine does not), can go either way.

Also, here is le system



EDIT:

Also, Ninja'ed.


----------



## Mega Man

Yeah. Not user error at all.

Unlike Nvidia's, AMD reference boards are beastly and over-engineered. Their cooling is top-tier for air as well, IMO; not so much for water.


----------



## Sgt Bilko

Look, I ran mine with ONE NF-F12 and it never went over 69°C in summer (an Australian summer, mind you).

I have no A/C either.

It's an excellent cooling solution when used properly; AMD can't control how you use it.


----------



## Mega Man

Quote:


> Originally Posted by *Culbrelai*
> 
> Intake from the top is a terrible idea, heat rises and you'd be fighting it the entire way, not to mention that dust it will be sucking in. I have always been told top/back are exhaust, front/sides are intake. Bottom for cases that have them (mine does not) can be either way.
> 
> Also, here is le system
> 
> 
> 
> EDIT:
> 
> Also, Ninja'ed.


Didn't see this post.

"Heat rises" just needs to die.

First: heat does NOT rise.

Less dense air is displaced by more dense air; it is a form of buoyancy, that is all (heat affects density).

Second: unless you are running a 100% PASSIVE case, HEAT DOES NOT RISE, IT GOES WHERE YOU WANT IT TO.

If heat does rise in your case and you are not running a fully passive system, then you have poor airflow and again need to work on it.

Want proof? Use a heater. Put a fan in front of said heater blowing down: does the heat go up or down?

In review (in general): intake at the top of a PC case is BETTER for rads; it gets fresh, cool air, especially when it exits the rear of your case and is pushed outward and AWAY from your case. Unless, of course, you put your case in a hole with no ventilation, which again comes down to user error.

Oh yeah, and in a *PC CASE* hot air does not rise; it goes where I want it to, via fans.

More fun examples: in many states a common place for a furnace is the attic. If heat only rises, how does that house get heat when the furnace is in the topmost room of the house?

Commercial buildings (NOT ALL, but most) that use forced-air systems for heating and cooling: are the vents usually in the ceiling or the floor?

And in houses where the vents are in the ceiling rather than the floor, how does the house heat up in winter, if "as we all know" heat rises?

Should I continue?

As to fighting rising heat: no, there is no "fight". Quite literally, a butterfly's wings would "fight" more than "rising heat" does.

As to dust: they make filters, and cans of compressed air.

Or live with it; your poor choice of location limits your usage. AMD did not.


----------



## Culbrelai

Quote:


> Look, I ran mine with ONE NF-F12 and it never went over 69c in Summer (Australian summer mind you).
> 
> I have no A/C either.
> 
> It's an excellent cooling solution when used properly, AMD can't control how you use it


Proof?

I ran 4 395x2s in super quad crossfire once.

Isn't make believe fun?

My configuration is more than adequate: huge case, standard airflow arrangement. The case is not in any cubby; it is on top of my desk, with the rear exhaust fan less than a foot from my open window. Thus, AMD's cooling solution is subpar at dissipating the significant heat output of its hardware. While I would usually try to avoid arguing semantics with intransigent AMD fans, multiple reputable sources agree with my fan setup.

http://www.gamersnexus.net/guides/692-how-many-case-fans-should-you-have

https://blog.neweggbusiness.com/over-easy/pc-cooling-how-to-set-up-computer-case-fans/

http://www.extremetech.com/computing/128313-extremetechs-guide-to-air-cooling-your-pc

I find it strange that I cannot hear the pump increase in speed as temperatures increase; it sounds the same as at idle with my ear 5 inches from the card itself. Further, I'm wondering if anyone has had any temperature drops with vBIOS updates/mods/etc., without me having to go through ~900 pages to find out.


----------



## Mega Man

AMD fans, right... no, I only have quadfire 295X2s, a total of FIVE 290s, and several other builds, and I actually know AMD's systems. Oh yeah, and there's this entire thread. Hmm... you're right, we are wrong...


----------



## Alex132

Quote:


> Originally Posted by *Culbrelai*
> 
> Do you live in Antarctica? It's currently ~80F in my room atm. I guess AMD intends you to only use the card in winter, with the window open...


I live in South Africa.

It gets pretty hot here.
Quote:


> Originally Posted by *Culbrelai*


The card is fighting for air so badly... your issue is obvious. If you stopped being such a dick to people trying to help you here, I'm sure you'd get a proper answer and some help much faster. But it's your loss; every single other person's R9 295X2 is working perfectly fine.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Culbrelai*
> 
> Quote:
> 
> 
> 
> Look, I ran mine with ONE NF-F12 and it never went over 69c in Summer (Australian summer mind you).
> 
> I have no A/C either.
> 
> It's an excellent cooling solution when used properly, AMD can't control how you use it
> 
> 
> 
> Proof?
> 
> I ran 4 395x2s in super quad crossfire once.
> 
> Isn't make believe fun?
> 
> My configuration is more than adequate. Huge case. Standard airflow arrangement. The case is not in any cubby. It is on top of my desk, with the rear exhaust fan less than a foot from my open window. Thus, AMD's cooling solution is sub par at dissipating the significant heat output of it's hardware. While I would usually try to avoid arguing semantics with intransigent AMD fans, multiple reputable sources agree with my fan setup.
> 
> http://www.gamersnexus.net/guides/692-how-many-case-fans-should-you-have
> 
> https://blog.neweggbusiness.com/over-easy/pc-cooling-how-to-set-up-computer-case-fans/
> 
> http://www.extremetech.com/computing/128313-extremetechs-guide-to-air-cooling-your-pc
> 
> I find it strange I cannot hear any pump increase in speed as temperatures increase, it sounds the same as at idle, with my ear 5 inches from the card itself. Further I'm wondering if anyone has had any temperature drops with vbios updates/mods/etc, without me having to go through ~900 pages to find out.
Click to expand...

I'll see if I can dig up some temp readings when I'm in front of my keyboard and not on mobile.

Food for thought: if no one else is having issues and you are, that should tell you something.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Culbrelai*
> 
> Quote:
> 
> 
> 
> Look, I ran mine with ONE NF-F12 and it never went over 69c in Summer (Australian summer mind you).
> 
> I have no A/C either.
> 
> It's an excellent cooling solution when used properly, AMD can't control how you use it
> 
> 
> 
> Proof?
> 
> I ran 4 395x2s in super quad crossfire once.
> 
> Isn't make believe fun?
Click to expand...



^ That was after a 2 hour session of Tomb Raider with the 295x2 + R9 390x DD

GPU1: 46c (295x2)
GPU2: 49c (295x2)
GPU3: 54c (390x)

I also started to do some fan testing of my own but never completed it:


Spoiler: NF-F12 iPPC 295x2 Testing



All runs: ambient 27°C, 10-minute AIDA64 GPU Stability Stress Test.

| Noctua NF-F12 Industrial PPC configuration | GPU1 | GPU2 |
|---|---|---|
| 2000 rpm x1 (Push) | 69°C | 74°C (no throttle) |
| 2500 rpm x1 (Push) | 67°C | 71°C |
| 3000 rpm x1 (Push) | 64°C | 68°C |
| 2000 rpm x2 (Push/Pull) | 64°C | 68°C |
| 2500 rpm x2 (Push/Pull) | 62°C | 66°C |
| 3000 rpm x2 (Push/Pull) | 59°C | 63°C |



Temps peaked and plateaued after 6-7mins so I let it run for 10 and called it.

Rad was set up as rear exhaust in my CM Storm Trooper; see the drop in temps?
Quote:


> I find it strange I cannot hear any pump increase in speed as temperatures increase, it sounds the same as at idle, with my ear 5 inches from the card itself. Further I'm wondering if anyone has had any temperature drops with vbios updates/mods/etc, without me having to go through ~900 pages to find out.


As I explained before, the pump always runs at 100% in Asetek coolers. Instead of trying to flash a new BIOS (which won't help), why don't you at least try to sort out your airflow first?


----------



## gupsterg

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I posted above the temp limit is to prevent damage to the AIO unit, not the GPU itself.


If by "the temp limit" you mean the target GPU temperature found on the OverDrive page, that limit concerns the GPU, not the AIO:

a) I know this because the PowerPlay table for the 295X2 is the same as the 290/X and 390/X.
b) The Linux driver also has comments highlighting what the value is.
c) I have done a few custom ROMs for the 295X2 for members, and so have the results from modifying this value.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> instead of trying to flash a new BIOS (which won't help) why don't you at least try and sort out your airflow first?


I concur that the member seeking help should first improve airflow, etc., but I don't agree that a modified BIOS wouldn't help.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I posted above the temp limit is to prevent damage to the AIO unit, not the GPU itself.
> 
> 
> 
> If you are meaning target GPU temperature found in OverDrive page as "the temp limit" this is concerning the GPU not AIO:-
> 
> a) I know this as the PowerPlay for 295X2 is same as 290/X 390/X.
> b) the Linux driver also has comments within it as to highlight what value is.
> c) I have done a few custom ROMs for 295X2 for members and so have the results from modifying this value.
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> instead of trying to flash a new BIOS (which won't help) why don't you at least try and sort out your airflow first?
> 
> Click to expand...
> 
> I would concur that first member seeking help improve airflow, etc but do not concur a modified bios wouldn't help.
Click to expand...

I was actually meaning the 75c hard limit, and I don't doubt you can modify it, but the pump blocks would get awfully toasty.

I should rephrase a little: a modified BIOS probably would help, but with him not changing anything else at all, I can't see it improving his situation.


----------



## Mega Man

Correct. Lack of heat rejection just means it will climb to a higher temperature and then still throttle, just at that higher point.


----------



## Culbrelai

Quote:


> The card is fighting for air so badly.... your issue is obvious.


If it is so obvious then perhaps you'd be willing to share?

No, I am not moving it to rear exhaust (because it would suck in hot CPU air) and I am not changing the top to intake for reasons stated previously.
Quote:


> Food for thought: If no one else is having issues yet you are then that should tell you something.


https://www.techpowerup.com/forums/threads/r9-295x2-powerplay-core-clock-throttling.207863/
http://www.tomshardware.com/answers/id-2273742/unlock-max-temp-295x2.html
http://www.tomshardware.com/answers/id-2202794/295x2-throttling-low-frames.html
https://hashcat.net/forum/thread-4023.html

https://www.reddit.com/r/3crttz/r9_295x2_temperature_is_way_too_high/
Quote:


> Originally Posted by *gupsterg*
> 
> If you are meaning target GPU temperature found in OverDrive page as "the temp limit" this is concerning the GPU not AIO:-
> 
> a) I know this as the PowerPlay for 295X2 is same as 290/X 390/X.
> b) the Linux driver also has comments within it as to highlight what value is.
> c) I have done a few custom ROMs for 295X2 for members and so have the results from modifying this value.


See now this is helpful. What would you want to make one for me? I've got paypal =P


----------



## Sgt Bilko

Quote:


> Originally Posted by *Culbrelai*
> 
> Quote:
> 
> 
> 
> Food for thought: If no one else is having issues yet you are then that should tell you something.
> 
> 
> 
> Quote:
> 
> 
> 
> https://www.techpowerup.com/forums/threads/r9-295x2-powerplay-core-clock-throttling.207863/
> http://www.tomshardware.com/answers/id-2273742/unlock-max-temp-295x2.html
> http://www.tomshardware.com/answers/id-2202794/295x2-throttling-low-frames.html
> https://hashcat.net/forum/thread-4023.html
> 
> https://www.reddit.com/r/3crttz/r9_295x2_temperature_is_way_too_high/
> 
> 
> Click to expand...
Click to expand...

I can find a bunch of random forum links where people are complaining about something; it happens on an hourly basis.









You don't need to be so hostile about this; we are trying to help you, but you're making it hard by insulting us and not taking the advice that you asked for.

You noticed that the temps went down when you added another (mismatched) fan to the rad right? doesn't that tell you that the cooling is adequate but it needs more airflow?

At the end of the day I honestly don't care what you do, but I hope you realise that we are trying to help (even Mega in his own funny way). If you already have a preconceived notion and were just looking for people to agree with you, then I'm sorry, but this isn't the place for that.

EDIT:

You seem like a person who would benefit from a visual medium, so here's something you'll like:





.


----------



## SAFX

From AMD's site, the current driver list is below. I downloaded/installed the HOTFIX since, given the file size, I assumed it contained Crimson 16.3.2 plus the hotfix itself, but after installing, the version I see is *16.2*? It's been a while since I updated drivers (I was on 15.x before). What am I missing; is this normal brain damage for not keeping up with the latest drivers?









Code:

HOTFIX       16.7.2 (296 mb)
Crimson      16.3.2 (305 mb)
Drivers only 16.3.2 (12 mb)


----------



## Culbrelai

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I can find a bunch of random forum links where people are complaining about something, happens on an hourly basis
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You don't need to be so hostile about this, we are trying to help you but you're making it hard by insulting us and not taking the advice that you're asking for......
> 
> You noticed that the temps went down when you added another (mismatched) fan to the rad right? doesn't that tell you that the cooling is adequate but it needs more airflow?
> 
> At the end of the day I honestly don't care what you do but I hope you realise that we are trying to help (even Mega in his own funny way) but if you already have a preconcived notion and were jst looking for people to agree with you then I'm sorry but this isn't the place for that.
> 
> EDIT:
> 
> You seem like a person that would benefit from a visual medium so here's something you'll like:
> 
> .


I fixed it somewhat. Replaced the stock fan with another Scythe Slipstream and cleaned the dust off the top cover. Now it just barely gets to 72C. Still a pretty crappy cooling solution imo. Shame on AMD.


----------



## Mega Man

OCN won't let me edit stuff :/

https://encrypted-tbn1.gstatic.com/images?q=tbn:ANd9GcSbmqTm4Wqfm6-TgfHHcT5rPfg3d2iqeZ67n88OJrcUefsh-IUB-g


----------



## Paul17041993

Quote:


> Originally Posted by *Culbrelai*


The 295's radiator is only getting 50% of the required airflow, at best. The vent where the orange fan is should be blocked completely, and you'll want high-airflow fans on both the front and side panels. That'll boost its airflow to 75%, but the remaining 25% is still unavailable unless you remove the CPU heatsink and replace it with another AIO radiator or a custom loop.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Culbrelai*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I can find a bunch of random forum links where people are complaining about something, happens on an hourly basis
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You don't need to be so hostile about this, we are trying to help you but you're making it hard by insulting us and not taking the advice that you're asking for......
> 
> You noticed that the temps went down when you added another (mismatched) fan to the rad right? doesn't that tell you that the cooling is adequate but it needs more airflow?
> 
> At the end of the day I honestly don't care what you do but I hope you realise that we are trying to help (even Mega in his own funny way) but if you already have a preconcived notion and were jst looking for people to agree with you then I'm sorry but this isn't the place for that.
> 
> EDIT:
> 
> You seem like a person that would benefit from a visual medium so here's something you'll like:
> 
> .
> 
> 
> 
> I fixed it somewhat. Replaced stock fan with another Scythe Slipstream. Cleaned off top cover's dust. Now it just barely gets to 72C. Still a pretty crappy cooling solution imo. Shame on AMD.
Click to expand...

Is this set as exhaust or intake?

Because, as Paul pointed out, the CPU cooler is sucking away air that the 295x2 needs; try it as intake and we'll see how you fare.

And again, you can blame AMD all you want, but in a properly ventilated case the 295x2 (even with the stock fan) will not thermal throttle.


----------



## Mega Man

In other news, the ocn-aquaero-owners-club had some issues yesterday; my posts were not posting and I couldn't edit them. But Sgt Bilko is correct: EBKAC, not "shame on AMD".


----------



## enovid

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> ^ That was after a 2 hour session of Tomb Raider with the 295x2 + R9 390x DD
> 
> GPU1: 46c (295x2)
> GPU2: 49c (295x2)
> GPU3: 54c (390x)
> 
> I also started to do some fan testing of my own but never completed it:
> 
> 
> Spoiler: NF-F12 iPPC 295x2 Testing
> 
> 
> 
> Noctua NF-F12 Industrial PPC 2000rpm x 1 (Push)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:69c
> GPU2:74c (No throttle)
> 
> Noctua NF-F12 Industrial PPC 2500rpm x 1 (Push)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:67c
> GPU2:71c
> 
> Noctua NF-F12 Industrial PPC 3000rpm x 1 (Push)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:64c
> GPU2:68c
> 
> Noctua NF-F12 Industrial PPC 2000rpm x 2 (Push/Pull)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:64c
> GPU2:68c
> 
> Noctua NF-F12 Industrial PPC 2500rpm x 2 (Push/Pull)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:62c
> GPU2:66c
> 
> Noctua NF-F12 Industrial PPC 3000rpm x 2 (Push/Pull)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:59c
> GPU2:63c
> 
> 
> 
> Temps peaked and plateaued after 6-7mins so I let it run for 10 and called it.
> 
> Rad was setup as Rear Exhaust in my CM Storm Trooper, see the drop in temps?
> As I explained before the pump always runs at 100% in Asetek coolers, instead of trying to flash a new BIOS (which won't help) why don't you at least try and sort out your airflow first?


I've spent the last six hours reading through hundreds of comments on this thread trying to find a solution to my throttling issues (which is something I didn't notice for over two years). Sgt Bilko, I've decided to buy a set of Noctua NF-F12 Industrial PPC 3000rpm fans, but there are a few variations of it listed on amazon and I was wondering if you could direct me to the one you bought.

I also want to say that I really appreciate you guys being so active and helpful on this forum. Just in the past couple of hours I've been able to reduce the amount of throttling to near nothing by following some of the advice on here - moved pc from the floor onto my desk, reduced core voltage by 40mv, and removed one side panel. Most importantly, I realized that my rig has absolutely terrible airflow - which is something I'm going to fix asap. You guys are awesome, thanks again.


----------



## Mega Man

You didn't ask but

http://m.newegg.com/Product/index?itemnumber=N82E16835608026

Iirc Sgt has this fan

http://m.newegg.com/Product/index?itemnumber=N82E16835608052

I would look into GTs (Gentle Typhoons) as well.

The 2150rpm ones that are out are awesome and low-noise.

You can now find them in both PWM and voltage-controlled versions, in all black or the grey/black, and you can go higher speeds if you want.

Coolerguys, DazMode and Performance-PCs sell them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *enovid*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> ^ That was after a 2 hour session of Tomb Raider with the 295x2 + R9 390x DD
> 
> GPU1: 46c (295x2)
> GPU2: 49c (295x2)
> GPU3: 54c (390x)
> 
> I also started to do some fan testing of my own but never completed it:
> 
> 
> Spoiler: NF-F12 iPPC 295x2 Testing
> 
> 
> 
> Noctua NF-F12 Industrial PPC 2000rpm x 1 (Push)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:69c
> GPU2:74c (No throttle)
> 
> Noctua NF-F12 Industrial PPC 2500rpm x 1 (Push)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:67c
> GPU2:71c
> 
> Noctua NF-F12 Industrial PPC 3000rpm x 1 (Push)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:64c
> GPU2:68c
> 
> Noctua NF-F12 Industrial PPC 2000rpm x 2 (Push/Pull)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:64c
> GPU2:68c
> 
> Noctua NF-F12 Industrial PPC 2500rpm x 2 (Push/Pull)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:62c
> GPU2:66c
> 
> Noctua NF-F12 Industrial PPC 3000rpm x 2 (Push/Pull)
> Ambient 27c
> 10mins AIDA64 GPU Stability Stress Test
> GPU1:59c
> GPU2:63c
> 
> 
> 
> Temps peaked and plateaued after 6-7mins so I let it run for 10 and called it.
> 
> Rad was setup as Rear Exhaust in my CM Storm Trooper, see the drop in temps?
> As I explained before the pump always runs at 100% in Asetek coolers, instead of trying to flash a new BIOS (which won't help) why don't you at least try and sort out your airflow first?
> 
> 
> 
> I've spent the last six hours reading through hundreds of comments on this thread trying to find a solution to my throttling issues (which is something I didn't notice for over two years). Sgt Bilko, I've decided to buy a set of Noctua NF-F12 Industrial PPC 3000rpm fans, but there are a few variations of it listed on amazon and I was wondering if you could direct me to the one you bought.
> 
> I also want to say that I really appreciate you guys being so active and helpful on this forum. Just in the past couple of hours I've been able to reduce the amount of throttling to near nothing by following some of the advice on here - moved pc from the floor onto my desk, reduced core voltage by 40mv, and removed one side panel. Most importantly, I realized that my rig has absolutely terrible airflow - which is something I'm going to fix asap. You guys are awesome, thanks again.
Click to expand...

You're welcome mate









Happy to see someone is finding the info useful









As per the fans: the ones that Mega Man linked (IP52 iPPC 3000rpm NF-F12s) are the ones I have, but I'll warn you now, they are loud buggers at 2500rpm or above.

When I ordered them there were no Gentle Typhoons to be found anywhere (without costing more than a 295x2), otherwise I would have gone that way, so that's what I'm recommending to you.

Quote:


> Originally Posted by *Mega Man*
> 
> You didn't ask but
> 
> http://m.newegg.com/Product/index?itemnumber=N82E16835608026
> 
> Iirc Sgt has this fan
> 
> http://m.newegg.com/Product/index?itemnumber=N82E16835608052
> 
> I would look into gts as well
> 
> 2150 rpm fans that are out are awesome and low noise.
> 
> You can now find them in both pwm and voltage controlled.also all black or the Grey black.you can go higher speeds of you want
> 
> Coolerguys,Daz mode and performance Pcs sells them


^ What he said

Also, if you do want to go for iPPC NF-F12s, make sure you get the IP52 variants and not the IP67 ones: the IP67 3000rpm fans are rated for 24v, and in a PC they will only reach 1600rpm or so, unlike the IP52 3000rpm fans, which are 12v and will hit around 3000rpm.


----------



## enovid

@Sgt Bilko and @Mega Man, exactly which fan do you recommend?

It would be perfect if I could get a specific recommendation since you've already done the homework.









*My current setup:*

Evo 212 for cpu
Stock gpu rad
One extremely cheap fan that came with my $30 case.

*Confusions:*

What is a gts?
Is <3000 rpm ever not enough?
I found some fans on amazon called gentle typhoon but there are a few variants.
Is noise an issue if I exclusively fire up the pc for gaming (how annoying can it be)?
I'm also ignorant of pwm vs voltage controlled.


----------



## Culbrelai

You guys think my Corsair AX1200 1200w Gold PSU could handle two 295X2s?

It's got nothing else of note to run besides my 6700K (not overclocked) on the Maximus VIII Hero, 5 hard drives (2 of which are SSDs) and about a dozen fans.

Barring that, what's the best 3rd card from the current gen to crossfire with a 295x2, an R9 390?


----------



## Mega Man

Quote:


> Originally Posted by *enovid*
> 
> @Sgt Bilko and @Mega Man, exactly which fan do you recommend?
> 
> It would be perfect if I could get a specific recommendation since you've already done the homework.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *My current setup:*
> 
> Evo 212 for cpu
> Stock gpu rad
> One extremely cheap fan that came with my $30 case.
> 
> *Confusions:*
> 
> What is a gts?
> Is <3000 rpm ever not enough?
> I found some fans on amazon called gentle typhoon but there are a few variants.
> Is noise an issue if I exclusively fire up the pc for gaming (how annoying can it be)?
> I'm also ignorant of pwm vs voltage controlled.


First, GTs are Gentle Typhoons.

I would need more info from you: do you prefer silence or performance?

From your answers so far I would assume silence. 2150rpm imo will be fine (the top end of the low-speed fans); I can look into links after dinner.

Voltage controlled means the speed is controlled by lowering the voltage; PWM means it is controlled by a PWM signal (voltage control uses a 3-pin fan connector, PWM a 4-pin).
Noise is a personal preference. Some don't care; some can't stand it.
Assuming you are going to let the card control the fan, I would go with voltage control iirc.
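To make the 3-pin vs 4-pin distinction concrete, here is a toy sketch: 4-pin PWM keeps the motor at a constant 12 V and varies a duty-cycle signal, while 3-pin control lowers the supply voltage itself. All numbers below are assumed illustration values, not real fan data; real fan response is nonlinear and model-specific.

```python
# Toy model of the two fan-control schemes (assumed figures only).
MAX_RPM = 2150.0  # e.g. the top speed of a 2150rpm Gentle Typhoon
MIN_RPM = 400.0   # assumed minimum stable speed

def rpm_from_pwm(duty_pct: float) -> float:
    """4-pin PWM: speed set by duty cycle while the supply stays at 12 V."""
    return MIN_RPM + (MAX_RPM - MIN_RPM) * duty_pct / 100.0

def rpm_from_voltage(volts: float) -> float:
    """3-pin control: speed scales (roughly) with the supply voltage."""
    return MAX_RPM * volts / 12.0

print(rpm_from_pwm(50))      # 1275.0
print(rpm_from_voltage(7.0)) # ~1254, the classic "7 V mod" speed
```

The practical upshot is the same one Mega Man gives: which scheme you want depends on what is driving the fan header, since a 3-pin (voltage) header cannot speed-control a PWM-only fan and vice versa.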

Quote:


> Originally Posted by *Culbrelai*
> 
> You guys think my Corsair AX1200 1200w Gold PSU could handle two 295X2s?
> 
> It's got nothing really else to run of note besides my 6700k (unoverclocked) on the Maximus VIII Hero, 5 hard drives (2 of which are SSDs) and about a dozen fans?
> 
> Baring that, what's the best 3rd card to crossfire with a 295x2 from the current gen, a R9 390?


Worth a shot. I think you will be fine, but 500w x2 (assuming you have enough amps; iirc it is single rail, so you should) will leave you with 200w for everything else.
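The arithmetic behind that, as a quick sketch (the ~500w per-card figure is an assumption for the 295X2's board power; actual draw varies with load and overclock):

```python
# Quick PSU head-room sketch. Assumed figures: 1200 W PSU (AX1200),
# ~500 W board power per R9 295X2; real draw varies with load/OC.
PSU_WATTS = 1200
CARD_WATTS = 500

def headroom(num_cards: int, other_load: int = 0) -> int:
    """Watts left for CPU, drives and fans after the GPUs and other load."""
    return PSU_WATTS - num_cards * CARD_WATTS - other_load

print(headroom(2))       # 200 W left for everything else
print(headroom(2, 150))  # 50 W spare with a ~150 W CPU/system load
```

With a stock 6700K, drives and fans that 200w budget is tight but plausible; overclocking either card would eat into it quickly.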


----------



## enovid

Quote:


> Originally Posted by *Mega Man*
> 
> I would need more info from you do you prefer silence or performance
> 
> From your answers already I would assume silence. 2150rpm imo well be fine (top end of low speed fans I can look into links after dinner).


I prefer performance. I always use noise cancelling earbuds, so I don't think noise will be a big deal - unless it's unbearable for my housemates who sleep in a room that shares a wall with mine, but I doubt they're that loud.

Thanks for clearing up the other confusions.


----------



## Mega Man

You can look into the high speed gts. They do get noisy though


----------



## enovid

Quote:


> Originally Posted by *Mega Man*
> 
> You can look into the high speed gts. They do get noisy though


@Mega Man @Sgt Bilko

I can't seem to find any gts faster than 2150 rpm. They also don't seem nearly as popular as Noctua. Why do you prefer them?


----------



## Alex132

I'd recommend either ~1850rpm Gentle Typhoons that you control via motherboard PWM, or EK Vardar ER 120mm fans. I have the ER ones; they can go from ~200rpm to 2200rpm. Amazing range of control.


----------



## Sgt Bilko

Quote:


> Originally Posted by *enovid*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> You can look into the high speed gts. They do get noisy though
> 
> 
> 
> @Mega Man @Sgt Bilko
> 
> I can't seem to find any gts faster than 2150 rpm. They also don't seem nearly as popular as Noctua. Why do you prefer them?
Click to expand...

The NF-F12s are a very good choice for an all purpose fan and Noctua have an amazeballs warranty but for radiators GTs (Gentle Typhoons) or EK Vardars are the way to go.
Quote:


> Originally Posted by *Alex132*
> 
> I'd recommend either ~1850rpm Gentle Typhoons that you control via motherboard PWM.
> 
> Or EK Vardar ER 120mm fans. I have the ER ones, they can go from ~200rpm to 2200rpm. Amazing range of control.


Agreed


----------



## SAFX

It's been fun, but all good things come to an end: just scored an EVGA 1080 SC for $649, not bad considering prices on eBay/Amazon are 10-30% higher than retail.

Still got my 295x2, king of the hill for almost 2 years; sad to see it go. Anyway, see you guys around on the forums.


----------



## SUB750T

So I have two R9 295x2s (AMD reference cards) running in crossfire, and yesterday my PC shut down suddenly. After troubleshooting I found that whenever my second card is plugged in (with two 8-pin power cables) my PC will not start. There is a red light on my AX1500i.
The card seems to be fine while both power cables are disconnected and the power is coming from the PCIe slot; I can see the card's memory fan spinning. I moved the card to the main PCIe slot (where my other R9 295x2 is working fine) and still experienced the same issue...

What could have gone bad? Power delivery, the water pump, or something else?


----------



## Mega Man

IIRC you have to turn off multi-rail mode via the Corsair software to use them.


----------



## jumpman971

Hi, I've had an R9 295x2 for a year now and I get stuttering + screen tearing in every game with crossfire activated.
I have what looks like a temp fix (which I've been using for a year now, since I've not found a real one): I can play with crossfire activated by creating a custom AMD CCC profile. I turn off crossfire and launch my game. Then I go back to Windows (with the game still running), activate crossfire, and it works. You will probably think this can't be working, but for example the Arkham games never work with crossfire. How do I know? Because when I do my trick there's screen flickering, and then, if I go back to Windows (with the Arkham game still running), deactivate crossfire, and return to the game, the flickering is gone.
The problem with this trick is that I have to do it every time for every game, and I'm not sure crossfire is fully working either (for example, I don't get the crossfire icon in the top right corner of my game screen, but my 2nd GPU does seem to work, because its temperature is almost the same as the first's and there is a visible change in game).
So I wanted to know if someone could help me, because I'm getting tired of this GPU. I was on Nvidia before and wanted to try AMD, and I'm very disappointed with this card (OK, the problem comes from crossfire, but AMD doesn't fix it???).


----------



## Mega Man

Perfectly reasonable.

First, most GameWorks titles have issues. It is common with GameWorks *tin foil hat*

I don't remember stuttering that bad in some of the Batman games, but it has been a while; I would have to check now.

I have not tried the newest Batman. AFAIK both sides have issues with it. There are several things you can do that may help; one is to disable ULPS. There's an AMD thread that I don't have a link to atm (on mobile); tsm iirc wrote it, or Sugarhell.

Dual-GPU cards, I have found, are a different beast than single GPUs.

I prefer single GPUs (in quadfire of course)
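For anyone unfamiliar with the ULPS tweak mentioned above: on Windows it is commonly done by setting every `EnableUlps` registry value to `0` under the display-adapter class keys and rebooting. A sketch of what one entry looks like; the `0000` subkey number here is hypothetical, so search the registry for `EnableUlps` to find the real subkeys on your system (one per GPU device), and back up the registry first:

```
Windows Registry Editor Version 5.00

; Hypothetical subkey number -- search for "EnableUlps" to locate
; the real ones (the class GUID below is the display-adapter class).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Some tools, MSI Afterburner for instance, also expose a ULPS toggle in their compatibility settings that edits the same values, which is the safer route if you are not comfortable in the registry.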


----------



## jumpman971

Quote:


> Originally Posted by *Mega Man*
> 
> Perfectly reasonable.
> 
> First most gameworks titles have issues. It is common with gameworks * tin foil hat*
> 
> I dont remember stutterung that bad in some of the batman gameibut it has been awhile. I would have to check now .
> 
> I have not tried the newest batman. Afaik both sides have issues with it. Several thing you can do that may help. One is disable upls. There's an amd thread that I don't have a link to atm (on mobile ) tsm iirc wrote it or sugarhell
> 
> Daul gpu cards are a different beast i have found then single gpus
> 
> I prefer single gpus (in quadfire of course )


First, thanks for your response.
And I have to be clear: the problem I have with my dual GPU is with every game!!! Unless I use my trick to play in crossfire, every game has stuttering and screen tearing. Here are 2 videos of my issues with GTA V:



 



These issues happen in every game if crossfire is activated by default.
And yeah, for Batman Arkham City and Origins it's more like stuttering + flickering (when I use my trick to play in crossfire).
So, thanks for your advice on ULPS. I'm going to try it right now.
And yeah, I chose this video card because it was "optimised for 4k" and cheap (~$800 instead of more than $1000), so I didn't really think about buying two video cards for crossfire.

EDIT:
No, it doesn't change a thing with ULPS disabled...


----------



## Mega Man

Hmm, that sounds like something else... Can you put your rig in your sig? (See the link in my sig about Rig Builder.)

I have not had a whole lot of time to play recently :/ but yes, I can see some issues with the newer drivers :/


----------



## jumpman971

OK, so here's my rig: http://www.overclock.net/lists/display/view/id/6558875
Because I'm at work right now, I don't have every piece of information and some is missing. For example, I have 16 GB of DDR4 RAM (if I remember correctly, 4x4 GB), but I don't remember the frequency (something like 2400mhz, not sure).

EDIT:
Nothing is overclocked on my PC (sometimes I overclock my GPU, but that's it).


----------



## buuhh99

Running ethOS 1.0.6 / Ubuntu 16.04.

I have a dual R9 295x2 rig that has been running stable for a while now (6 months?). Last night it went down.

It boots and runs with R9 295x2 #1 in the primary slot and #2 not installed.

With only R9 295x2 #2 in the primary slot, it won't boot. *leads me to believe the card is dead*

However, with #1 in the primary and #2 in the secondary, I can boot and run the OS in no-AMD-driver mode from the boot menu, but it gives me OpenCL timeout errors. I think this might be because the driver isn't installed, but I can't tell. aticonfig --list-adapters shows both cards.

I can run in recovery mode (I don't really know what I am doing) and get some black screens.

With #1 in the primary and #2 in the secondary, a normal boot hangs on "Stopping System V Runlevel compatibility".

Any thoughts? How should I test whether R9 295x2 #2 is still good? Are there any Linux apps I can run to test GPU health?

If the GPU is dead, is there any place I can send it for repair?

Thanks


----------



## jumpman971

Damn, my trick to make crossfire work never actually worked...
With MSI Afterburner, if I do my "trick", my second GPU stays at 300 MHz instead of ~1000 MHz...
If I don't do my trick and leave crossfire enabled normally, both GPUs run at ~1000MHz, but I get the issues I described before... I wonder if it could come from my monitor's driver or something related to it, I don't know.

EDIT:
Hmmmm, it might actually come from my monitor:
http://www.tomshardware.co.uk/forum/id-2175663/290-crossfire-u2868pqu-60hz.html


----------



## ramos29

I never expected that one day I would go trifire, but yesterday I did it.
Maybe it was not the best of solutions, but I paired a 290 Tri-X with my 295x2.
At first I was mad about the performance: a huge fps drop and core clock throttling despite no overheating (the 290 Tri-X did not exceed 70°).
After hours of tweaking I figured out that vsync + FRTC were the problem; they were preventing my trifire from running properly.
BF4 at 4k with 4xMSAA: 90 fps!
No current game needs such firepower, even at 4k.
I think I will turn off the 290 until Battlefield 1 sees the light of day; till then, a "single" 295x2 is more than enough.
Letting the cards run will cost me dearly, as nearly 1000w are being pulled from the wall.


----------



## ramos29

How do I disable the card I added without corrupting the driver or removing the card?


----------



## Paul17041993

Quote:


> Originally Posted by *ramos29*
> 
> how to disable the card i added without corrupting the driver or remove the card


Unplug the PCIe cables while the system is powered off, though I don't know how well motherboards tend to handle this as they may or may not abort POST.


----------



## Sgt Bilko

It's official..... I've gone mad. I found another 295x2 cheap locally; it will be here within the next 2 weeks.


----------



## dagget3450

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's official.....I've gone mad, I found another 295x2 cheap locally, will be here within the next 2 weeks


----------



## Dagamus NM

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's official.....I've gone mad, I found another 295x2 cheap locally, will be here within the next 2 weeks


I hope it is cold in the room where you will have this because it is about to get warm in there.


----------



## Paul17041993

I want 3-4 295X2's for mGPU prototyping...


----------



## Rikan S087

Hello, first my system specs:

Motherboard: ASUS Crosshair V Formula Z
Processor: AMD FX-8370 8-core 4.5ghz
Memory: 16GB G.Skill 1866 DDR3
Video: Sapphire Radeon R295x
HDD: 60GB OCZ SSD, Hitachi 1TB Internal, Samsung 2TB External, 120GB Kingston SSD
PSU: 1000w
Case: Antec DF-85 (MODDED)
Optical: ASUS Blueray/DVD/Cd-ROM RW/R
Monitor: Triple 22" HD BENQ Eyefinity
OS: Windows 7 Premium 64bit

As mentioned above, I own a 295x (two GPU cores on one board).

Ever since I've owned it, the card has run great and crossfire has worked very well. However, since Crimson driver 16.5.1 (the last driver to work properly), the crossfire toggle switch has disappeared.
Normally it appears in Global Settings, or at least in created game profiles; it has simply vanished. I have attached some screens to show you what I mean. Notice how the crossfire toggle is gone.

24670d1473216104-crossfire-issue-gpuz-snap.jpg 71k .jpg file


24669d1473216104-crossfire-issue-radeon-additional-settings-.jpg 66k .jpg file


24668d1473216104-crossfire-issue-game-profile-example.jpg 61k .jpg file


24667d1473216104-crossfire-issue-global-settings.jpg 64k .jpg file


I can confirm that the system recognizes both cores and crossfire is enabled (GPU-Z shows both enabled, and performance in games reflects that). The issue is that, as you all know, some games do not run well with crossfire enabled for various reasons, so gamers need to be able to disable it, which is what the missing crossfire toggle switch is for. I am just baffled by this, and there are other 295x owners with the same issue. Thoughts?


----------



## Mega Man

1. Please don't attach files (the paperclip); use the little picture-of-a-picture icon instead. Most people will not download your pictures.

2. This has been this way for a long time; it is not an issue. To disable crossfire, make a profile for that game.

----------



## Rikan S087

1. I am new to the site and didn't know about the paperclip rules. I will repost the pics.

2. It is an issue. If by "make a profile" you mean add a game-profile within Crimson, you may notice I already mentioned the crossfire toggle is missing from game profiles too. Unless of course you mean make a profile through some other method?


----------



## Rikan S087




----------



## Mega Man

Quote:


> Originally Posted by *Rikan S087*
> 
> 1. I am new to the site and didn't know about the paperclip rules. I will repost the pics.
> 
> 2. It is an issue. If by "make a profile" you mean add a game-profile within Crimson, you may notice I already mentioned the crossfire toggle is missing from game profiles too. Unless of course you mean make a profile through some other method?


Just so you know, I was not criticizing you, I was trying to help you. Most don't download pics - no offense was meant.

I'll have to show you what I mean when I have a minute on my main PC (not my laptop), as I can't do it from memory.


----------



## Rikan S087

No, not offended at all, just pissy. I am at my wits' end with this, and it's frustrating to hear someone say it isn't an issue and seemingly dismiss my problem. I know you mean to help.

In the pics you can see that the CrossFire options are missing from created game profiles and global settings, and that the entire "Gaming" tab and all game-related settings are missing from "Radeon Additional Settings".

Is this symptom exclusive to me, or are all 295x users experiencing this and have simply found a workaround? I know how to create game profiles within Crimson, and the toggle is missing from there too, since 16.5.1.

Anyway, I appreciate all the help and don't want to come off as snippy. It's an expensive card with horrible driver support. I might go green and single-GPU next time around because of this experience.

Hoping the people here can help resolve this.


----------



## Mega Man

Both mine are still there, but that may be because I run 2 cards; IIRC they did remove the crossfire slider.

One simple solution would be to run it in windowed borderless - sorry :/


----------



## Paul17041993

I'd suggest everyone affected report this bug to AMD, as the slider should always be there if more than one AMD GPU is present; for the 295X2, however, it should be enabled by default unless an additional card is present.

http://www.amd.com/report
http://support.amd.com/en-us/contact/email-form

@Mega Man it works for you because you have multiple cards; the drivers look for the physical cards and their BIOSes, but are also supposed to check whether multiple GPUs exist on one card, the latter of which isn't working correctly.


----------



## Rikan S087

That's my very point: it should recognize both cores as separate cards, and it has in the past (up until 16.5.1 for me). It is all well and fine that it automatically enables CrossFire for the 295, but for obvious reasons we need the ability to turn it off. So I am wondering: does anyone else in this owners' club have a SINGLE 295x unit and is also missing the CrossFire toggle switch?


----------



## Mega Man

You know what this means? You should buy another!


----------



## dagget3450

Quote:


> Originally Posted by *Paul17041993*
> 
> I'd suggest everyone affected to report that bug to AMD as the slider should always be there if more than one AMD GPU is present, however for the 295X2 it should be enabled by default unless an additional card is present.
> 
> http://www.amd.com/report
> http://support.amd.com/en-us/contact/email-form
> 
> @Mega Man it works for you because you have multiple cards, the drivers look for the physical cards and their BIOS's but are also supposed to check if multiple GPUs exist on the card, the later of which isn't working correctly.


I was testing 2x Fury X this last weekend and found this to be a problem. After a reboot or two, I lost the CrossFire option from Crimson and the old UI as well. After trying to keep it working, I ended up installing a 3rd Fury X I had on hand, and now CrossFire stays on all the time. Not sure what they have going on, but it's also happening for Fiji with 2 cards as well.


----------



## Rx4speed

I didn't mean to start a spat. We are all having fun, right?

Mega -
Quote:


> Originally Posted by *Rikan S087*
> 
> Hello, first my system specs:
> 
> Motherboard: ASUS Crosshair V Formula Z
> Processor: AMD FX-8370 8-core 4.5ghz
> Memory: 16GB G.Skill 1866 DDR3
> Video: Sapphire Radeon R295x
> HDD: 60GB OCZ SSD, Hitachi 1TB Internal, Samsung 2TB External, 120GB Kingston SSD
> PSU: 1000w
> Case: Antec DF-85 (MODDED)
> Optical: ASUS Blu-ray/DVD/CD-ROM RW/R
> Monitor: Triple 22" HD BENQ Eyefinity
> OS: Windows 7 Premium 64bit
> 
> As mentioned above, I own a 295x (two cores on one board).
> 
> Ever since I've owned it, it has run great and CrossFire has worked very well. However, since Crimson driver 16.5.1 (the last driver to work properly), the CrossFire toggle switch has disappeared.
> Normally it appears in Global Settings, or at least in created game profiles. It has simply vanished. I have attached some screens to show you what I mean; notice how the CrossFire toggle has vanished.
> 
> 24670d1473216104-crossfire-issue-gpuz-snap.jpg 71k .jpg file
> 
> 
> 24669d1473216104-crossfire-issue-radeon-additional-settings-.jpg 66k .jpg file
> 
> 
> 24668d1473216104-crossfire-issue-game-profile-example.jpg 61k .jpg file
> 
> 
> 24667d1473216104-crossfire-issue-global-settings.jpg 64k .jpg file
> 
> 
> I can confirm that the system recognizes both cores and CrossFire is enabled (GPU-Z shows both, and performance in games reflects that). The issue is that, as you all know, some games do not run well with CrossFire enabled for various reasons, so gamers need to be able to disable it, something I used to do with the CrossFire toggle switch that should be there. I am just baffled by this, and there are others with the 295x who have the same issue. Thoughts?


I have this EXACT same problem. The latest version where I have any control over CrossFire is 16.3.1; anything after that and there is NO mention of CrossFire in Radeon Settings or Additional Settings. So I install 16.3.1 and then JUST the latest drivers. But lately, I'm getting no CrossFire in Assetto Corsa. It got so bad, I said F it and bought a new card, and my 295X2 is on eBay.


----------



## Dagamus NM

Just out of curiosity, what did you get to replace your 295x2?

I am happily running 7680x1440 with my pair of 295x2s. I cannot remember the last time I even looked at the crossfire slider. I pretty much only go into Radeon settings to enable eyefinity or turn it off.

Just not even really a thought honestly.


----------



## Rx4speed

I run 5760x1200, but that resolution only for driving sims like iRacing (which doesn't utilize multiple GPUs) and Assetto Corsa. Most other games I just run on one screen. What drivers are you on, and would you do me a favor and go into Radeon Settings and see if the global CrossFire slider is there, plus a slider for each game? Maybe because you have two physical cards, your driver installation sees this and gives you CrossFire options. I'm thinking my issue was that, for whatever reason, drivers after 16.3.1 don't recognize that they need to install CF options on a single-295X2 system, because both GPUs are on the one card. It's like AMD just kicked the 295X2 to the curb. I bought a Gigabyte 980 Ti Waterforce. I may buy a 2nd one, but right now it's doing great for my needs. I had an R9 290 before the 295X2. I bought the 295X2 when I went to triple monitors. I liked the 290, and had I not run into CF issues with my 295X2, I'd still own it. I loved the card. I'm not a fanboy of either company, and I couldn't care less about power consumption and noise. "Drill for oil in Alaska and while we are there, log it and re-plant"....Dennis Miller.


----------



## y4h3ll

Hey bud, I just switched from my 295x2 about 9 months ago. Before that I had switched to 2x 390X Hydra editions, and I had issues with my slider missing as well. Now, 9 months later, I have an EVGA 1080 FTW; I was promised a boost clock of 1860, but instead my boost was 2047 out of the box, and it's literally 20% more powerful than both 390Xs together. It took me 3 years to switch back to Nvidia, as the 770s were just crap. But at $679 it's definitely cheaper than 2 390Xs and more powerful. Just my 2 cents.
0 problems
Quote:


> Originally Posted by *Rx4speed*
> 
> I run 5760x1200, but that resolution only for driving sims like iRacing (which doesn't utilize multiple GPUs) and Assetto Corsa. Most other games I just run on one screen. What drivers are you on, and would you do me a favor and go into Radeon Settings and see if the global CrossFire slider is there, plus a slider for each game? Maybe because you have two physical cards, your driver installation sees this and gives you CrossFire options. I'm thinking my issue was that, for whatever reason, drivers after 16.3.1 don't recognize that they need to install CF options on a single-295X2 system, because both GPUs are on the one card. It's like AMD just kicked the 295X2 to the curb. I bought a Gigabyte 980 Ti Waterforce. I may buy a 2nd one, but right now it's doing great for my needs. I had an R9 290 before the 295X2. I bought the 295X2 when I went to triple monitors. I liked the 290, and had I not run into CF issues with my 295X2, I'd still own it. I loved the card. I'm not a fanboy of either company, and I couldn't care less about power consumption and noise. "Drill for oil in Alaska and while we are there, log it and re-plant"....Dennis Miller.


----------



## Rx4speed

Quote:


> Originally Posted by *y4h3ll*
> 
> Hey bud, I just switched from my 295x2 about 9 months ago. Before that I had switched to 2x 390X Hydra editions, and I had issues with my slider missing as well. Now, 9 months later, I have an EVGA 1080 FTW; I was promised a boost clock of 1860, but instead my boost was 2047 out of the box, and it's literally 20% more powerful than both 390Xs together. It took me 3 years to switch back to Nvidia, as the 770s were just crap. But at $679 it's definitely cheaper than 2 390Xs and more powerful. Just my 2 cents.
> 0 problems


Just so everyone is clear, I'm not here to debate AMD vs Nvidia. I LOVED my 295X2. It's just that 50% of the time, I had this badass card installed and was running games like I had only one 290X. Plus the time involved in researching the sliders and tweaking AFR vs 1-to-1 vs custom profiles. Jesus, it just wore me into submission. My new card is supposed to boost at 1317, but boosts at 1455. It's killing it in all my games and it's hassle-free. Now, in the future, I may buy another 980 Ti Waterforce for SLI, but something tells me the grass is no greener in that pasture. I considered a Fury X, but after joining the voices of displeasure with AMD regarding the missing sliders for us 295X2 owners, and getting a big F U from AMD about it, I chose to go green this time. Sucks, 'cause I was all AMD...


----------



## sub0seals

Sounds like I have a lot of problems to look forward to when I install my AMD R9 295X2! I have been working on a build for a year, getting a few parts every month. I finally got the hard-to-find Aquacomputer Vesuvius Black Nickel full-coverage water block; now it seems it will all be for nothing, as everything I have heard about this card seems to be bad news and nothing but problems. I was hoping for a finally-awesome card from AMD; instead it sounds like nothing but disappointments to look forward to. Maybe I should just sell it before I use it, to avoid losing any more value, and just buy Nvidia. Too bad!


----------



## lanofsong

Hey R9 295X2 owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong
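For anyone who prefers editing the client config directly, the three values above can also go into the client's config.xml. A minimal sketch follows; the element layout below is an assumption based on the standard FAHClient config format, so verify it against your installed client before relying on it:

```python
# Sketch: build a Folding@home config.xml with the three values
# lanofsong lists (folding name, passkey, Team OCN number 37726).
# The <user>/<passkey>/<team> element layout is assumed, not verified
# against every client version.
from xml.sax.saxutils import quoteattr

def fah_config(user: str, passkey: str, team: int = 37726) -> str:
    return (
        "<config>\n"
        f"  <user value={quoteattr(user)}/>\n"
        f"  <passkey value={quoteattr(passkey)}/>\n"
        f"  <team value={quoteattr(str(team))}/>\n"
        "</config>\n"
    )

print(fah_config("my_ocn_name", "0123456789abcdef"))
```

Drop the result into the client's data directory (location varies by OS) and restart the client.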


----------



## Sgt Bilko

Quote:


> Originally Posted by *lanofsong*
> 
> Hey R9 295X2 owners,
> 
> Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.
> 
> September Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


I should but I really can't be bothered plugging them in again for a while haha.

Give me a poke for the forum folding war; I was Team AMD last year and Team Intel the year before, so we'll see where I end up this time









Side note: http://www.3dmark.com/fs/10158927

All off a Single 1200w PSU


----------



## VegaSecureA

I bought all you see as a Christmas present to myself last year, lol. Happy customer of the FX-9590 for three years now, and 2 years for the 2 R9 295X2s.









MOBO: ASUS Crosshair V Formula-Z
CPU: AMD FX-9590 (OC 4.8Ghz)
CPU COOLER: AMD FX Series AIO Liquid Cooler
GPU(s): 2 AMD R9 295X2 (Clock: OC 1038Mhz; Memory: OC 1500Mhz)
Memory: 4 AMD R9 Gamer Series 8GB sticks (OC 2133Mhz)
Boot: AMD RAMDisk Gamer Edition (20GB)
Storage: 10 AMD Radeon R3 Series 480GB (8 SATA in RAID 0; 2 eSATA)
PSU: Corsair AX1500i
Case: Lian-Li DK-01
OS: Microsoft Windows 10 Pro














I took a picture of my SSDs but not of the RAM. Forgot, will update accordingly.

When the flagship enthusiast Zen processor comes out, I will be getting it with a corresponding mobo and DDR4 RAM, upgrading the case to a DK-04, and the 2 R9 295X2s to 3 Pro Duos.


----------



## sub0seals

So... I have a noob question: what is a Foldathon, what is it for, and how do you get involved?


----------



## Mega Man

That I see as a loaded question


----------



## sub0seals

Meaning what? I am new to PC building and water cooling, so I'm just trying to learn! So Mega Man... maybe you can help; you seem to be a pretty knowledgeable, active member!


----------



## Mega Man

I was on mobile; it felt like that was a lead-up to the how-to on folding.
Quote:


> Originally Posted by *lanofsong*
> 
> Hey R9 295X2 owners,
> 
> Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.
> 
> September Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


----------



## bobbavet

Gday Guys.

Well, my tenure with the 295x2 is nearly over. I am eagerly awaiting the arrival of AMD's next rumored dual-GPU iteration and have started saving.

I usually try the duals from both AMD and Nvidia, but since Nvidia's duals now continually let down, I have decided to try a GTX 1080 to have a look at what they have been up to.

Was wondering if anyone else has tried a 1080 and what the general fps experience has been compared to the 295x2. Better, poorer, or up and down depending on the game?

Highlight of the 295x2 for me is any game @ 4K that has a CrossFire profile. Just awesomeness with 50-60+ fps at 4K in any title. It's still a monster.
In particular SQUAD, which is still in dev and not fully optimized, but I am pulling anywhere from the high 40s to 80 fps @ 4K. I am using the "paragon" profile for x-fire.
The only low light was the loss of x-fire in War Thunder, which hopefully will be sorted out using the 1080.

Any thoughts on the specs of a new dual GPU? I am thinking a single GPU will be on par with, or slightly better than, 1080 performance, with 8GB HBM.
Double that up in one package and it will be a can of whoop-ass. Hoping the pricing will be 1.5x that of a 1080 SLI setup; if it is priced anything like the Pro Duo, it may be a bit iffy for me.

I am a little peeved with AMD over Vulkan and DX12 atm. The selling point is multi-GPU support built into games (no more profiles), along with the ability to pool memory.
It's just not happening, and I can't see it happening unless AMD puts small multi-GPU/APU setups in consoles. Patience is not one of my virtues. lol

Hang loose guys.

P.S. Hey VegaSecureA, what is that keyboard? Is that one of those "Paradise" boards?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Culbrelai*
> 
> You guys think my Corsair AX1200 1200w Gold PSU could handle two 295X2s?
> 
> It's got nothing really else to run of note besides my 6700k (unoverclocked) on the Maximus VIII Hero, 5 hard drives (2 of which are SSDs) and about a dozen fans?
> 
> Baring that, what's the best 3rd card to crossfire with a 295x2 from the current gen, a R9 390?


Sorry for the really late reply but it can: http://www.3dmark.com/fs/10158927

^ Done on a AX1200i
Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys.
> 
> Well my tenure with the 295x2 is nearly over. I am eagerly awaiting the arrival of AMD's next rumored dual Gpu iteration and started saving.
> 
> I usually try the duals from both AMD and Nvidia, but since Nvidia now continually let down, I have decided to try a GTX1080 to have a look at what they have been up to.
> 
> Was wondering if anyone else has tried a 1080 and what the general fps experience has been compared to the 295x2. Better, poorer or up and down depending on games.
> 
> Highlight of the 295x2 for me is any game @ 4k that has a crossfire profile. Just awesomeness with 50 - 60+ fps 4k in any title. It's still a monster.
> In particular SQUAD which is still in dev and not fully optimized, but am pulling anywhere from high 40's to 80 fps @ 4k. I am using the "paragon" profile for x-fire.
> Only low light was the loss to xfire with War Thunder, which hopefully will be sorted out using the 1080.
> 
> Any thoughts on the specs of a new Dual GPU? I am thinking single Gpu will be on par or slightly better than the 1080 performance with 8gb HBM.
> Double that up in a package and it will be a can of whoop ass. Hoping the pricing will be 1.5 that of a 1080 sli set up. If it is priced anything like the Pro Duo maybe a bit iffy for me.
> 
> I am a little peeved with AMD with Vulkan and DX12 atm. The sell point is Multi GPU support built into games (no more profiles), also with the ability of memory pooling.
> It's just not happening. Can't see it happening unless AMD put small multi gpu/apu in consoles. Patience is not one of my virtues. lol
> 
> Hang loose guys.
> 
> P.S. Hey VegaSecureA, what is that keyboard? Is that one of those "Paradise" boards?


Good luck with the future ventures mate


----------



## asbator

Could someone
Quote:


> Originally Posted by *bobbavet*
> 
> Gday Guys.
> 
> Well my tenure with the 295x2 is nearly over. I am eagerly awaiting the arrival of AMD's next rumored dual Gpu iteration and started saving.
> 
> I usually try the duals from both AMD and Nvidia, but since Nvidia now continually let down, I have decided to try a GTX1080 to have a look at what they have been up to.
> 
> Was wondering if anyone else has tried a 1080 and what the general fps experience has been compared to the 295x2. Better, poorer or up and down depending on games.
> 
> Highlight of the 295x2 for me is any game @ 4k that has a crossfire profile. Just awesomeness with 50 - 60+ fps 4k in any title. It's still a monster.
> In particular SQUAD which is still in dev and not fully optimized, but am pulling anywhere from high 40's to 80 fps @ 4k. I am using the "paragon" profile for x-fire.
> Only low light was the loss to xfire with War Thunder, which hopefully will be sorted out using the 1080.
> 
> Any thoughts on the specs of a new Dual GPU? I am thinking single Gpu will be on par or slightly better than the 1080 performance with 8gb HBM.
> Double that up in a package and it will be a can of whoop ass. Hoping the pricing will be 1.5 that of a 1080 sli set up. If it is priced anything like the Pro Duo maybe a bit iffy for me.
> 
> I am a little peeved with AMD with Vulkan and DX12 atm. The sell point is Multi GPU support built into games (no more profiles), also with the ability of memory pooling.
> It's just not happening. Can't see it happening unless AMD put small multi gpu/apu in consoles. Patience is not one of my virtues. lol
> 
> Hang loose guys.
> 
> P.S. Hey VegaSecureA, what is that keyboard? Is that one of those "Paradise" boards?


I have two MSI 1070 Gaming X cards for 4K and I'm simply amazed by the SLI scaling and perfect smoothness. Scaling is about 90% in optimized games, the whole system draws just 300W from the wall, and the GPUs are totally silent. A single 1080 can come up a little short for 4K on max settings, but it's a great buy as well, especially the Zotac AMP Extreme; I've seen OC results between a reference 1080 and a Titan XP. Nvidia 1000-series cards are very easy to OC thanks to EVGA PrecisionX. It has an auto-OC test that finds the highest stable clocks for every voltage level, and it can be run overnight.
Overall I have 2 GTX 1070s, 2 RX 470s (Sapphire) and 2 RX 480s (XFX) and must say that Nvidia has overtaken AMD by years in every category (except price). Just stay away from Gigabyte; it was my initial buy and I returned it because of bad quality and design.


----------



## Sgt Bilko

Quote:


> Originally Posted by *asbator*
> 
> Could someone
> Quote:
> 
> 
> 
> Originally Posted by *bobbavet*
> 
> Gday Guys.
> 
> Well my tenure with the 295x2 is nearly over. I am eagerly awaiting the arrival of AMD's next rumored dual Gpu iteration and started saving.
> 
> I usually try the duals from both AMD and Nvidia, but since Nvidia now continually let down, I have decided to try a GTX1080 to have a look at what they have been up to.
> 
> Was wondering if anyone else has tried a 1080 and what the general fps experience has been compared to the 295x2. Better, poorer or up and down depending on games.
> 
> Highlight of the 295x2 for me is any game @ 4k that has a crossfire profile. Just awesomeness with 50 - 60+ fps 4k in any title. It's still a monster.
> In particular SQUAD which is still in dev and not fully optimized, but am pulling anywhere from high 40's to 80 fps @ 4k. I am using the "paragon" profile for x-fire.
> Only low light was the loss to xfire with War Thunder, which hopefully will be sorted out using the 1080.
> 
> Any thoughts on the specs of a new Dual GPU? I am thinking single Gpu will be on par or slightly better than the 1080 performance with 8gb HBM.
> Double that up in a package and it will be a can of whoop ass. Hoping the pricing will be 1.5 that of a 1080 sli set up. If it is priced anything like the Pro Duo maybe a bit iffy for me.
> 
> I am a little peeved with AMD with Vulkan and DX12 atm. The sell point is Multi GPU support built into games (no more profiles), also with the ability of memory pooling.
> It's just not happening. Can't see it happening unless AMD put small multi gpu/apu in consoles. Patience is not one of my virtues. lol
> 
> Hang loose guys.
> 
> P.S. Hey VegaSecureA, what is that keyboard? Is that one of those "Paradise" boards?
> 
> 
> 
> I have two MSI 1070 Gaming X cards for 4K and I'm simply amazed by the SLI scaling and perfect smoothness. Scaling is about 90% in optimized games, the whole system draws just 300W from the wall, and the GPUs are totally silent. A single 1080 can come up a little short for 4K on max settings, but it's a great buy as well, especially the Zotac AMP Extreme; I've seen OC results between a reference 1080 and a Titan XP. Nvidia 1000-series cards are very easy to OC thanks to EVGA PrecisionX. It has an auto-OC test that finds the highest stable clocks for every voltage level, and it can be run overnight.
> Overall I have 2 GTX 1070s, 2 RX 470s (Sapphire) and 2 RX 480s (XFX) and must say that Nvidia has overtaken AMD by years in every category (except price). Just stay away from Gigabyte; it was my initial buy and I returned it because of bad quality and design.

300w total system draw? I can't believe that when the 1070 is around 150-160w alone, What else is in your system?

And sorry but it's kinda silly to compare a 1070/1080 to the 470/480 since they are completely different price and performance brackets.
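For a rough sanity check on that 300W claim, here is a back-of-envelope estimate; every per-component wattage below is a ballpark assumption for illustration, not a measurement:

```python
# Ballpark load-power estimate for a two-card GTX 1070 system.
# All figures are assumptions: ~150W per 1070 under load, a mainstream
# CPU around 90W while gaming, and ~60W for motherboard, RAM, drives
# and fans.
gpu_w = 150
cpu_w = 90
platform_w = 60

dc_load = 2 * gpu_w + cpu_w + platform_w   # power delivered by the PSU
wall_draw = dc_load / 0.90                 # assume ~90% PSU efficiency

print(dc_load, round(wall_draw))  # 450 500
```

Under these assumptions the wall draw lands around 500W at full load, which is why a 300W whole-system figure looks implausible unless it was measured at partial load.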


----------



## asbator

Can someone confirm 100% that the Sapphire 295X2 has a dual BIOS?
Alas, I botched my flash by loading the BIOS from the first switch setting and flashing it to the second.
Surprisingly, my modification of the memory straps seems to work in both switch settings (!),
and that makes me wonder whether this card really has 2x2 BIOSes, or whether it is just a dummy/service switch or whatever.
I've read that this switch changes the order of the GPUs, but not for me, at least not after how I flashed it... :|


----------



## asbator

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 300w total system draw? I can't believe that when the 1070 is around 150-160w alone, What else is in your system?
> 
> And sorry but it's kinda silly to compare a 1070/1080 to the 470/480 since they are completely different price and performance brackets.


No problem








I meant overall technological advancement, like computing power per watt and ease of configuration and adjustment. Price is just an effect of this advancement.
I'll measure the wattage again in the evening; maybe I confused something, but those cards draw amazingly little power anyway.


----------



## Sgt Bilko

Quote:


> Originally Posted by *asbator*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 300w total system draw? I can't believe that when the 1070 is around 150-160w alone, What else is in your system?
> 
> And sorry but it's kinda silly to compare a 1070/1080 to the 470/480 since they are completely different price and performance brackets.
> 
> 
> 
> No problem
> 
> 
> 
> 
> 
> 
> 
> 
> I meant overall technological advancement, like computing power per watt and ease of configuration and adjustment. Price is just an effect of this advancement.
> I'll measure the wattage again in the evening; maybe I confused something, but those cards draw amazingly little power anyway.

All those points are up for debate either way; both sides have their merits and downfalls, but I cannot agree that prices should continually rise just because the newest generation is faster.

That train of thought leads to mid-range GPUs costing a small fortune and PC gaming becoming more and more expensive, eventually driving people away from it and to consoles, which no one wants.


----------



## asbator

Quote:


> Originally Posted by *Sgt Bilko*
> 
> All those points are up for debate either way; both sides have their merits and downfalls, but I cannot agree that prices should continually rise just because the newest generation is faster.
> 
> That train of thought leads to mid-range GPUs costing a small fortune and PC gaming becoming more and more expensive, eventually driving people away from it and to consoles, which no one wants.


The overall price of computing power falls constantly. This peculiar scenario is the effect of limited competition: there are just two GPU makers, and one struggles to keep up the pace.


----------



## RWGTROLL

Quote:


> Originally Posted by *asbator*
> 
> I have two MSI 1070 Gaming X cards for 4K and I'm simply amazed by the SLI scaling and perfect smoothness. Scaling is about 90% in optimized games, the whole system draws just 300W from the wall, and the GPUs are totally silent. A single 1080 can come up a little short for 4K on max settings, but it's a great buy as well, especially the Zotac AMP Extreme; I've seen OC results between a reference 1080 and a Titan XP. Nvidia 1000-series cards are very easy to OC thanks to EVGA PrecisionX. It has an auto-OC test that finds the highest stable clocks for every voltage level, and it can be run overnight.
> Overall I have 2 GTX 1070s, 2 RX 470s (Sapphire) and 2 RX 480s (XFX) and must say that Nvidia has overtaken AMD by years in every category (except price). Just stay away from Gigabyte; it was my initial buy and I returned it because of bad quality and design.


No need to test: http://www.guru3d.com/articles-pages/geforce-gtx-1070-2-way-sli-review,4.html
300W total system wattage? I don't think so.


----------



## Sgt Bilko

Quote:


> Originally Posted by *asbator*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> all those points are up for debate in any which way, both sides have their merits and downfalls, but I cannot agree that price should continually raise just because the newest generation is faster.
> 
> That train of thought leads to mid range GPUs costing a small fortune and PC gaming becoming more and more expensive and eventually driving people away from it and to consoles which no-one wants.
> 
> 
> 
> The overall price of computing power falls constantly. This peculiar scenario is the effect of limited competition: there are just two GPU makers, and one struggles to keep up the pace.

R9 290x: 5.6 Tflops @ $550 ($98 per Tflop)

GTX 780 Ti 5.04 Tflops @ $699 ($138 per Tflop)

R9 390x 5.9 Tflops @ $429 ($72 per Tflop)

GTX 980 5.3 Tflops @ $550 ($103 per Tflop)

R9 Fury X 8.6 Tflops @ $650 ($75 per Tflop)

GTX 980Ti 6.5 Tflops @ $649 ($99 per Tflop)

RX 480 5.5 Tflops @ $229 (*$41 per Tflop*)

GTX 1080 9 Tflops @ $599 ($66 per Tflop)

Anything you'd like to add?
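As a quick sanity check, the price-per-Tflop figures above fall out of a few lines of Python (Tflops and launch prices copied from the list; values rounded down, matching the numbers in the post):

```python
# Reproduce the $-per-Tflop comparison from the list above.
# (tflops, launch price in USD) per card, as stated in the post.
cards = {
    "R9 290X":    (5.6, 550),
    "GTX 780 Ti": (5.04, 699),
    "R9 390X":    (5.9, 429),
    "GTX 980":    (5.3, 550),
    "R9 Fury X":  (8.6, 650),
    "GTX 980 Ti": (6.5, 649),
    "RX 480":     (5.5, 229),
    "GTX 1080":   (9.0, 599),
}

for name, (tflops, usd) in cards.items():
    # int() truncates, reproducing the rounded-down figures in the post
    print(f"{name}: ${int(usd / tflops)} per Tflop")
```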


----------



## Mega Man

Bilko, I heart you!

Also, my fav high(ish)-end to low-end comparison.

Nvidia does some things well. But to say AMD struggles...


----------



## Energylite

Hey all! I'm glad to belong to this "R9 295x2 family". As you can guess, I'm an owner of this GPU, from France, and I'm here to learn and maybe get some help: since I play the most recent video games, my GPU throttles like nothing else after 5 minutes of gaming, even when I put the fan at max speed. So I want to "unlock" the 75°C max temp, and I would like to know if it is possible to do that with a "custom" BIOS? And if the answer is "yes", how can I edit it, or does someone know how to do it?
Thank you for your response
kiss








Energylite


----------



## Mega Man

look into the Hawaii bios editing thread


----------



## Energylite

Oh OK, thx mate !


----------



## Mega Man

gl!!


----------



## xarot

Quote:


> Originally Posted by *Energylite*
> 
> Hey all! I'm glad to belong to this "R9 295x2 family". As you can guess, I'm an owner of this GPU, from France, and I'm here to learn and maybe get some help: since I play the most recent video games, my GPU throttles like nothing else after 5 minutes of gaming, even when I put the fan at max speed. So I want to "unlock" the 75°C max temp, and I would like to know if it is possible to do that with a "custom" BIOS? And if the answer is "yes", how can I edit it, or does someone know how to do it?
> Thank you for your response
> kiss
> 
> 
> 
> 
> 
> 
> 
> 
> Energylite


I too had severe throttling issues when I had this card.

The 75-degree throttle limit is simply there to prevent the AIO pump from dying from heat. 75°C is not a whole lot for GPUs, but since the 295X2 comes with only a single 120mm AIO radiator to dissipate 500W of heat, there's not much to be done, really. It would have done better with a much bigger radiator. With custom water cooling, nobody would even try running such a high-wattage card with anything less than a 240mm radiator.
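To put that 120mm radiator in perspective, here is a back-of-envelope sizing sketch; the watts-per-section figure is a common watercooling rule of thumb assumed for illustration, not a measured spec (real capacity varies with fin density, fan speed and airflow):

```python
# Rough radiator sizing for ~500W of GPU heat, assuming a rule of
# thumb of roughly 100-150W dissipated per 120mm radiator section
# at moderate fan speeds.
card_heat_w = 500            # approximate full-load heat output of the 295X2
w_per_section = (100, 150)   # assumed W per 120mm section (low/high estimate)

sections = [card_heat_w / w for w in w_per_section]
print(sections)  # roughly 3.3 to 5 sections, i.e. a 360-600mm radiator
```

Either way, one 120mm section for the whole card is well under what a custom loop would use for the same heat load.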


----------



## Energylite

Quote:


> Originally Posted by *Mega Man*
> 
> gl!!


Yeah thanks !
Quote:


> Originally Posted by *xarot*
> 
> I too had severe throttling issues when I had this card.
> 
> The 75-degree throttle limit is simply there to prevent the AIO pump from dying from heat. 75°C is not a whole lot for GPUs, but since the 295X2 comes with only a single 120mm AIO radiator to dissipate 500W of heat, there's not much to be done, really. It would have done better with a much bigger radiator. With custom water cooling, nobody would even try running such a high-wattage card with anything less than a 240mm radiator.


But I don't have money to buy a radiator xD, so I'm looking for a cheap solution. It's true that if I fail, my GPU may end up broken, but I need to try!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Energylite*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> gl!!
> 
> 
> 
> Yeah thanks !
> Quote:
> 
> 
> 
> Originally Posted by *xarot*
> 
> I too had severe throttling issues when I had this card.
> 
> The 75 degrees throttling limit is to simply prevent the AIO pump from dying from heat. 75c is not a whole lot for GPUs but since the 295X2 comes with only a 120mm AIO/radiator setup to dissipate 500W of heat, there's not much to be done really. It would have done better with a way bigger radiator. On customer water cooling nobody would even try running such high-wattage card under at least a 240mm radiator.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I don't have money to buy a radiator xD so I'm looking for a cheap solution. But that's true if I fail may be my GPU I'll be broken but I need to try !

Case airflow is key to getting the 295x2 running sweet. Get your airflow sorted out and even the stock fan will take care of the temps well enough; if that fails, grab a better 120mm fan, plug it into the motherboard, and control it that way.


----------



## Juris

I know this is a long shot but if anyone in the 295 club has an Aquacomputer Kryographics waterblock in nickel, clear or black edition they are looking to sell let me know. Been searching everywhere for one to complete my custom loop. Cheers.


----------



## sub0seals

Good luck man. I have been looking for several months and I finally found one in extremely mint condition. It is the Black Nickel Edition. The only thing is, I might consider selling the entire setup, because the monitor my wife bought me is the 27-inch 2K Acer Predator, so I won't benefit from this card since it has no G-Sync, which I need. So I either switch monitors or graphics cards, I guess. Lol.


----------



## Mega Man

Or.... don't use it


----------



## sub0seals

Lol...then it would just be a waste of money!


----------



## Mega Man

No. Not using a function is just that


----------



## Juris

Quote:


> Originally Posted by *sub0seals*
> 
> Goodluck man. I have been looking for several,several months and I finally found one in extreme mint condition.It is the Black Nickel Edition. Only thing is I might consider selling the entire setup because my monitor the wife bought me is the Acer Predator 2K version 27inch,so I will not benefit from this card because it has no G-Sync that I need so I either switch monitors or graphics cards I guess. Lol.


Thanks for the reply man. They are crazy rare but then again I can't imagine they were made in their millions. If you do decide to go the G-sync card route and you're looking to sell the waterblock please let me know. I'd be very interested. Cheers.


----------



## sub0seals

Thanks for the data Mega.. you're always so helpful. lol


----------



## grifers

Hi. Does this card have a BIOS that unlocks the 75° limit? Thanks. Sorry for my bad language


----------



## Sgt Bilko

Quote:


> Originally Posted by *grifers*
> 
> Hi. This card have bios unlock 75º degrees limit?. Thanks. Sorry my bad language


75c limit is to protect the AIO pump from heat damage.


----------



## grifers

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 75c limit is to protect the AIO pump from heat damage.


Ok thanks, so what is the solution? Better fans for the pump? Thanks, and sorry for my language again.


----------



## Sgt Bilko

Quote:


> Originally Posted by *grifers*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 75c limit is to protect the AIO pump from heat damage.
> 
> 
> 
> Ok thanks, so what is the solution? Better 2 fans for the pump?. Thanks! and sorry my language again.

Better fans, case airflow etc all help


----------



## grifers

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Better fans, case airflow etc all help


Ok, thanks. I have a Silverstone Raven 03 case, so that shouldn't be a problem (I think). I'm planning to buy better fans. Bye and thanks again.


----------



## Chephren

Hi there,

my problem is that the Crossfire options in game profiles are gone since driver 16.5.1.

Win7 64-bit; GPU-Z and Device Manager show this card as an R9 295x2, and not R9 200 Series.


----------



## diggiddi

Quote:


> Originally Posted by *Chephren*
> 
> Hi there,
> 
> my problem is, that Crossfire Options in Gameprofiles are gone since Driver 16.5.1.
> 
> Win7 64 bit, and GPU Z and Devicemanager shows this Card as R9 295x2, and not R9 Series 200.
> 
> 


Reinstall Crimson, but turn off your firewall until the installation is complete, and see if that helps


----------



## nosequeponer

hello everyone here, I'm about to get a 295x2. I was actually thinking about CFXing my 290x, but a 295x2 at the right price came along..

the thing is that I already have the 290x and the CPU under water, and I was thinking about what to do to include the 295 in the loop...

it is almost impossible to find stock of any full-cover water block, except for the Alphacool..

so I have a lot of questions....:

how good is it?? because it seems to me to be like the stock system, without the fan for the VRM and phases... (so I'm guessing not that good..)

I was also thinking about ripping off the stock rad, dismantling the pumps, and then connecting the card to my current loop (good idea?? bad idea?? the tubing diameter is different, so I'll need some kind of adapter..)

tri-fire is out of the question.... (not enough PSU...)

thanks in advance


----------



## Dagamus NM

Quote:


> Originally Posted by *nosequeponer*
> 
> hello everyone here, i´m about to get a 295x2, actualy was thinking about cfx my 290x, but the 295x2 at the right Price came across..
> 
> the thing is that i allready have the 290x and the CPU under wáter, and was thinking about what to do to include the 295 into the loop...
> 
> it is almos impossible to find stock for any full cover wáter block, excep fot the alphacool..
> 
> so i have a lot of questions....:
> 
> how good is it?? because it seems to me to be like the stock system, without the fan for the vrm and phases... (so i´m guessing not that good..)
> 
> also was thinking about rip off the stock rad, , and dismantel the pumps, and then conect it to my actual loop (good idea?? bad idea?? the tubing diameter is different, si i´ll need some kind of adaptor..)
> 
> tri-fire is out of question.... (not enough PSU...)
> 
> thanks in advance


I wouldn't rip them apart unless you are going to install a different pump. Kind of a lot of work.

What was a good price that prompted you to buy this setup?

I am going to be updating my pair of 295x2s to a pair of Titan XPs in the coming weeks. I have the hardware, just need to assemble everything.

At this point I would be selling my 295x2s with Full cover EK waterblocks as well as the original coolers in the original packaging. Trying to get an idea of how much to list them at.


----------



## nosequeponer

Quote:


> Originally Posted by *Dagamus NM*
> 
> I wouldn't rip them apart unless you are going to install a different pump. Kind of a lot of work.
> 
> What was a good price that prompted you to buy this setup?
> 
> I am going to be updating my pair of 295x2s to a pair of Titan XPs in the coming weeks. I have the hardware, just need to assemble everything.
> 
> At this point I would be selling my 295x2s with Full cover EK waterblocks as well as the original coolers in the original packaging. Trying to get an idea of how much to list them at.


hello:

I think prices here would be a little different from what you can get there (I'm in Spain), but I paid €300 (around $320). Considering I could get another 290x for around €150-175 (or sell mine for that), it's not a bad price, I think...

as you have the EK block, yours should be a bit more expensive, I think..

I was looking for any of those EK blocks, but they are impossible to find... if, by any chance, you think about disassembling any of them, please let me know...

----------



## Dagamus NM

Quote:


> Originally Posted by *nosequeponer*
> 
> hello:
> 
> think prices here would be a Little different of what you can get there (i´m in Spain), but i paid 300€ (arround $320), considering i could get another 290x for arround 150€-175€ (or sell mine for that) is not a bad Price, i think...
> 
> as you have the EK, yours should be a bit more expensive, i think..
> 
> i was looking for any of those EK blocks, but they are imposible to find... if, by any chance, you think about dissemble any of them, please let me know...


If I find a buyer who isn't interested in the EK block then I will.


----------



## nosequeponer

Quote:


> Originally Posted by *Dagamus NM*
> 
> If I found a buyer that wasn't interested in the EK block then I will.


thanks!!!

I'll try to find a waterblock in the meantime...

has anyone tried the Alphacool??

it seems to be the only one with some stock left....


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> If I found a buyer that wasn't interested in the EK block then I will.
> 
> 
> 
> thanks!!!
> 
> i´ll try to find a waterblock in the mean time...
> 
> has anyone tried the alphacool??
> 
> seems to be the only one with some stock left....

I've got an EK block for the 295x2, haven't gotten around to setting my loop back up to play with it though.


----------



## nosequeponer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> If I found a buyer that wasn't interested in the EK block then I will.
> 
> 
> 
> thanks!!!
> 
> i´ll try to find a waterblock in the mean time...
> 
> has anyone tried the alphacool??
> 
> seems to be the only one with some stock left....
> 
> 
> I've got an EK block for the 295x2, haven't gotten around to setting my loop back up to play with it though.

You mean you have it lying around unused??

If you want to, maybe we can talk about it...

Sent from my iPhone using Tapatalk


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dagamus NM*
> 
> If I found a buyer that wasn't interested in the EK block then I will.
> 
> 
> 
> thanks!!!
> 
> i´ll try to find a waterblock in the mean time...
> 
> has anyone tried the alphacool??
> 
> seems to be the only one with some stock left....
> 
> 
> I've got an EK block for the 295x2, haven't gotten around to setting my loop back up to play with it though.
> 
> 
> U mean u have it laying arround unused??
> 
> If u want to, maybe we can talk about it...
> 
> Enviado desde mi iPhone utilizando Tapatalk

I have one sitting on my desk next to me not being used. I bought it used and just haven't gotten around to actually hooking it up, since the 295x2 isn't my primary card anymore.


----------



## MIGhunter

Question about temps.

So, I've probably never used my card in Crossfire mode, because you have to be in fullscreen mode for Crossfire to work and I don't like fullscreen mode. So I was testing it out this week on Black Desert Online. I'm using CPUID HWMonitor on my 2nd screen so I can see the temps and activity of the card. I've tested this with and without Clock Blocker, the program from Guru3D that keeps your 2nd card from downclocking. Running with Crossfire enabled, the game in fullscreen and Clock Blocker running, both cards are active at 1018MHz core and 1250MHz memory. Temps on card 1 are consistently 4 degrees cooler than card 2. Using Fraps to monitor FPS, I'm usually around 90 at 2560x1440 on top settings with a 144Hz monitor. When the cards get really working, I will notice a dip down to like 30fps for a brief second. When it does, the GPU on the 2nd card reads 64 degrees and it shows 300MHz core and 150MHz memory. So obviously it's throttling, and 64 degrees feels pretty low to be throttling at. Also, why would my cards be 4 degrees apart? I think I might know: it's a water cooled setup and maybe the TIM or something isn't as good as on the 1st card, but idk. Is this a normal variance? Why would it throttle at 64 degrees? My 290x didn't throttle until 90 degrees.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MIGhunter*
> 
> Question about temps.
> 
> So, I've probably never used my card in Crossfire mode because you have to be in Fullscreen mode for Crossfire to work. I don't like fullscreen mode. So, I was testing it out this week on Black Desert Online. I'm using CPUID HWmonitor on my 2nd screen so I can see the temps and activity of the card. I've tested this with and without the program called Clock Blocker from Guru3d that keeps your 2nd card from downclocking. Running with Crossfire enabled, game in Fullscreen and Clock buster running, both cards are active at 1018Mhz Graphics and 1250Mhz Memory. Temps on card 1 are consistently 4 degrees cooler than card 2. Using Fraps to monitor FPS, I'm usually around 90 at 2560.1440 on top settings and 144hz monitor. When the cards get really working, I will notice a dip down to like 30fps for a brief second. When it does, my GPU on the 2nd card reads 64 degrees and it shows Graphics at 300Mhz and Memory at 150Mhz. So obviously it's throttling, which I feel like 64 degrees is pretty low to be throttling. Also, why would my cards be 4 degrees different? I think I might know cause it's a water cooled setup and maybe the TIM or something isn't as good as the 1st card but idk. Is this a normal variant? Why would it throttle at 64 degrees? My 290x didn't throttle until 90 degrees.


I'm not even going to ask why you have a 295x2 but never used both cores but ok.....

the temp variance between the cores is because of the way the AIO loop is plumbed: the coolant takes heat away from the first core and then continues on to the second core, and since the coolant arrives warmer at core 2's inlet than at core 1's, less heat can be extracted there.

as for BDO, from what I know of it Crossfire is very finicky and it's quite CPU-bound as well, so it might be the i5 holding it back.

the throttling on the 295x2 starts at 75c, and that's to protect the AIO loop more than anything else, but it is possible that the VRMs are overheating and causing it to throttle too.
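The series-plumbing explanation above can be sketched with basic calorimetry. The flow rate and per-core power below are assumed typical values, not measurements from the card, so treat this as an order-of-magnitude check:

```python
# Temperature rise of water absorbing a given heat load at a given
# flow rate: dT = P / (mass_flow * specific_heat).

def coolant_delta_t(power_w, flow_lpm=1.0, c_p=4186.0):
    """Coolant temperature rise in C for power_w watts of heat."""
    mass_flow_kg_s = flow_lpm / 60.0  # ~1 kg per litre of water
    return power_w / (mass_flow_kg_s * c_p)

# ~250 W picked up from core 1 warms the coolant by roughly 3-4 C
# before it reaches core 2 - right around the gap people observe.
print(round(coolant_delta_t(250), 1))
```

So a few degrees between the two cores is exactly what a series loop predicts, not a sign of bad TIM.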


----------



## MIGhunter

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm not even going to ask why you have a 295x2 but never used both cores but ok.....
> 
> the temp variance between the cores is because of the way the AIO loop is set up, the coolant goes from the first core and takes heat away and then continues to the second core, since the coolant is warmer at the input of core 2 than core 1 less heat can be extracted.
> 
> as for BDO, from what I know of it Crossfire is very finicky and it's quite CPU bound as well so it might be the i5 holding it back.
> 
> the throttling for the 295x2 starts at 75c and that is to protect the AIO loop more than anything else but it is possible that the vrms are overheating and that is causing it to throttle too.


Sorry, forgot there are 2 computers in my profile. Not sure how to move them so the default is my other PC. I'm running an i7-5820K OC'd to 4.2GHz. Also, I should have been clearer: I'm running a full water cooled setup with an EKWB block and backplate. I used Thermal Grizzly Hydronaut thermal grease and Fujipoly SARCON X-E Extreme System Builder thermal padding for the GPU.

That aside, why would it throttle at 64 degrees if it's meant to throttle at 75? It's actually better to run without Crossfire and stay consistently at 50-60fps than to run Crossfire and get 90fps until it throttles me down to 35fps. It's not very stable. Right now I'm in borderless windowed mode, getting 50-55 fps, and the temp is only 54 degrees. As soon as I go fullscreen, fps jumps to 90 but the temps jump to 60 and 64 respectively, which is when I notice it downclocking.

To answer your 1st comment: I guess I never knew that you had to be in fullscreen mode for Crossfire to work. When BDO came out, people said Crossfire was broken in the NA version, so I just assumed it didn't work.


----------



## Sgt Bilko

I'm guessing it's a power management problem; you could try upping the power limit or disabling ULPS via Afterburner/Trixx, whatever you use.


----------



## nosequeponer

Got the 295 installed. The performance seems not as expected; in BF1 the GPU usage is all over the place.. sometimes both are at 100%, then one drops to 10, goes up to 50, then 100 again, and both cores do this randomly...
Strange..


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Got the 295 instaled, seems to me de performance is not as expected, in bf1, the gpu usage is all over the place.. sometimes both are getting 100%, then one drop to 10, then go up to 50, then again 100, and both cores do the same randomly...
> Strange..


Don't you have a 2500k?

If so, then that's your answer right there; BF1 hates quad cores.

But if you don't, then please fill out your sig rig (link in my sig)


----------



## nosequeponer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Don't you have a 2500k?
> 
> if so then that's you answer right there, BF1 hates quad cores
> 
> But if you don't then please fill out your sig rig (link in my sig)


i still have the 2500k, and it never goes above 90% usage (at 4.8)

the thing is that, with the previous 290x, it was butter smooth, but now, with the 295x2, it gets fewer fps and flickers in the menus.. one funny thing is that it seems every second letter is missing in the menus, something like "m lti l yer" instead of "multiplayer".

I'll try reinstalling the drivers again, to see if that could be the problem..


----------



## Alex132

Ayyy 2500k + 295X2 fam









I feel like the 295x2 is more and more just a 290X for me. Rarely do any of the games I play take advantage of the 2nd GPU


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Don't you have a 2500k?
> 
> if so then that's you answer right there, BF1 hates quad cores
> 
> But if you don't then please fill out your sig rig (link in my sig)
> 
> 
> 
> i still have the 2500k, it never goes above 90% usage, (4.8)
> 
> the think is that, with the previous 290x, it was butter smooth, but now, with the 295x, it has less fps, and flickers in the menus.. one funny thing is that, it seems 1 in each 2 letters are missing in the menus.. something like m lti l yer , instead of multilpayer.
> 
> i´ll try to reinstal the drivers again, to see if that could be the problem..

Yeah....that is most definitely a driver issue, I remember having something similar happen with Battlefront when it launched


----------



## MIGhunter

Quote:


> Originally Posted by *nosequeponer*
> 
> Got the 295 instaled, seems to me de performance is not as expected, in bf1, the gpu usage is all over the place.. sometimes both are getting 100%, then one drop to 10, then go up to 50, then again 100, and both cores do the same randomly...
> Strange..


Watch the stats on your 2nd GPU and see the trends and clock speeds. If you hit 65C it's throttling because of heat; activate OverDrive and it will change the limit to 75C like it's supposed to be. Had this exact problem with mine. Also, you can change the power management settings to keep the card from downclocking if it's doing that. There's also a program called Clock Blocker on Guru3D that will stop that too.

Finally, always use DDU when redoing the drivers.


----------



## nosequeponer

I'll check the temp threshold later, thanks for the advice.

let's see if I can manage to make it work as it should


----------



## nosequeponer

now it seems to be working properly: BF1 at 1440p ultra, around 70-80 fps...

but it is now reaching the limits of the 2500k... at 95% usage or above all the time..

don't want to spend more on a new mobo...


----------



## DMatthewStewart

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm guessing it's a power management problem, could try upping the powerlimit or disabling ULPS via Afterburner/Trixx whatever you use.


Is ULPS checkbox still working when using Crimson 16.12.1 ReLive and AB?


----------



## lanofsong

Hey AMD R9 295X2 owners,

We are having our monthly Foldathon from Monday the 19th to the 21st, 12 noon EST.
Would you consider putting all that power to a good cause for those two days? If so, come sign up and fold with us - see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726

later
lanofsong
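For anyone who prefers editing the client config directly, the v7 FAHClient keeps these same settings in its config.xml; a minimal sketch might look like this (the user name and passkey below are placeholders, and the GPU slot layout is an assumption for a dual-GPU card like the 295X2):

```xml
<config>
  <!-- identity: folding name and the passkey from step 1 -->
  <user value="YourOCNName"/>
  <passkey value="0123456789abcdef0123456789abcdef"/>

  <!-- Team OCN -->
  <team value="37726"/>

  <!-- one CPU slot plus one slot per GPU core of the 295X2 -->
  <slot id="0" type="CPU"/>
  <slot id="1" type="GPU"/>
  <slot id="2" type="GPU"/>
</config>
```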


----------



## Code-Red

Hey guys, having issues with a 295x2 I just bought. On the left BIOS, it'll crash shortly before the Windows login; on the right, it'll crash a few minutes after I log in. The screen just goes black and the fan goes quiet. Any idea what this is or how it can be fixed?


----------



## Code-Red

Double post


----------



## nosequeponer

If the other BIOS works fine, try flashing a new stock BIOS into the position that fails; maybe that one has crazy values, or it's for a massive OC or something


----------



## Samuris

It seems to be an idle VDDC that's too low, like the R9 290 black screen issue that needed +25mV in Afterburner. Try to quickly add +25mV before the bug hits, just to see the result


----------



## Code-Red

Neither BIOS works properly, and I don't have nearly enough time to install the drivers in Windows to tune anything.


----------



## nosequeponer

Get a stock one and flash it to the card, then try again


----------



## martin042068

Hello all,

A game that runs really great on the 295x2 for me so far is Deus Ex: Mankind Divided. (Have had the game a few days.) Everything is maxed to Ultra settings on DX12; it's using both GPUs and seems to get 100% scaling. Amazing! Really like this forum, BTW.


----------



## Mega Man

Welcome, and I am glad you like your card, the game, and this forum


----------



## martin042068

Thank you for the welcome. I really like the 295x2 and have no plans of going Green. Maybe if AMD's newer cards bring something, we'll see. It seems as of late AMD has made big improvements in driver support.


----------



## nosequeponer

So far, I'm getting bench results between Nvidia's 1070 and 1080, and that's with a 2500k...
Waiting for my new 28" 4k monitor..


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> So far, i'm getting bench results between nvidias 1070-1080, and that's with a 2500k...
> Waiting for my new 28" 4k monitor..


It really comes into its own at 4k; just remember to turn AA down/off. At 28" 4k you won't see a benefit, and it'll free up a lot of VRAM as well


----------



## nosequeponer

the good thing is that I'm getting those numbers compared to reviews, and they use 6-core i7s...

can't wait for the monitor to arrive...


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It really comes into it's own at 4k, just remember to turn AA down/off, at 28" 4k you won't see a benefit and it'll free up alot of Vram as well


eeeeeeh, that's subject to debate; I still want at least 4x SSAA on my 24" 4k. However, this particular display has very rich colour, so the edges may pop out much more than on other displays.

Also depends on the user's eyesight, of course...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It really comes into it's own at 4k, just remember to turn AA down/off, at 28" 4k you won't see a benefit and it'll free up alot of Vram as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> eeeeeeh that's subject to debate, I still want at least 4x SSAA on my 24" 4k, however this particular display has very rich colour so the edges may pop out much more than other displays.
> 
> Also depends on the user's eyesight of course...

I've had three 4k monitors to date (2x 27" LG IPS and 1x Samsung 28" TN) and I personally haven't seen a benefit to using AA on them; the upsides of not using it outweigh the frustration of hitting that Vram wall, imo

But as you said it is up to the individual user.


----------



## nosequeponer

still got the 290x lying around, so I was thinking about trifire with the 295x2..

a 1200w PSU should be more than enough for the job, right??


----------



## martin042068

I'll have to try turning off AA to see if there's a difference. I've been leaving everything on with my LG 34UC98 and 295x2.


----------



## martin042068

Has anyone noticed that on the new Crimson ReLive drivers (version 16.12.1) Corsair Link no longer shows pump or temperature information on the 295x2? It still lists the card as R9 200 Series:


----------



## martin042068

Yes, should be just fine with plenty of overhead.


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> still got the 290x liying arround, so i was thinking about a trifire with the 295x2..
> 
> a 1200w psu should be more than enough for job right??


Yes, 1200w is enough for a 295x2 and 290x, just make sure if you have multiple 12v rails to sort them out.
Quote:


> Originally Posted by *martin042068*
> 
> I'll have to try turning off AA to see if there's a difference. Been leaving everything on my LG 34UC98 and 295x2 .


At 3440x1440 AA helps things look better; at 4k it doesn't, for me personally.

It's also dependent on screen size: a 34" 3440x1440 ultrawide monitor has about the same pixel density as a 27" 2560x1440 monitor.

A 28" 4k monitor has a much greater pixel density, hence why I leave AA off in most games.

I have both a 34" Ultrawide and 28" 4k
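The density comparison above is easy to verify with plain geometry (panel sizes only, nothing product-specific): PPI is the diagonal in pixels divided by the diagonal in inches.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel, from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3440, 1440, 34)))  # 34" ultrawide: ~110 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p:    ~109 PPI
print(round(ppi(3840, 2160, 28)))  # 28" 4k:       ~157 PPI
```

The first two land within about 1 PPI of each other, while 4k at 28" is roughly 45% denser, which is why AA buys so little there.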


----------



## F4ze0ne

Quote:


> Originally Posted by *nosequeponer*
> 
> still got the 290x liying arround, so i was thinking about a trifire with the 295x2..


I personally wouldn't bother adding the 290x. The support for trifire has been pretty poor with current games.


----------



## nosequeponer

got the new monitor; no luck getting it working at 4k.... it doesn't recognize that resolution.....

if I connect it via a mini DP to DP cable, it doesn't give any signal...

if I use the mini DP to HDMI adapter it works, but no sign of the 4k resolution....


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> got the new monitor, no luck making it working at 4k.... it doesn´t recognize that resolution.....
> 
> if i conect it via mini dp-dp cable, it does´t give any signal...
> 
> i´m using the minidp to hdmi adaptor and the it works, but no sign of the 4k resolution....


What Monitor?

Some of them are a bit funny with mDP to DP connections.

98% of my issues with getting 4k to display have been the cable itself.


----------



## nosequeponer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What Monitor?
> 
> Some of them are a bit funny with mDP to DP connections.
> 
> 98% of my issues with getting 4k to display is the cable itself.


it's a JapanNext JNT280UHD

the funny thing is that it connects via the mini DP to HDMI adapter (but no 4k resolution available..), but if I connect the mini DP to DP cable to the same mini DP port it does nothing...

I'll try to get a new mini DP to DP cable and try again...

so far, Windows is telling me 1080p as recommended, and a maximum of 3200x1800....

I have a proper HDMI 2.0 cable connected, but no higher resolutions

I can also try to find an active mini DP to HDMI 2.0 adapter..


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What Monitor?
> 
> Some of them are a bit funny with mDP to DP connections.
> 
> 98% of my issues with getting 4k to display is the cable itself.
> 
> 
> 
> it´s a japannext jnt280UHD
> 
> the funny thing is that it connect via mini dp to hdmi apdapter, (but no 4k resolution avaliable..) but if i connect the minidp to dp calbe in the same minidp port it does nothing...
> 
> i´ll try to get a new minidp to dp cable, and try again...
> 
> so far, windows is telling me 1080p as recomended, and maximum 3200x1800....
> 
> i have a proper hdmi 2.0 cable conected, but no more resolution
> 
> i can also try to find an active mini dp to hdmi 2.0 adapter..

HDMI 2.0 won't work for a couple of reasons.

1. The 295x2 only supports up to HDMI 1.4a (which is a max of 4k @ 30fps) even though it doesn't have a port for it.

2. you are using an adaptor which means that unless the adaptor is rated for 4k then it won't detect it as a resolution (probably why you can only see 1080p as your max resolution since 3200x1800 is VSR)

For a mini DP to DP cable that supports 4k then you want one that is rated at 8.64 Gbps at least, that's the tricky part.

It's also possible that the monitor won't accept a mDP to DP connection; my Samsung U28E850, for example, loses source signal all the time if I use a DP (GPU) to mDP (monitor) cable, while if I use a DP to DP connection it's perfectly fine and FreeSync works.
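A quick back-of-the-envelope on point 1, assuming the standard CTA-861 4k raster of 4400 x 2250 total pixels (active plus blanking):

```python
# Required pixel clock for a 4k mode: total raster pixels per frame
# times refresh rate. HDMI 1.4 tops out at a 340 MHz TMDS clock,
# which is why an HDMI 1.4a path is limited to 4k @ 30 Hz.

def pixel_clock_mhz(refresh_hz, total_w=4400, total_h=2250):
    """Pixel clock in MHz for a given refresh rate on a 4k raster."""
    return total_w * total_h * refresh_hz / 1e6

print(pixel_clock_mhz(30))  # 297.0 -> fits under HDMI 1.4's 340 MHz
print(pixel_clock_mhz(60))  # 594.0 -> needs HDMI 2.0 or DisplayPort 1.2
```

So on this card a DisplayPort connection is the only route to 4k at 60 Hz; no HDMI adapter on the mDP port will get there.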


----------



## nosequeponer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> HDMI 2.0 won't work for a couple of reasons.
> 
> 1. The 295x2 only supports up to HDMI 1.4a (which is a max of 4k @ 30fps) even though it doesn't have a port for it.
> 
> 2. you are using an adaptor which means that unless the adaptor is rated for 4k then it won't detect it as a resolution (probably why you can only see 1080p as your max resolution since 3200x1800 is VSR)
> 
> For a mini DP to DP cable that supports 4k then you want one that is rated at 8.64 Gbps at least, that's the tricky part.
> 
> It's also possible that the monitor won't accept a mDP to DP connection, my Samsung U28E850 for example loses source signal all the time is I use a DP (GPU) to mDP (Monitor) cable while on the other hand if I use a DP to DP connection then it's perfectly fine and FreeSync works.


thanks a lot man,

just a few more questions...

the card only has mini DP ports, so how do you connect a DP to DP cable?? with an adapter??
the monitor came with a mini DP to DP cable, and that's the one I'm trying to use. I'll get a new one and try.. hope this solves it..

more to come later if I can find the cable somewhere around...


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> HDMI 2.0 won't work for a couple of reasons.
> 
> 1. The 295x2 only supports up to HDMI 1.4a (which is a max of 4k @ 30fps) even though it doesn't have a port for it.
> 
> 2. you are using an adaptor which means that unless the adaptor is rated for 4k then it won't detect it as a resolution (probably why you can only see 1080p as your max resolution since 3200x1800 is VSR)
> 
> For a mini DP to DP cable that supports 4k then you want one that is rated at 8.64 Gbps at least, that's the tricky part.
> 
> It's also possible that the monitor won't accept a mDP to DP connection, my Samsung U28E850 for example loses source signal all the time is I use a DP (GPU) to mDP (Monitor) cable while on the other hand if I use a DP to DP connection then it's perfectly fine and FreeSync works.
> 
> 
> 
> thanks a lot man,
> 
> just a few questions more...
> 
> the card only has mini DP ports, so how do you connect a DP-to-DP cable?? with an adapter??
> the monitor came with a mini DP to DP cable, and that's the one I'm trying to use. I'll get a new one and try.. hope this solves it..
> 
> more to come later if I can find the cable somewhere around...
Click to expand...

Best way to do it (in my opinion) is to get a good-quality mDP to DP cable. I'm not sure where you are located on this big blue ball, but you should have somewhere local that sells them.

If you can't do that then a mDP to DP adaptor will work as well. I can't find much info on your monitor, but if it's a cheaper one then the cable that comes with it might not be up to scratch.

In all honesty mate, if you are going to be swapping GPUs or monitors then it always pays off to spend the money and get yourself some nice display cables (hell, I've even got a DP to DVI active adaptor). It's worth it in the long run
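The bandwidth figures in this thread can be sanity-checked with some quick arithmetic. A minimal sketch, assuming 24 bits per pixel and ignoring blanking overhead (which pushes the real requirement a bit higher), comparing the raw 4K60 pixel data rate against the effective DisplayPort link rates after 8b/10b coding:

```python
def video_data_rate(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in bits per second (no blanking overhead)."""
    return width * height * refresh_hz * bits_per_pixel

HBR_EFFECTIVE = 8.64e9    # DP 1.1 HBR, 4 lanes, after 8b/10b coding
HBR2_EFFECTIVE = 17.28e9  # DP 1.2 HBR2, 4 lanes, after 8b/10b coding

uhd60 = video_data_rate(3840, 2160, 60)
print(f"4K60 needs ~{uhd60 / 1e9:.2f} Gbps")   # ~11.94 Gbps
print("fits in HBR: ", uhd60 <= HBR_EFFECTIVE)   # False
print("fits in HBR2:", uhd60 <= HBR2_EFFECTIVE)  # True
```

This is why an 8.64 Gbps (HBR-class) cable or adaptor tops out around 4K@30, and a 4K@60 link needs HBR2-rated hardware end to end.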


----------



## nosequeponer

Owe you a beer..(virtual one sorry..)

I'll get a new mini dp to dp after lunch (fingers crossed) hope that solves the problem once for all


----------



## DaFunkWizard

Hey, I'm getting a 295x2 in 2 days. Just wondering what games people have struggled with and what CPUs people pair their card with


----------



## diggiddi

Quote:


> Originally Posted by *nosequeponer*
> 
> Owe you a beer..(virtual one sorry..)
> 
> I'll get a new mini dp to dp after lunch (fingers crossed) hope that solves the problem once for all


i think you need the active minidp to HDMI 2.0 adapter
https://www.amazon.com/Club3D-Displayport-1-2-HDMI-CAC-1070/dp/B017BQCUGW?th=1


----------



## nosequeponer

Quote:


> Originally Posted by *diggiddi*
> 
> i think you need the active minidp to HDMI 2.0 adapter
> https://www.amazon.com/Club3D-Displayport-1-2-HDMI-CAC-1070/dp/B017BQCUGW?th=1


Got one of them and a minidp to dp cable on their way...

Hopefully it will work


----------



## diggiddi

Quote:


> Originally Posted by *nosequeponer*
> 
> Got one of them and a minidp to dp cable on their way...
> 
> Hopefully it will work


----------



## nosequeponer

Works perfect, 4k @ 60


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> Works perfect, 4k @ 60


That's with the new cable?


----------



## nosequeponer

With the Club3D active one. Just connected it and got 4K as the recommended resolution,

Now I have to adjust the font and colors and it's perfect


----------



## Sgt Bilko

Quote:


> Originally Posted by *nosequeponer*
> 
> With the club3d active one, just connected and got the 4k as recomended resolution,
> 
> Now i have to adjust the font and colors and perfect


Good to hear, so it was the adaptor that didn't support the resolution.

If you're on Windows 10 then 150% scaling is what I use.


----------



## nosequeponer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's with the new cable?


With the active one from 3dclub


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nosequeponer*
> 
> With the club3d active one, just connected and got the 4k as recomended resolution,
> 
> Now i have to adjust the font and colors and perfect
> 
> 
> 
> Good to hear, so it was the adaptor that didn't support the resolution.
> 
> if you're on Windows 10 then 150% adjustment is what I use.
Click to expand...

Quote:


> Originally Posted by *nosequeponer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That's with the new cable?
> 
> 
> 
> With the active one from 3dclub
Click to expand...

I meant the old adaptor you were using didn't support it.


----------



## nosequeponer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I meant the old adaptor you were using didn't support it.


Sometimes Tapatalk does weird things with the quotes....

The adaptor that came with the card is a passive one and did not support 4K@60, and the mini DP to DP cable that came with the monitor didn't work..

Yesterday I tried D3 at 4K, really nice experience (but no CrossFire support...)

Still have to try BF1..


----------



## DaFunkWizard

hi just wondering if anyone could possibly help me

I just got a 295x2 yesterday and it is terribly underperforming, even in games with great CrossFire scaling. The only game that has worked well is Dying Light

Also, even with CrossFire disabled, Battlefield 1 is not performing at a 290x level


----------



## DaFunkWizard

any help would be greatly appreciated


----------



## nosequeponer

try to run some tests like Heaven or 3DMark and check if your scores are in line. for me, Firestrike Ultra I get around 5050 points and Time Spy 6500-6600 points; I always land between a 1070 and a 1080 (closer to the 1080)

try to see where your card is,


----------



## nosequeponer

by the way, what kind of scores are people getting on those benches with the 295x2??


----------



## lanofsong

Hello AMD R9 295X2 owners,

We are having our monthly Foldathon from Monday 16th - 18th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

January 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong
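For reference, the three settings above map onto entries in FAHClient's config.xml. A minimal sketch (the folding name and passkey below are placeholders — use your own name and the 32-character passkey emailed to you, and check your install's docs for the config file location):

```xml
<config>
  <!-- folding name shown on the team stats page (placeholder) -->
  <user value="YourOCNName"/>
  <!-- placeholder: the 32-character hex passkey from the link above -->
  <passkey value="0123456789abcdef0123456789abcdef"/>
  <!-- Team OCN -->
  <team value="37726"/>
</config>
```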


----------



## nosequeponer

I'm finally going the trifire route; as nobody seems to want my 290x, I'm going to put it back in with the 295...


----------



## nosequeponer

funny thing, only 2 out of 3 working... one of the chips from the 295 and the 290x..., the other core from the 295 sits at idle...

so far I've got the 290x in the first slot and the 295 in the other one, and the monitor is connected to the 295..

I guess I'll have to try connecting the monitor to the 290x first, right??

swapping cards would be a pain, because the 290x is watercooled, and there's not enough tube length to move it down


----------



## nosequeponer

now in benchmarks the 3 cards seem to be running, but in BF1 only 2 of them... weird...


----------



## diggiddi

Try using AMD profile for the title in crimson and see if that makes a difference


----------



## ramos29

I think my 295x2 is dead -_- when I start my rig, the PC shuts down after 10-15s; it shuts down on the desktop and even when running BIOS setup. When I removed the GPU the PC boots normally, and I plugged in another GPU to test the PCI Express slot and everything is OK.
My friend opened the GPU and saw no sign of burns on the 295x2's board -_-
Anyone know what the issue could be? I never did any sort of overclocking, always ran stock speeds


----------



## Alex132

Try switch the vBIOS switch and see what happens.


----------



## ramos29

Excuse me for bothering you, but how do I do that? I reset the BIOS of the motherboard; I did not know I could do the same with the GPU


----------



## nosequeponer

There's a small switch on the side of the PCB; the card has 2 BIOSes,

Have you checked the PSU??


----------



## ramos29

I searched for another power-hungry card to test and see if the PSU is the origin of the trouble, but no one wanted to lend me their GPU XDD. Tested the 295x2 on another rig with a 900W PSU and it shut down after a while; that makes me think it's the fault of the GPU


----------



## Alex132

Quote:


> Originally Posted by *ramos29*
> 
> excuse me for bothering you but how do i do that, i reset the bios of the motherboard, did not know that i can do the same with the gpu


Flip this switch:


----------



## ramos29

Tried that, and plugged the video card in with other 8-pin cables, but still the same problem: after a couple of minutes the system shuts down -_-. Without the 295x2 everything is OK


----------



## Gutfux

Anyone got advice on open water loop systems? Setting mine up at the moment and I can't decide whether to intake from outside the case for my CPU rad and do the same with my 295x2 rad, then exhaust out the top like in the above pic


----------



## fishingfanatic

There is no real set way except for the obvious, though in general: air in the front, exhaust out the back.

I changed my fans for more aggressive airflow because I have a GPU fan and 3 fans up top besides the usual in/out.

Consider changing the intake to a higher-CFM fan to allow air to be exhausted out the back as well as the top. With greater intake you can do the same with your exhaust and rad fans as well. You're pushing air from inside the case through the rad, so keeping the air inside as cool as possible helps.

Mine are not on rads, simply for airflow to minimize thermals in the case. I've run tri and quad SLI a few times and it really helps in that instance.

A pair of 122 CFM fans for intake, another pair the same speed for the GPU/s. 1 more intake on top of the case which passes air over the VRMs, 107 CFM, and 2 of the same speed exhausting out the top. Just look for the quietest fans dB-wise.

If you can install an intake on the bottom of the case, that usually brings in the coolest air, closest to the floor.

I bought some of those little screens to put in front of the intakes to help keep out lint....

Airflow works best with the fewest obstructions and decent hardware, but the setup can be critical to allow you to squeeze a bit more performance out of your rig.

Push/pull works even better if you can put fans on both sides of the rad.

Do a Google search just to check out some different setups; it's possible you may see something you have never thought of before.

Hope that helps a little...









FF
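As a rough sanity check on the balance described above, you can total the rated intake versus exhaust CFM. A quick sketch using only the fans with numbers given in the post (the rear exhaust has no stated rating, and rated CFM overstates real delivered airflow once filters and rads are in the way, so treat this as a ballpark):

```python
# Rated CFM of the fans with numbers given above (the GPU fans sit
# inside the case, so they don't count toward the case boundary).
intake_cfm = [122, 122, 107]   # front pair + top intake over the VRMs
exhaust_cfm = [107, 107]       # the two top exhaust fans

net = sum(intake_cfm) - sum(exhaust_cfm)
print(f"net airflow: {net:+} CFM")  # positive => positive case pressure,
                                    # which helps keep dust out of gaps
```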


----------



## Code-Red

I've tried almost everything to get this 295x2 working. Reinstalled Windows, bought an EVGA SuperNOVA 1000W PSU, ran Driver Sweeper and installed new drivers multiple times... same problem every time: black screen. With the BIOS switch on the left it doesn't even boot to Windows; with the switch on the right it gets me past the login screen, then goes black right away. The screen goes black and the fan goes very quiet.

Is this thing done for? I was hoping to use it as a placeholder on my Ryzen build.


----------



## Alex132

Have you tried cleaning it out, just in case it's overheating on the core/VRM/VRAM?


----------



## Code-Red

I don't think it's overheating, as it crashes every time the drivers initialize. I'll take it apart tonight anyway, apply some fresh TIM and give it a go.

After this and BOTH of my 9800GX2's, I don't think I'll ever touch a dual GPU card again.


----------



## ramos29

My problem is close to yours, but for me either the PC shuts down or the screen goes black and the fan goes mad.
I tried several things to fix it but nothing worked; removing the card is the only way to prevent shutdowns.
After a while, and for no reason, the card worked again without any problem for two weeks, then the problem arose again.
I already bought two Radeon Furys to replace the 295x2


----------



## Kalmado

I bought a used 295x2 that came with a water block on it, but I had to put the stock cooler on because I don't have a custom loop. Cooler was damaged during shipping because one of the pumps is leaking. I've searched high and low and I cannot find a stock cooler replacement for this guy. Can anyone point me in the right direction please?


----------



## Juris

Quote:


> Originally Posted by *Kalmado*
> 
> I bought a used 295x2 that came with a water block on it, but I had to put the stock cooler on because I don't have a custom loop. Cooler was damaged during shipping because one of the pumps is leaking. I've searched high and low and I cannot find a stock cooler replacement for this guy. Can anyone point me in the right direction please?


They pop up on eBay pretty often. There is a guy in the UK selling one but I've spotted them recently on eBay US as well. http://www.ebay.co.uk/itm/XFX-AMD-Radeon-R9-295x2-8GB-HYDRA-EDITION-ONLY-WATER-COOLER-/122387762635?hash=item1c7ee10dcb:g:4ZMAAOSw2gxYqJTk. I'm not associated with the seller but he showed up in my 295x2 saved search as I'm looking to go the other way and find a waterblock for mine.


----------



## Insan1tyOne

Hello Everyone,

I figured I would pop in here to say that I have a Diamond R9 295x2 (with stock cooler) in storage that is in perfect condition. I would like to sell it as I am never going to use it. It will come with the retail box, as well as the VRM fan control mod. This card has never been overclocked or tampered with in any way, except for installing the VRM fan control cable. I would let it go for $350 or your best offer. The card was pulled from my main machine around 4 - 5 months ago.

P.S. - If you want photos or have any questions just PM me.

Thanks!

- Insan1tyOne


----------



## lanofsong

Hey AMD R9 295X2 owners,

We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Kalmado

Finally got my 295x2 up and running. Push/pull config and temps are good. Installed Crimson 17.3.3 drivers, and out of over 600 games on Steam it only pulled six, yes six, profiles. I've used CrossFire before and had many more profiles. Card #2 is "Linked". Device Manager reports both cards as working. Onboard is disabled. Also, the Heaven benchmark at 1080p Ultra is only pulling around 30fps, even while reporting CrossFire is enabled via the upper right corner of the screen.

Driver issue? Something else? Not sure what is going on here.

edit: Tested Tomb Raider 2013 and scored 112fps average at 4K, Ultra everything w/AA off. Heaven Benchmark is still 30-ish fps at 1080p, and the more I tested the more it was a stuttering mess. So weird.

edit 2: Installed 17.1.2 but same thing with profiles. In MSI Afterburner I set the OSD for GPU temps, usage, and core clock to see if GPU #2 was indeed getting used. This time it did not pull a profile for Tomb Raider, so I ran it without creating one. The previous benchmark from above must have been at 1440p, because this round, with no profile created in Crimson, both GPUs were being used and my average fps was 86.6 at 4K. Created a profile and reran: 86.0fps. Tested Fallout 4, and both GPUs were in use; at 4K with textures at Ultra and a mix of normal to high for other settings, I was getting a solid 60fps (per Afterburner) in Diamond City.


----------



## fruta

Did anyone ever find a uefi version of the 295x2 oc bios? Or is it just csm?


----------



## Mega Man

I have not


----------



## thisjustanother

I'm thinking about upgrading my computer. I know this is a great GPU and everything, but is there a general consensus on what the logical upgrade is? I'm running a 3440x1440 ultrawide monitor. The GPU does a great job running it, but would it be better to look towards another GPU or just stick with this one a bit longer?


----------



## diggiddi

Quote:


> Originally Posted by *thisjustanother*
> 
> I'm thinking about upgrading my computer. I know this is a great GPU and everything, but is there a general consensus on what the logical upgrade is? I'm running a 3440x1440 ultrawide monitor. The GPU does a great job running it, but would it be better to look towards another GPU or just stick with this one a bit longer?


What games do you play and what fps are you getting in them? Some might say wait for Vega, but Vega ain't gonna be cheap, so it depends on your budget.
You could, however, pick up a Fury and sell this to cover the difference if you wish


----------



## martin042068

Just depends on your budget and what games you're playing. I'm going to wait for Vega at this point as it's only maybe a few months from now.. My 1440 monitor and 295x2 are doing very well together.


----------



## thisjustanother

I'm still getting very reasonable fps and everything when gaming. Usually playing shooters, RPGs, MMOs, etc., just about everything. And my budget isn't really set; since I have had the 295x2 and upgraded the monitor, it should be assumed I'm OK with spending some money lol.

I just asked as it's been a while since I've actually paid attention to the GPU market and watched what's coming up and whatnot. So seeing that Vega seems to be the next big thing, I may just wait for that


----------



## Hanibal

Hello everyone
Need help installing the Koolance VID-AR295X2 water block on the AMD Radeon R9 295X2 XFX graphics card.
The problem I have is with installing the thermal pads: I do not know what thickness goes where, and I do not have a diagram showing how to position the thermal pads.
Thanks all


----------



## sling00

Here is the manual from Koolance; the thermal pads are all roughly 1 mm:

https://koolance.com/files/products/manuals/manual_vid-ar295x2_d100eng.pdf


----------



## lanofsong

Hey there R9 295x2 owners,

We are having our monthly Foldathon from Monday 22nd - Wednesday 24th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

May 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Mega Man

No, please stop spamming our thread


----------



## Paul17041993

Anyone in AU that might be thinking of selling their card?


----------



## Sivos

20170528_205001.jpg 6600k .jpg file


----------



## xc0mbatx

That is quite unique.


----------



## nosequeponer

Impressive..


----------



## Sivos

I'm not big on posting.. but I looked everywhere to see if it was ever done and could not find a mod like this.. I always questioned: what if the pumps fail? what if there are no other water blocks around? So I started studying pictures and the brands that used Asetek AIOs.. and it works very well.. my temps are lower.. and it cools down much quicker after long gaming sessions.. my room is no longer a sauna


----------



## Memmento Mori

Hi guys,
I saw the front page and the settings, but wanted to ask directly about the OC settings which are safest for long-term use. Is 1100 MHz core and 1375 mem OK? Safe?

It is a Sapphire R9 295x2 with the stock cooler.

many thanks for any reaction.


----------



## Deathscythes

Hi everyone,

I have made some modifications to the loop. I am now using clear EK coolant premix; it works great and doesn't seem to evaporate.
However, I doubt I'll put red dye in again... last time it turned purple, which isn't the ROG color ;p
There are 4 Alphacool rads for a total surface of 1440 x 120 mm.
I hope you like it =)


----------



## Deathscythes

Quote:


> Originally Posted by *Memmento Mori*
> 
> Hi guys,
> I saw the front page and the settings, but wanted to ask directly about the OC settings which are most safe for long term. 1100 Mhz core and 1375 mem is ok? Safe?
> 
> It is a saphire R9 295x2 with the stock cooler.
> 
> many thanks for any reaction.


I am not a good overclocker; I am more keen on making the physical build, so to speak.
The memory of the 295x2 seems to overclock extremely well. I cranked mine to 1625 without any issue and with a noticeable performance gain.
As for the core clock, it's another story. When I go anywhere higher than 1080 MHz the clock starts to drop, and if I push higher than that the drops become more noticeable, sometimes below 1000. So it doesn't seem quite stable.


----------



## Memmento Mori

nice build!

Quote:


> Originally Posted by *Deathscythes*
> 
> I am not a good overclocker; I am more keen on making the physical build, so to speak.
> The memory of the 295x2 seems to overclock extremely well. I cranked mine to 1625 without any issue and with a noticeable performance gain.
> As for the core clock, it's another story. When I go anywhere higher than 1080 MHz the clock starts to drop, and if I push higher than that the drops become more noticeable, sometimes below 1000. So it doesn't seem quite stable.


How long have you had it in this state? I mean the mem OC....


----------



## Deathscythes

Quote:


> Originally Posted by *Memmento Mori*
> 
> Hi guys,
> I saw the front page and the settings, but wanted to ask directly about the OC settings which are most safe for long term. 1100 Mhz core and 1375 mem is ok? Safe?
> 
> It is a saphire R9 295x2 with the stock cooler.
> 
> many thanks for any reaction.


Several months. After that I reinstalled everything and started playing competitive games; I haven't OC'd since then.


----------



## misho93

hi
I haven't used my R9 295x2 for 2 years, and now when I try to use it, it doesn't work.
I tried it together with an R9 290x, because I don't have a DP cable or DVI, just HDMI. It works and the system sees three R9 290-class GPUs, but whenever it gets some load, like a game or mining, I get a BSOD saying "Device Driver got stuck in an Infinite Loop" or "Thread Stuck In Device Driver".
* This card used to work in this PC and my brother's PC without this error. I guess it's because I tried to give more cooling to the VRM and damaged something without knowing; you can see it in the pic.
Sorry for my bad English.
Please help me: if the problem is this FP1007, where can I buy it?


----------



## Sivos

20170614_181757.jpg 269k .jpg file
I see a damaged chip..looks like a corner cracked off.


----------



## misho93

Quote:


> Originally Posted by *Sivos*
> 
> 20170614_181757.jpg 269k .jpg file
> I see a damaged chip..looks like a corner cracked off.


Yes, I know. It's for the VRM (power). It's marked FP1007R3 R17 16CH13 L or R17 1408, I don't know which exactly, and I can't find any shop on the web selling it. I need to know where I can buy it.

* This part has been damaged for a long time and the card still worked, but now it doesn't and I don't know why


----------



## Sivos

The chip is probably fine.. but cracking it like that must have put stress on the solder points.. have you ever heard of the oven trick? https://turbofuture.com/computers/How-to-Fix-a-Dead-Graphics-Card
You could also look at eBay for cards sold just for parts


----------



## Insan1tyOne

Hello Everyone,

I just wanted to do a brief rant and then give all you current 295x2 owners a VERY good piece of advice. But first, the rant. About one month ago I sold my Diamond R9 295x2 (with VRM fan mod) on eBay for $600, which is pretty good considering I bought it a year before that for just $400. But then the buyer turned around and sold it for $1200 due to this cryptocurrency mining craze, as the card can do 43MH/s or so on ETH. He even re-used my pictures and listing description! What a loss on my part, but oh well, you can't win them all I guess.

Secondly, if you have been holding off on selling your R9 295x2 because you spent a lot of money on it and you want a worthy upgrade, *SELL IT NOW.* Sell your R9 295x2 as fast as you can, buy a GTX 1080 Ti for $699 and call it a day. _Do not_ let this opportunity go to waste. Once the crypto craze dies down people will not want these cards until the next boom, so strike while the iron is hot and don't look back.

Good Luck!

- Insan1ty_One


----------



## Juris

It's good advice, Insan1ty_One. The crypto craze with ETH is nuts at the moment, even though I've heard they will be moving away from mining in the future. I can't go Nvidia personally as I run a 6-screen Eyefinity rig and Nvidia maxes out at 4 screens, but for 99% of people a 1080 Ti could be the better option at present.

I've just found a new EK block for my 295x2, so I'll be sticking with it till Vega drops, and likely a while after, as I expect the mining companies to absolutely destroy the available Vega stock at launch and for some time after. Those looking to get a Vega, you have been warned.

Might convert the 295x2 to a dedicated custom-loop miner (less noise than stock) once I grab a Vega and run both in separate systems. It'll pay for itself in no time


----------



## Juris

Does anyone know if any manufacturer ever made a nickel backplate for the 295x2. I could go with the black EK (if I can find one) but I was hoping to match the GPU to the CPU's nickel supremacy.


----------



## misho93

Quote:


> Originally Posted by *Insan1tyOne*
> 
> Hello Everyone,
> 
> I just wanted to do a brief rant and then give all you current 295x2 owners a VERY good piece of advice. But first, the rant. About one month ago I sold my Diamond R9 295x2 (with VRM fan mod) on Ebay for $600, which is pretty good considering I bought it like a year before that for just $400. But then the seller who bought the 295x2 from me turned around and sold it for $1200 due to this Cryptocurrency mining craze as the card can do 43mh/s or something on ETH. The seller even re-used my pictures and listing description! What a loss on my part, but oh well, you can't win them all I guess.
> 
> Secondly, if you have been holding off on selling your R9 295x2's because you spent a lot of money on them and you want a worthy upgrade, *SELL IT NOW.* Sell your R9 295x2 as fast as you can and buy a GTX 1080 Ti for $699 and call it a day. _Do not_ let this opportunity go to waste. Once the Crypto craze dies down again people will not want these cards until the next boom, so strike while the iron is hot and don't look back.
> 
> Good Luck!
> 
> - Insan1ty_One


I have one R9 295x2 that has worked for 2 years.
Two months ago I bought a new one for quadfire, and 40 days ago I started mining on one of them; with what the R9 295x2 earned from mining I bought another one.

But now one of them, the old one, doesn't work.
So if you have an R9 295x2, just mine ETH and Sia and you will get $450 after one month; wait 2 months and buy an x1700 and a GTX 1080 Ti


----------



## lanofsong

Hello R9 295X2 owners,

We are having our monthly Foldathon from Monday 19th - Wednesday 21st - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

June 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## misho93

Quote:


> Originally Posted by *misho93*
> 
> hi
> i don't use my r9 295x2 from 2 year and now when i try useing it not work
> i just try put it with r9 290x because i don't have DP cabel or DVI just HDMI it work and see r9 290 (3) but when it have same load like game or mining will have BSOD and say "Device Driver got stuck in an Infinite Loop". or Thread Stuck In Device Driver
> * this card was work with this pc and my bror pc not the same error i gusse because i try give more cooler to vrm so without i know damage same thing can see the pic
> soory for bad english
> plz help me if the proplem from this FP1007 from where i can buy it?


Now I've found this part from the same company (Eaton) and the same model, FP1007R3-R17-R. Can I replace it with a new one?


----------



## pingolino

Hi All,

I have a peculiar problem with my 295x2.

The video card with the stock cooler downclocks when it hits around 68C on GPU 2 (according to CAM) and 65C on GPU 1. The limit is 75C, right? Maybe something funny is going on?

So I changed the limit from 75C to 90C. No effect.
I have used ClockBlocker. No effect.

AHA! But you are wondering, clicking your mouse heavily, eagerly wanting to hit reply and say: "GET BETTER COOLING!"

So I did. I actually have 2 water blocks which I got off Reddit. The first by Alphacool, the other by... err... it's called Vesuvius (the water block itself; momentary lapse of memory trying to recall the maker's name).

I have installed them both, and with both I ran the 3DMark stress test. With the Alphacool it starts downclocking at 48C, and with the other one at around 46C.

Tried everything above; nothing works.

It's irritating. Why is my video card downclocking? Could it be a bad PSU? Is the card dead?

I have an RM1000 Gold for the PSU. I mean, that's not a bad PSU!

Up to you, my fellow geeks. FLY!!!! FLY TO REPLY!!!!!!!!!!!!!!!!

PS:

No issues while gaming though; it works fine, because the load isn't going to be 100% all the time, so there are no downclocks there. Just the stress test, 5 min into it


----------



## Mega Man

Perhaps I am missing it: what water loop do you have? I think the second block you bought was Aquacomputer.


----------



## pingolino

That's right, Aquacomputer. But does it really matter?

All I am saying is that I get these strange downclocks even when the temperature is within a normal range, even very cool.

I am trying to find someone's vBIOS for both slave and master that is tweaked (but with clocks unchanged), to see what would happen.

Anyway, any ideas?


----------



## Mega Man

OK, well if you want my help, feel free to answer my question; if not, then OK, I'll assume you fixed it yourself


----------



## pingolino

Sorry man, but I did answer it: I had the Aquacomputer water block.

But again, as you can see, I have used 3 different cooling solutions, 2 custom water blocks and the stock cooler, and all of them have the same problem in 3DMark's stress test.

Funny enough, when I tried Ethereum for a couple of hours just to see what the big deal was, it never downclocked; it worked full speed ahead lolz.


----------



## Mega Man

OK, I will assume you did not understand.

What kind of loop do you have? I.e., what's in your loop?


----------



## pingolino

2 x 140mm EK rads in push/pull config, 1 x 140mm EK in push/pull config, and 1 x 360mm (thin) rad in pull config....

4820K CPU clocked at 4.6GHz.

EK-XTOP DDC 3.2 PWM Elite - Plexi (incl. pump); that's a strong pump with good head pressure.

At the moment, though, the 295x2 is back on the stock cooler.


----------



## Mega Man

My best guess is it was not installed properly. The reasoning: how are your temps that high on water? Your loop is good enough. RAM or VRM could also be causing it; probably the VRM


----------



## nosequeponer

Did you check the VRM temps as Mega Man suggested??

Seems to me it could be the main reason for the downclock,

Lucky you that you could find the Vesuvius; I've been trying to find one for quite a bit, with no luck..


----------



## pingolino

What do I need to check the VRM temps?


----------



## nosequeponer

Gpu-z for sure

And i think afterburner also


----------



## pingolino

Afterburner does not do it, that I am sure of 1000%. Never looked for it in GPU-Z; might give it a go.


----------



## Mega Man

You have to touch them (generally on the back of the PCB)


----------



## nosequeponer

I used GPU-Z to check VRM temps on my 290x, and it showed VRM 1 & 2 temps.
Just downloaded it yesterday and it doesn't show VRM temps on my 295x2...


----------



## Mega Man

Yeah, we have never had access to them.

The fan on the OEM shroud is controlled by the VRM temps.

The pumps and rad fans are controlled by core temps, and those we could control via various apps


----------



## MikeLarry

Hello Guys

My friend's PC is dying on bootup, and upon looking into things we noticed his GPU hitting temps above 70 degrees, then the PC crashes.
Looking at the GPU, the fan is not spinning; GPU-Z reports 0 RPM fan speed and also 0% GPU load.

The strange thing is that when he brought it to my place, it suffered no such issue; we could not replicate it.

He goes back home, same thing.
He has subsequently had the power checked at his home, and it is fine.

Is this a common issue? Quite simply, the fans are failing to spin, the temp hits its limit, and it shuts itself down. But why? And why not do the same at my place? He has an OEM cooler on there, unmodified, no recent software changes, just this new behaviour...

The exact card is an XFX Radeon R9 295x2. I wanted to touch base with other owners to ascertain whether this is a common issue or not...


----------



## MikeLarry

bump


----------



## pepepzu

Hi, I would like to know if this mod enables fan control. I'm folding and I need the fans at 100%.
Thanks, and sorry for my English!


----------



## Mega Man

There is no way to control the stock VRM fan, period; AFAIK they have not even found it in the BIOS.


----------



## MikeLarry

So I found that going down to driver version 16.9.1 seemed to fix the GPU-at-100% issue.
We swapped PCI slots and it no longer overheats, so maybe it was a dirty slot.
Awaiting a run at his house, where it seemed to break more often (as you'd expect with an intermittent issue, always the way).

Fingers crossed.
Also instructed him to get a proper PSU; he is getting a Corsair HX1200.


----------



## rankel

Does anyone have the 295x2 aquacomputer waterblock? I have been searching for one for a long time, specifically the clear acrylic version. If you have one, I would love to buy it.
@pingolino may have one?


----------



## SwishaMane

Hello. I've just recently acquired a PowerColor R9 295X2. On power-up, the card lights up, the fan spins, and the pumps are running, but the card itself seems to be getting no power. The system boots fine and the keyboard and mouse initialize, and I THINK it may be booting into Windows, but maybe it just hangs; I can't tell because there is no video. Even after sitting powered on for minutes, the back of the PCB (backplate) is still room temperature, not a single bit of heat. I even ran a primary display card and can get into Windows, and of course Device Manager doesn't find the 295X2. Again, the PCB isn't even getting warm.

Any on-board component known to cause this?

I've completely disassembled the card to clean the PCB and found tons of TIM in the resistor area around the cores, but it looks like normal stock paste, nothing conductive. Tons of tobacco dust. Tested the pumps individually and they run nice and quiet. I have a weird feeling an on-board surface-mount fuse, or some component in general, has popped. Can't think of any other reason the PCB wouldn't even get warm.

Any ideas??


----------



## pepepzu

Did you try to change the position of the bios switch?


----------



## npittas

I currently use the original rad plus the rad from an old H60 (or H80, don't really remember!!) in series.


----------



## Mega Man

Yeah, I bet they add a ton of restriction.


----------



## catfrog

Hi There! I'm currently looking for a 295x2 Waterblock. I'm flexible on price and prefer Koolance, but I'll accept anything at this point. Thanks!


----------



## HoneyBadger84

Absolute insanity: these things are selling for $750-$1200 used on eBay right now, lol. I'm listing mine since I'm upgrading anyway, just to see what I can get for it. Near-mint condition, never used for mining or anything of the sort, and I never bothered overclocking because when I first had it, I had two, and my PSU could only just handle QuadFire (1600W LEPA G).

This should be interesting. ^_^


----------



## Dagamus NM

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Absolute insanity these things are selling for $750-$1200 on EBay used right now. lol I'm listing mine up since I'm upgrading anyway, just to see what I can get for it. Near-mint condition, never used for mining or anything of the sort, and I never bothered overclocking cuz when I first had it, I had 2, and my PSU could only just handle QuadFire (1600W Lepa G).
> 
> This should be interesting. ^_^


I hear you on that. I have a pair that I had gone so far as to list for sale last April. Took the listing down and mined with them instead. EK waterblocks and an EVGA 1600W G2, and I still don't overclock them.


----------



## migraines suck

My XFX 295X2 sprung a leak in the radiator, and neither XFX nor Asetek were able to offer any assistance in getting it repaired, so I wanted to see if anyone here had any suggestions. Would it be easy to cut the lines and attach them to a new 120mm radiator? If so, does anyone know what material the stock radiator is made of? Also, what size fittings will work with the stock tubing?


----------



## Its L0G4N

So I've been having trouble with my monitors the past couple of days. I'm pretty sure it's driver-related. I got my dual monitors working again, and Windows Device Manager picks up both cards, but HWMonitor doesn't list the temp or utilization of the second GPU in my single 295X2.


----------



## SavantStrike

migraines suck said:


> My XFX 295X2 sprung a leak in the radiator and neither XFX or Astek were able to offer any assistance in getting it repaired so I wanted to see if anyone here had any suggestions. Would it be easy to cut the lines and attach them to a new 120mm radiator? If so, does anyone know what material the stock radiator is made out of? Also, what size fittings will work with the stock tubing?


You'll have trouble finding fittings. The stock radiator is an aluminum AIO jobbie. You could just scavenge parts from another AIO. Make sure your pump is okay too, as you're inside the window for failures.


----------



## Its L0G4N

It appears the latest driver for the 295X2 puts the second GPU to sleep. Anyone know how to fix this? I guess it's not CrossFiring anymore.


----------



## Its L0G4N

So, I have three monitors connected via mDP and want to connect a fourth via DVI, but Windows won't let me enable it. Any ideas? Thread seems dead.


----------



## diggiddi

Check cable?


----------



## SilverDragon

Its L0G4N said:


> It appears the latest driver for the 295x2 puts the second GPU to sleep. Anyone know how to fix this? I guess it's not crossfiring anymore.


Did you test the GPU under full load to see if it was just being put to sleep while idle? That is the "official" behavior on newer drivers, and unless you disable the power-saving functionality, that's the way it behaves. I am not on up-to-date drivers so I cannot test right now, but I will test with the 16.8 drivers I have installed to verify that my CrossFire is working as intended.
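For anyone who does want to turn that power-saving (ULPS) behavior off, the commonly circulated tweak is a registry edit; this is a sketch rather than an official AMD procedure, and the `0000` subkey number is an assumption. Find the subkeys that actually belong to your 295X2's two adapters, and back up the registry first:

```reg
Windows Registry Editor Version 5.00

; Disable ULPS for the display adapter at subkey 0000.
; Repeat for 0001, 0002, etc. as applicable; verify which
; subkeys belong to the 295X2 before applying.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after applying; tools like Afterburner expose the same toggle if you'd rather not touch the registry by hand.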


----------



## lCornholio

Hello, I need help with my XFX R9 295X2; I have a lot of throttling problems.
I don't understand how some people can run stock and not run into throttling issues.
Here is some of what I've done to try to solve it:

* I've changed my case to a Fractal Design Nano S w/ window.
* I've installed as many fans as I can (two front 140 AF intakes; the 295X2 rad w/ dual Corsair SP120 fans in push-pull as bottom intake; an H80i GT w/ dual SP120 fans as rear exhaust; two 140 AF fans as top exhaust).
* I opened up my GPU and changed the paste to Arctic Silver 5.

My FurMark 1080p score was 6500-ish at 110 fps. In Far Cry 4 I had EXTREME throttling: the core went down to around 400-600 MHz, jumping a lot between 400 and 1000, unplayable.
My idle temps are about 44c on GPU1 and about 41c on GPU2; there is a 4-6c difference depending on what's going on. In games and benchmarks GPU1 reaches 75c way faster, while GPU2 sometimes goes to 75c and sometimes stays at 70c.
The pump is not making any sounds, so I don't think it's broken.

So maybe I've done something wrong, like putting too much or too little paste on the GPUs; I doubt it, as I only put on about as much as a grain of rice, maybe a tiny bit less. (And the tiny screws that hold the radiator on are now ****ed and I can't get it off anymore...)
Or maybe my config is ****ed, or I dunno... What should I do??? I want to be able to use CrossFire without this thing throttling so god damn easily. ****, I so regret buying this card at this point, but now I'm stuck with it. Need HELP!
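If anyone wants to put a number on the throttling instead of eyeballing it, here's a minimal Python sketch that chews through a GPU-Z sensor log saved as CSV. The `GPU Clock [MHz]` column name and the 1018 MHz rated clock are assumptions; adjust both to whatever your GPU-Z version and card actually report:

```python
import csv
from io import StringIO

def throttle_ratio(log_text, clock_col="GPU Clock [MHz]",
                   rated_mhz=1018, tolerance=0.9):
    """Fraction of logged samples where the core clock sat below
    `tolerance` * rated clock (1018 MHz is the 295X2's boost clock)."""
    rows = csv.DictReader(StringIO(log_text))
    clocks = [float(r[clock_col]) for r in rows
              if r.get(clock_col, "").strip()]
    if not clocks:
        return 0.0
    floor = rated_mhz * tolerance
    return sum(c < floor for c in clocks) / len(clocks)

# Tiny fabricated log just to show the idea:
sample = "GPU Clock [MHz]\n1018\n450\n600\n1018\n"
print(throttle_ratio(sample))  # 2 of 4 samples below 90% of 1018 MHz -> 0.5
```

A ratio near zero means the card held its clocks for the run; a ratio closer to one means it spent most of the run throttled.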


----------



## diggiddi

Probably algae gunking up the AIO on the card; the only way to verify is to use another cooling solution or take it apart.


----------



## lCornholio

lCornholio said:


> Hello, I need help with my XFX R9 295x2, I have a lot of throttling problems.
> I don't understand how some people can run stock and not run into throttling issues.
> Here are some stuff I've done to try solve it.
> 
> *I've changed my case to a Fractal Design Nano S /w window
> *I've installad as many fans as I can. (two front 140AF as intakes, 295x2 rad /w dual corsair SP120 fans in push-pull config as intake from bottom, H80i GT /w dual fans SP120 as outtake at the back, two 140AF fans as outtake at top.)
> * I opened up my GPU and change the paste to Arctic Silver 5.
> 
> My Furmark 1080p test was 6500ish with 110fps. On Farcry 4 I had EXTREME throttling. Core went down to around 400-600. Jumping a lot from 400-1000, unplayable.
> My idle temps on GPU 1 is about 44 and GPU 2 is about 41, there is about 4-6c difference depending on whats going on.. And in games and benchmarks GPU1 reached 75c way faster while GPU2 sometimes goes to 75c and sometimes stays at 70c.
> The pump is not making any sounds so I don't think it's broken.
> 
> So, maybe I've done something wrong like putting to much or to little paste on the GPU's, I doubt it as I put only as much as a rice corn, maybe a tiny bit less. (And the tiny screws to unloose the radiator are now ****ed and I can't get it off anymore..)
> Or maybe my config is ****ed or I dunno... What should I do??? I want to be able to use crossfire, without this thing throttling so god damn easily. ****, I so regret buying this card at this point but now I'm stuck with it. Need HELP!


Still need help guys...


----------



## lCornholio

OK, an update here. I removed the entire AIO, as I suspect the fault is in it. And by the way: ANYONE who has had throttling issues with the R9 295X2, this is probably the issue right here.
To clarify, the AMD R9 295X2 8GB is a dual-GPU card with a built-in custom AIO that has two pumps sharing one radiator; here's a picture.








I disassembled everything, and when I emptied the coolant, it was brown and yellow and had small pieces of junk floating in it.
When I dismantled the cold plate, this is what the other side with the small fins looked like.
















As we can see, the fins are murky, and in the other image there is clearly a lot of **** stuck in them blocking the flow. I mixed up which was GPU1 and which was GPU2 when I disassembled them, but we can safely say that picture two was GPU1, because GPU1 had 5-10c higher idle temps and throttled much faster under load; image one has some debris too, but the fins look better. It's safe to say this is why the AIO has struggled with the cooling...
The thermal paste I had applied was not a small amount, but it seemed unevenly distributed. I had added enough that it ran over the edges a bit, but I still saw some spots where there was less paste.

Now I have a question: does this mean the AIO is broken/unusable? Or is it possible to clean the cold plates and replace the liquid, or has the debris damaged the fins? These fins are extremely small; with the naked eye I could barely distinguish them.


----------



## diggiddi

lCornholio said:


> OK an update here. I removed the entire AIO when I suspect the error is in it. And by the way.. ANYONE who have had throttling issues with the R9 295x2, this is the issue and problem right here. Probably...
> To clarify, AMD R9 295x2 8GB is a dual GPU card with a built-in custom AIO that has two pumps on a radiator, here's a picture.
> 
> 
> 
> 
> 
> 
> 
> 
> I disassembled everything and when I emptied the liquid solution, the solution was brown & yellow and had small pieces of junk in the solution.
> When I dismantled the coldplate, it looked like this on the other side with the small fins.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As we can see, the fins are murky and on the other image we clearly have a lot of **** stuck to stop the flow. I mixed up any one of GPU1 and GPU2 when I disassembled but we can clearly say that picture two was GPU1 because GPU1 had 5-10c higher idle temp and throttle much faster under load, though image one has some debris too but the fins look better. You can calmly say that this is why the AIOn has struggled with the cooling ...
> The cooling paste I had changed was not in small amount, but it seemed unevenly distributed. I had added so much that it ran a bit on the edges but not so much. But I saw some "spots" where there was less thermal paste.
> 
> Now I have a question, does this mean that AIOn is broken / unusable ? Or is it possible to fix the coldplates and replace the liquid or have they damaged the fins? These fins are extremely small, with the naked eye I could barely distinguished them.


Exactly what I told you earlier. All you need to do is flush it and refill with distilled water and biocide, or premixed coolant. Take a toothbrush and clean the cold plate in a 10% vinegar solution.
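The mix above is trivial arithmetic but easy to get backwards, so here's a quick hypothetical helper; the 500 ml batch size is just an example, and this assumes ordinary household vinegar used as-is for the "vinegar" part:

```python
def dilution(total_ml, vinegar_fraction=0.10):
    """Split a total volume into vinegar and distilled water
    for a given vinegar fraction (10% as suggested above)."""
    vinegar = total_ml * vinegar_fraction
    return vinegar, total_ml - vinegar

# Enough to soak a cold plate:
v, w = dilution(500)
print(f"{v:.0f} ml vinegar + {w:.0f} ml distilled water")  # 50 ml vinegar + 450 ml distilled water
```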


----------



## Mega Man

You won't like my answer. Is it possible? Yes. Would I? No. I hate AIOs because of this, Asetek ones in particular ("asecrap", or "asetroll").

They are junk; get a real block, or if you want, mod it for two AIOs. AFAIK they use the same mounting, however I could be wrong.

My real answer: block it. Custom loop.


----------



## lCornholio

I cleaned out the water cooling and refilled it, but now when I put the card back into my PC, it won't even reach POST. It's just a black screen, and when I press the power button there is no delay; it shuts down immediately. I tried updating the BIOS, clearing the CMOS, and checking cables; nothing. Anyone know how to fix this?


----------



## lCornholio

Forget it. I tested the card in a second rig; it's dead. Now, if anyone in this group has ever oven-baked an R9 295X2, I would love to know step by step how to do it correctly, and MAYBE it will work again.


----------



## lCornholio

This group is dead...


----------



## diggiddi

Forum change messed everything up for sure


----------



## Dagamus NM

So my waterblocked 295X2s are gone. Long story, but that entire PC is no more.

As such, I still have the boxes, the stock coolers, and the backplates. I know people often have their stock cooling system fail, so I am thinking of offering these for sale. Not sure what to sell them for; suggestions? XFX 295X2 units.


----------



## skuddermudder

*Dead core/gpu?*

I have a Sapphire R9 295X2. Suddenly the computer only seems to detect one GPU; same when I use Speccy, GPU-Z, HWMonitor, Afterburner, and Catalyst. CrossFire is disabled and greyed out, so I'm not able to enable it.
Any solutions? I have tried all sorts of Catalyst versions, using Windows 7. The PSU is 1050W, and I also tried a different PSU, no difference. Tried flashing the BIOS, but I'm only able to view and flash one core (the primary core) using ATIFlash.
Temperatures are fine and the cooling seems to work. Any suggestions? Or is the card dead (meaning it still works, but I guess it's now only a quite expensive, watercooled R9 290X)?

I've tried using Google to find others with similar issues; it seems some of them managed to get it working again, but unfortunately those solutions haven't really done much for me.

Any help or suggestions would be appreciated.


----------



## ZealotKi11er

Dagamus NM said:


> So my waterblocked 295x2s are gone. Long story but that entire PC is no longer.
> 
> As such I still have the boxes, the stock coolers and backplates. I know people often have their stock cooling system fail so I am thinking of offering these for sale. Not sure what to sell them for? Suggestions? XFX 295x2 units.


50-75.


----------



## Dagamus NM

ZealotKi11er said:


> 50-75.


Cool, thanks. I just boxed them up and put them in storage. If somebody posts looking for a stock cooler some day I have them otherwise they can just sit in storage.


----------



## Pete43

Hi, I've been running a 295X2 since 2014. I know this is an old thread, but I just wanted to say I bought a second 295X2 off eBay a few weeks ago to play with until the Ampere cards show up. In some benchmarks like 3DMark and Heaven the cards score extremely well with my 9900K... but only half of one card runs in BFV. In BF4, though, it pegs the fps at 200 on high settings (very little stuttering). I know it ain't perfect :-(

Just thought I'd check and see if anyone else still uses these cards?

Pete


----------



## MSIMAX

just fired up old Bessie to do some 2020 testing

some card info

https://edgeup.asus.com/2014/tested-ares-iii-water-cooled-r9-295-x2-review/


----------



## MSIMAX

testing


----------



## MSIMAX




----------



## Igor Luis

My 295X2 is dead; it no longer outputs an image. I would like to reuse its pump on my AM4 processor. Does anyone know where I can get an adapter?


----------



## MSIMAX

Igor Luis said:


> My 295x2 is dead, not from the image. I would like to take advantage of her pump on my am4 processor, does anyone know where I can get an adapter?


i wouldnt bother


----------



## MSIMAX




----------



## xzamples

If anyone has an R9 *295X2* lying around that they don't need and would like to give it a good new home, I would like to put it to good use in my nephew's PC. Shoot me a message; I will pay, of course, but please be reasonable <3


----------



## Dagamus NM

Delete


----------



## K1ngdöm

I have two R9 295X2s and am trying to BIOS-mod them to overclock. I am trying to put the 390X memory timings on my 295s and push a constant voltage and clock speed. If anyone knows how, LMK.


----------

